CN113573118B - Video picture rotating method and display equipment - Google Patents

Video picture rotating method and display equipment

Info

Publication number
CN113573118B
Authority
CN
China
Prior art keywords
display
image
rotation
video
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010350580.2A
Other languages
Chinese (zh)
Other versions
CN113573118A (en)
Inventor
李斌
刘儒茜
刘丽英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202010350580.2A priority Critical patent/CN113573118B/en
Priority to PCT/CN2021/080552 priority patent/WO2021180223A1/en
Priority to PCT/CN2021/080553 priority patent/WO2021180224A1/en
Publication of CN113573118A publication Critical patent/CN113573118A/en
Application granted granted Critical
Publication of CN113573118B publication Critical patent/CN113573118B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • H04N21/42607Internal components of the client ; Characteristics thereof for processing the incoming bitstream
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16MFRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00Stands or trestles as supports for apparatus or articles placed thereon ; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02Heads
    • F16M11/04Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand
    • F16M11/06Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand allowing pivoting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440218Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440245Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N21/440272Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA for performing aspect ratio conversion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides a video picture rotation method and a display device. After a video picture to be displayed is acquired, image quality processing is performed on it through a video layer. An intermediate image is then obtained by intercepting the processed output image, the intermediate image is rotated through an OSD layer to generate a rotated image, and finally the rotated image is displayed frame by frame. By intercepting frame data that has already undergone image quality processing from the video layer into the OSD layer for rotation, the method achieves dynamic rotation of the video picture at any angle during playback while still applying the corresponding image quality processing, improving the display effect.

Description

Video picture rotating method and display equipment
Technical Field
The application relates to the technical field of smart televisions, in particular to a video picture rotating method and display equipment.
Background
Smart television devices have independent operating systems and support function expansion. Various applications can be installed on a smart television according to the user's needs, for example traditional video applications, social applications such as short video, and reading applications for comics and books. These applications display their pictures on the smart television's screen and provide it with rich media resources. A smart television can also exchange data and share resources with different terminals. For example, it can connect to a mobile phone over a local area network, Bluetooth, or another wireless link, so as to play resources stored on the phone or directly mirror the phone's screen.
However, because different applications and media sources use different picture proportions, a smart television is often asked to display pictures whose proportions differ from traditional video. For example, video shot on a mobile phone is usually vertical media with an aspect ratio such as 9:16, 9:18, or 3:4, and the pages provided by a reading application are vertical resources with proportions close to those of a book. The display screen of a smart television, by contrast, is generally horizontal, with an aspect ratio such as 16:9 or 16:10. When vertical media such as short videos or comics are displayed on a smart television, the mismatch between the picture proportion and the screen proportion prevents the vertical picture from being shown normally. Typically the vertical picture must be scaled down to be displayed completely, which wastes screen space and degrades the user experience.
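The wasted screen space described above can be quantified with a simple aspect-ratio fit calculation. The sketch below is illustrative only (the `fit_contain` helper is not part of the patent); it computes how a 9:16 vertical video must be scaled to fit completely on a 16:9 screen:

```python
def fit_contain(src_w, src_h, dst_w, dst_h):
    """Scale (src_w, src_h) to fit inside (dst_w, dst_h), preserving aspect ratio."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# A 9:16 vertical video (1080x1920) shown on a 16:9 landscape screen (1920x1080):
w, h = fit_contain(1080, 1920, 1920, 1080)
used = (w * h) / (1920 * 1080)
print(w, h)           # 608 1080
print(f"{used:.0%}")  # 32%
```

Only about a third of the panel carries video in this case, which is the motivation for rotating the picture (or the whole display) instead of merely scaling it.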
Disclosure of Invention
The application provides a video picture rotation method and a display device, which solve the problem that traditional picture rotation methods can neither rotate the picture to an arbitrary angle nor apply image quality processing to it.
In one aspect, the present application provides a video image rotation method, including:
acquiring a video picture to be displayed;
performing image quality processing on the video picture to be displayed through a video layer;
generating an intermediate image; the intermediate image is video frame data intercepted from the output image produced after image quality processing of the video picture to be displayed;
rotating the intermediate image through an OSD layer to generate a rotated image;
displaying the rotated image frame by frame.
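As a simplified model of the interception and rotation steps above (not the patent's implementation, which operates on the video layer's frame buffer and the hardware OSD plane), interception is a crop of the processed frame and rotation re-indexes its pixels. The 90-degree special case can be sketched as:

```python
def intercept(frame, x, y, w, h):
    """Cut a w-by-h region out of the processed output frame (the 'intermediate image')."""
    return [row[x:x + w] for row in frame[y:y + h]]

def rotate90(frame):
    """Rotate the intermediate image 90 degrees clockwise, as the OSD layer
    might for a landscape-to-portrait switch (illustrative special case only;
    the method supports arbitrary angles)."""
    return [list(row) for row in zip(*frame[::-1])]

frame = [[1, 2, 3],
         [4, 5, 6]]                   # a tiny 3x2 "processed" frame
mid = intercept(frame, 0, 0, 2, 2)    # 2x2 intermediate image
print(rotate90(mid))                  # [[4, 1], [5, 2]]
```

Arbitrary-angle rotation would additionally involve resampling, which is why handing the intercepted frame to the OSD layer's rotation path (rather than rotating on the video layer) matters in the real device.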
According to the above technical solution, the first aspect of the application provides a video picture rotation method in which, after the video picture to be displayed is acquired, image quality processing can be performed on the video layer. An intermediate image is then obtained by intercepting the processed output image, and the intermediate image is rotated through an OSD layer to generate a rotated image. Finally, the rotated image is displayed frame by frame, achieving dynamic rotation of the video picture at any angle during playback while applying the corresponding image quality processing to improve the display effect.
In another aspect, the present application also provides a display device, including: a display, a rotating assembly, and a controller. Wherein the rotating component is configured to rotate the display to enable the display to be in one of a plurality of rotating states; the controller is configured to perform the following program steps:
acquiring a video picture to be displayed;
performing image quality processing on the video picture to be displayed through a video layer;
generating an intermediate image; the intermediate image is video frame data intercepted from the output image produced after image quality processing of the video picture to be displayed;
rotating the intermediate image through an OSD layer to generate a rotated image;
controlling the display to display the rotated image frame by frame.
As can be seen from the above technical solution, the second aspect of the application provides a display device comprising a display, a rotating assembly, and a controller. The controller can drive the rotating assembly so that the display rotates by a preset angle. While controlling the rotation of the display, it can synchronously rotate the displayed video picture and apply image quality processing to it: the controller intercepts frame data that has already undergone image quality processing from the video layer into the OSD layer for rotation, so the video picture to be displayed can both receive image quality processing and be rotated to any angle.
Drawings
To explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly introduced below. For those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1A is an application scenario diagram of a display device according to the present application;
FIG. 1B is a rear view of a display device of the present application;
fig. 2 is a block diagram of a hardware configuration of a control apparatus according to the present application;
FIG. 3 is a block diagram of a hardware configuration of a display device according to the present application;
FIG. 4 is a block diagram of an architectural configuration of an operating system in a memory of a display device according to the present application;
FIG. 5A is a schematic view of a horizontal screen of a display device according to the present application;
FIG. 5B is a schematic diagram of a vertical screen state of the display device of the present application;
FIG. 6 is a scene diagram illustrating a video frame rotation method according to the present application;
FIG. 7 is a schematic flow chart illustrating a video frame rotation method according to the present application;
FIG. 8 is a schematic flow chart illustrating the process of acquiring a video frame to be displayed according to the present application;
FIG. 9 is a schematic flow chart of the present application for generating an intermediate image;
FIG. 10 is a schematic flow chart of the present application for generating a rotated image;
FIG. 11 is a schematic view illustrating a process of acquiring display rotation information by invoking a rotation service according to the present application;
FIG. 12 is a schematic flow chart of a listening angle callback interface;
FIG. 13 is a schematic view of a rotating display screen according to the present application;
fig. 14 is a schematic flowchart illustrating scaling of a rotated image according to a current boundary position according to the present application.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the present application; rather, they are merely examples of systems and methods consistent with certain aspects of the application as recited in the claims.
In order to facilitate a user to display a video picture in different horizontal and vertical screen states of a display, embodiments of the present application provide a video picture rotation display method and a display device suitable for the video picture rotation display method, where the display device is, for example, a rotating television. It should be noted that the method provided in this embodiment is not only applicable to the rotating television, but also applicable to other display devices, such as a computer, a tablet computer, and the like.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the present application) that is capable of wirelessly controlling the electronic device, typically over a short distance. The component may typically be connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be generally referred to as a motherboard (motherboard) or a host chip or controller.
Referring to fig. 1A, an application scenario diagram of a display device according to some embodiments of the present application is provided. As shown in fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it receives an operation instruction input by the user and converts it into an instruction that the display device 200 can recognize and respond to, serving as an intermediary between the user and the display device 200. For example, when the user operates the channel up/down key on the control apparatus 100, the display device 200 responds with the channel up/down operation.
The control device 100 may be a remote controller 100A, which includes infrared protocol communication or bluetooth protocol communication, and other short-distance communication methods, etc. to control the display apparatus 200 in a wireless or other wired manner. The user may input a user instruction through a key on a remote controller, voice input, control panel input, etc., to control the display apparatus 200. Such as: the user can input a corresponding control command through a volume up/down key, a channel control key, up/down/left/right moving keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, to implement the function of controlling the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, the mobile terminal 100B may install a software application with the display device 200 to implement connection communication through a network communication protocol for the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200, and the functions of the physical keys as arranged by the remote control 100A may be implemented by operating various function keys or virtual controls of the user interface provided on the mobile terminal 100B. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display apparatus 200 may provide a network television function of a broadcast receiving function and a computer support function. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection device. The specific display device type, size, resolution, etc. are not limited.
The display apparatus 200 also performs data communication with the server 300 through various communication means. The display apparatus 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 300 may provide various contents and interactions to the display apparatus 200. For example, the display apparatus 200 may receive Electronic Program Guide (EPG) data, receive software program updates, or access a remotely stored digital media library. The server 300 may be one or more groups of servers, of one or more types. Other network service contents, such as video on demand and advertisement services, are also provided through the server 300.
In some embodiments, as shown in FIG. 1B, the display device 200 includes a controller 250, a display 275, a terminal interface 278 extending from a gap in the backplane, and a rotating assembly 276 coupled to the backplane, the rotating assembly 276 being configured to rotate the display 275. Viewed from the front of the display device, the rotating assembly 276 can rotate the screen to a vertical (portrait) state, in which the vertical side of the screen is longer than the horizontal side, or to a horizontal (landscape) state, in which the horizontal side is longer than the vertical side.
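The portrait/landscape definition above reduces to a comparison of the screen's side lengths; a minimal sketch (function and state names are illustrative, not from the patent):

```python
def orientation(screen_w, screen_h):
    """Classify the panel state as the description does: portrait when the
    vertical side is longer, landscape when the horizontal side is longer."""
    if screen_h > screen_w:
        return "portrait"
    if screen_w > screen_h:
        return "landscape"
    return "square"

print(orientation(1920, 1080))  # landscape
print(orientation(1080, 1920))  # portrait
```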
Fig. 2 is a block diagram illustrating the configuration of the control device 100. As shown in fig. 2, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, a user output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM)111, a Read Only Memory (ROM)112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control device 100, as well as the internal components of the communication cooperation, external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
And a memory 120 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (such as a touch or key signal) to the display device 200 via the communicator 130, and may likewise receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. When the infrared signal interface is used, the user input instruction is converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared transmitting module. As another example, when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The user output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received by the display apparatus 200. Here, the user output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting images, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the user output interface 150 and display the output signal in the form of an image on the display 154, an audio on the sound output interface 153, or a vibration on the vibration interface 152.
The power supply 160 provides operating power for the elements of the control apparatus 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily shown in fig. 3. As shown in fig. 3, the display apparatus 200 may include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, a rotating assembly 276, an audio processor 280, an audio output interface 285, and a power supply 290.
The rotating assembly 276 may include a driving motor, a rotating shaft, and the like. Wherein, the driving motor can be connected to the controller 250 and output the rotation angle under the control of the controller 250; one end of the rotation shaft is connected to a power output shaft of the driving motor, and the other end is connected to the display 275, so that the display 275 can be fixedly mounted on a wall or a bracket through the rotation member 276.
The rotating assembly 276 may also include other components, such as a transmission component and a detection component. The transmission component can adjust the rotation speed and torque output by the rotating assembly 276 through a specific transmission ratio, for example using gear transmission. The detection component may consist of sensors mounted on the rotating shaft, such as an angle sensor or an attitude sensor. These sensors detect parameters such as the angle through which the rotating assembly 276 has turned and transmit them to the controller 250, so that the controller 250 can determine or adjust the state of the display device 200 accordingly. In practice, the rotating assembly 276 may include, but is not limited to, one or more of the components described above.
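One way the controller might interpret the detection component's readings can be sketched as follows; the state names, snap tolerance, and function are illustrative assumptions, not taken from the patent:

```python
def display_state(angle_deg, tolerance=5.0):
    """Map an angle-sensor reading from the rotating assembly to a display
    state, snapping to a canonical orientation when the reading is within
    `tolerance` degrees of it; otherwise report that rotation is in progress."""
    a = angle_deg % 360
    for target, state in ((0, "landscape"), (90, "portrait"),
                          (180, "landscape-inverted"), (270, "portrait-inverted")):
        # circular distance between the reading and the canonical angle
        if min(abs(a - target), 360 - abs(a - target)) <= tolerance:
            return state
    return "rotating"

print(display_state(88))   # portrait
print(display_state(357))  # landscape
print(display_state(45))   # rotating
```

Snapping with a tolerance avoids chattering between states when the sensor reading jitters near a canonical angle.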
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, an audio/video signal carried in a frequency of a television channel selected by a user, and additional information (e.g., EPG data).
Under the control of the controller 250, the tuner demodulator 210 responds to the television channel frequency selected by the user and the television signal carried by that frequency.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to the different kinds of the received television signals.
In other exemplary embodiments, the tuning demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include network communication protocol modules or near field communication protocol modules, such as a WiFi module 221, a Bluetooth communication protocol module 222, and a wired Ethernet communication protocol module 223, so that the communicator 220 may receive control signals from the control device 100 under the control of the controller 250, in the form of WiFi signals, Bluetooth signals, radio frequency signals, and the like.
The detector 230 is a component of the display apparatus 200 for collecting signals of an external environment or interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive a user's sound, such as a voice signal of a control instruction of the user to control the display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
In some other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera or video camera, which may be configured to collect external environment scenes so as to adaptively change the display parameters of the display device 200, and to acquire user attributes or user gestures so as to enable interaction between the display device and the user.
In some other exemplary embodiments, the detector 230 may further include a light receiver for collecting the intensity of the ambient light to adapt to the display parameter variation of the display device 200.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor, such as by sensing an ambient temperature, and the display device 200 may adaptively adjust a display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a color temperature of an image that is cooler; when the temperature is lower, the display device 200 may be adjusted to display a warmer color temperature of the image.
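The temperature-to-color-temperature mapping above can be sketched as follows. This is an illustrative sketch only, not part of the patented implementation; the thresholds and kelvin values are assumptions chosen for illustration.

```java
// Illustrative sketch: map an ambient temperature reading to a target
// display color temperature. Cooler (higher kelvin) when the room is warm,
// warmer (lower kelvin) when the room is cold. Thresholds are hypothetical.
public class ColorTemperatureAdjuster {
    public static int targetColorTemperature(double ambientCelsius) {
        if (ambientCelsius >= 28.0) {
            return 9300;  // cool white for a warm environment
        } else if (ambientCelsius <= 18.0) {
            return 5000;  // warm tone for a cold environment
        }
        return 6500;      // neutral default (D65)
    }
}
```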
The external device interface 240 is a component for providing the controller 250 to control data transmission between the display apparatus 200 and an external apparatus. The external device interface 240 may be connected to an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
As shown in fig. 3, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM 251, the ROM 252, the graphics processor 253, and the CPU processor 254 are connected to one another through the communication interface 255 and the communication bus 256.
The ROM252 stores various system boot instructions. When the display apparatus 200 starts power-on upon receiving the power-on signal, the CPU processor 254 executes a system boot instruction in the ROM252, copies the operating system stored in the memory 260 to the RAM251, and starts running the boot operating system. After the start of the operating system is completed, the CPU processor 254 copies the various application programs in the memory 260 to the RAM251 and then starts running and starting the various application programs.
The graphics processor 253 generates various graphical objects, such as icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor 253 may include an operator, which performs operations by receiving the various interactive instructions input by the user and then displays the various objects according to their display attributes, and a renderer, which generates the various objects based on the operator and displays the rendered result on the display 275.
The CPU processor 254 executes operating system and application program instructions stored in the memory 260, and, according to received user input instructions, executes processing of various application programs, data, and content, so as to finally display and play various audio-video content.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and a plurality of or one sub-processor. A main processor for performing some initialization operations of the display apparatus 200 in the display apparatus preload mode and/or operations of displaying a screen in the normal mode. A plurality of or one sub-processor for performing an operation in a state of a standby mode or the like of the display apparatus.
The communication interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
Where the object may be any one of the selectable objects, such as a hyperlink or an icon. The operation related to the selected object is, for example, an operation of displaying a link to a hyperlink page, document, image, or the like, or an operation of executing a program corresponding to the object. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch panel, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
The memory 260 stores various types of data, software programs, or applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory. Here, the term "memory" includes the memory 260, the RAM 251 and the ROM 252 of the controller 250, and any memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, memory 260 is specifically configured to store drivers for tuner demodulator 210, communicator 220, detector 230, external device interface 240, video processor 270, display 275, audio processor 280, etc., and related data, such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received by the user interface.
In some embodiments, memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 4. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
The application layer contains the applications built into the system as well as non-system-level applications, and is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a setup application, a post application, a media center application, and the like. These applications may be implemented as Web applications that execute on a WebKit engine, and in particular may be developed and executed based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML, or HyperText Markup Language, is the standard markup language for creating web pages. It describes web pages with markup tags, where HTML tags are used to describe text, graphics, animation, sound, tables, links, and so on; a browser reads an HTML document, interprets the tags in the document, and displays the content in the form of a web page.
CSS, or Cascading Style Sheets, is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. CSS styles may be stored directly in the HTML page or in a separate style file, allowing the styles in the web page to be controlled.
JavaScript is a language used in web page programming; it can be inserted into an HTML page and is interpreted and executed by the browser. The interaction logic of a Web application is implemented in JavaScript. JavaScript can wrap an extension interface through the browser to communicate with the kernel layer.
the middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware which is middleware related to communication with an external device, middleware which provides a browser environment in which each application program in the display device operates, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, for example: a display driver for the display, a camera driver for the camera, a key driver for the remote controller, a WiFi driver for the WiFi module, an audio driver for the audio output interface, a power management driver for the Power Management (PM) module, and the like.
In fig. 3, the user interface 265 receives various user interactions. Specifically, it transmits input signals from the user to the controller 250, or transmits output signals from the controller 250 to the user. For example, the remote controller 100A may send input signals entered by the user, such as a power switch signal, a channel selection signal, or a volume adjustment signal, to the user interface 265, which then passes them to the controller 250; alternatively, the remote controller 100A may receive output signals such as audio, video, or data processed by the controller 250 and output through the user interface 265, and display or output the received signals in image, audio, or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items. Among these, "user interfaces" are media interfaces for interaction and information exchange between an application or operating system and a user, which enable the conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a control, a menu, a tab, a text box, a dialog box, a status bar, a channel bar, a Widget, etc.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio/video data stream. For example, for an input MPEG-2 stream (based on the compression standard for digital storage media moving images and audio), the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting a 60Hz input into 120Hz or 240Hz, commonly by means of frame interpolation.
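The frame rate conversion step above can be sketched in a minimal form. This is an assumption-laden illustration, not the patented implementation: each frame is reduced to a single luminance value, and the inserted frame is a simple linear blend, where real hardware would use motion-compensated interpolation.

```java
// Minimal sketch of 60Hz -> 120Hz conversion by inserting one interpolated
// frame between each pair of neighbors. A "frame" here is one double value
// standing in for a full frame buffer (illustration only).
public class FrameRateConverter {
    public static double[] to120Hz(double[] frames60) {
        double[] out = new double[frames60.length * 2 - 1];
        for (int i = 0; i < frames60.length; i++) {
            out[2 * i] = frames60[i];
            if (i + 1 < frames60.length) {
                // Linear blend; real hardware uses motion compensation here.
                out[2 * i + 1] = (frames60[i] + frames60[i + 1]) / 2.0;
            }
        }
        return out;
    }
}
```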
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
The display 275 receives image signals from the video processor 270 and displays video content, images, and the menu manipulation interface. The displayed video content may come from the broadcast signal received by the tuner demodulator 210, or from video content input through the communicator 220 or the external device interface 240. The display 275 also presents the user manipulation interface (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
The display 275 may include a display screen assembly for presenting pictures and a driving assembly for driving image display. Alternatively, if the display 275 is a projection display, it may include a projection device and a projection screen.
For the rotating assembly 276, the controller may issue a control signal to cause the rotating assembly 276 to rotate the display 275.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by the speaker 286.
Illustratively, audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
The audio output interface 285 receives the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286, or an external sound output terminal 287, such as an earphone terminal, for output to a sound-producing device of an external apparatus.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
The power supply 290 supplies power to the display apparatus 200 from an external power source under the control of the controller 250. The power supply 290 may be a power supply circuit built into the display apparatus 200 or a power supply installed outside the display apparatus 200.
For the display device 200, the controller 250 may issue a command to control the driving motor of the rotating assembly 276 to rotate by a preset angle according to the direction indicated by the command, so as to drive the display 275 to rotate by the preset angle. In general, to accommodate different viewing needs, multiple rotation angles may be preset to place the display 275 in different rotation states. For example, the rotation state may include a landscape state and a portrait state, wherein the landscape state means that the long side of the screen of the display 275 is parallel to the ground (or horizontal plane) and the short side is perpendicular to the ground (or horizontal plane), as shown in fig. 5A; the portrait screen state refers to the situation where the short side of the screen of the display 275 is parallel to the ground (or horizontal plane) and the long side is perpendicular to the ground (or horizontal plane) as shown in fig. 5B.
Obviously, the landscape state is suitable for playing media assets in landscape form with picture ratios such as 16:9 and 4:3, for example conventional video resources such as movies and television series. The portrait state is better suited for playing media assets in portrait form with picture ratios such as 9:16 and 3:4, for example short videos and comics shot on terminals such as mobile phones.
It should be noted that, although the landscape state is mainly used for displaying landscape media such as dramas and movies, and the portrait state is mainly used for displaying portrait media such as short videos and comics, the landscape and portrait states are merely two different display states and do not limit the displayed content. For example, portrait media such as short videos and comics can still be displayed in the landscape state, and landscape media such as TV series and movies can still be displayed in the portrait state; in such cases, the display window that does not match the screen orientation simply needs to be scaled and adjusted.
Different rotation states correspond to different rotation angles of the rotating assembly 276. In practice, if the rotation angle of the rotating assembly 276 is defined as 0 degrees when the display 275 is in the landscape state, then the rotation angle of the rotating assembly 276 is 90 degrees (or -90 degrees) when the display 275 is in the portrait state. In addition to the landscape and portrait states described above, the rotating assembly 276 may rotate the display 275 to any angle, for example a 180-degree inverted state.
When the display device 200 plays certain videos, the display 275 needs to rotate automatically to accommodate media assets of different forms. For example, while the display 275 is in the landscape state, the user selects a portrait media asset such as a short video. The operating system of the display device 200 detects that the aspect ratio of the video resource is smaller than 1 while the aspect ratio of the current display 275 is larger than 1, i.e., determines that the current rotation state does not match the video resource, and automatically sends a control command to the rotating assembly 276 to rotate the display 275 clockwise (or counterclockwise) by 90 degrees, adjusting the display 275 to the portrait state.
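The mismatch check described above can be sketched as follows. This is a hedged illustration of the decision logic only; the class and method names are hypothetical and not drawn from the patent.

```java
// Illustrative sketch: compare the aspect ratio (width/height) of the media
// asset with that of the current display orientation, and decide whether the
// rotating assembly needs to be driven. Landscape is defined as 0 degrees,
// portrait as 90 (or -90) degrees, matching the definitions in the text.
public class RotationDecision {
    public static boolean needsRotation(double videoAspect, double displayAspect) {
        boolean videoPortrait = videoAspect < 1.0;      // e.g. 9:16 short video
        boolean displayPortrait = displayAspect < 1.0;  // current screen state
        return videoPortrait != displayPortrait;        // mismatch -> rotate
    }

    public static int targetAngle(double videoAspect) {
        return videoAspect < 1.0 ? 90 : 0;
    }
}
```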
When the display device 200 displays the video image, the image quality of the video image may be further processed by an image quality adjustment algorithm. The image quality processing may include processing of an image on each frame of a video picture, such as color correction, color adjustment, and the like, and may also include processing of multi-frame data of the video picture, such as frame interpolation algorithm, motion compensation, and the like. Through image quality processing, the image display quality can be further improved, and therefore better film viewing experience is achieved under the condition that the configuration of the same equipment is the same and the size of video source resources is not increased.
In general, in order to display a video picture on the display 275, several display processing layers are arranged in the system framework of the display device 200, for example a video layer and an OSD (On-Screen Display) layer. The video layer is the display layer that presents the playing picture on the display device 200 after the user plays a media asset, and may also be called the picture layer of the display device 200; the playing picture corresponding to the media asset can be displayed on this layer. The OSD layer is the display layer that provides interactive operations for the user, and may also be called the UI (User Interface) layer of the display apparatus 200. The video layer can perform image quality processing on the displayed picture, while the OSD layer can perform post-adjustment on the displayed picture, such as rotation and scaling. As the UI layer, the OSD layer can also add various interactive controls to the display picture, such as a control homepage, so that the picture can be adjusted manually.
With the rotatable display device 200, the rotating assembly 276 cannot rotate the display 275 to a preset angle quickly because of the large volume and weight of the display 275, so a user must wait a certain time before viewing the video asset normally while the display 275 rotates. For example, rotating the display 275 from the landscape state to the portrait state takes 10-30s, and during this period the original picture on the display 275 tilts along with the rotation of the display 275 until the display 275 has fully rotated to the portrait state (or to a preset angle); before then, the picture cannot be adjusted to stay upright and fill the screen. Thus, the user has to watch a tilted picture for a long time, which seriously harms the user experience.
In view of this, the present application rotates the display 275 while also rotating the display picture so that the picture remains upright, allowing the user to view it normally while the display 275 rotates. In practical applications, it is further desirable to adjust the display picture while rotating the display 275 so as to improve the image quality of the displayed picture as much as possible during rotation. In general, adjustment of the picture includes rotation of the display picture and image quality adjustment of the display picture. In practice, however, image quality processing and rotation adjustment cannot be performed simultaneously during rotation, so a rotating television can only perform rotation adjustment during rotation and cannot perform image quality processing on the display picture.
Taking the Android-based display device 200 as an example, when playing a video, if the picture is to be dynamically rotated by an arbitrary angle, the View setRotation API can be used to set the angle information in real time to achieve the rotation effect. However, limited by hardware specifications, the display apparatus 200 can rotate the picture only when the video is output through the OSD layer, and cannot rotate the picture when the video is output through the video layer. Meanwhile, in order to perform image quality processing on the video and improve the display effect, the display device 200 outputs the video through the video layer and applies image quality processing there. Thus, a conventional Android-based display device 200 cannot provide a video display mode that both performs image quality processing and rotates by an arbitrary angle.
In order to implement simultaneous image quality processing and rotation adjustment on a display picture in a rotation process, the present application provides a video picture rotation method, as shown in fig. 6 and 7, the video picture rotation method provided by the present application includes the following steps:
s1: and acquiring a video picture to be displayed.
The video picture to be displayed is the picture that the display apparatus 200 is playing while the user causes the display 275 to rotate. For example, the display 275 is currently in the landscape state, and the user, through the control apparatus 100, opens a short video application and plays a short video in portrait form. After detecting that the currently playing short video picture is in portrait form, the controller 250 automatically controls the rotating assembly 276 to rotate the display 275 to the portrait state. Since adjusting the display 275 from the landscape state to the portrait state takes 10-30s, the short video content played during those 10-30s constitutes the video picture to be displayed.
Obviously, the video picture to be displayed is not limited to the played video picture itself, but also includes the UI interfaces provided by the operating system at that time, such as the operation homepage and a playing interface with operation controls. In practical applications, the controller 250 may extract the currently displayed picture by running an operating program, i.e., acquire each frame of the video one by one, thereby obtaining the video picture to be displayed. The display device 200 may also select a normal or high frame rate acquisition mode according to its own hardware configuration level; for example, a display device 200 with a high-performance controller 250 may acquire at 60Hz or a higher frame rate, while a display device 200 with a lower-performance controller 250 may acquire at 30Hz or lower, provided the display picture still meets basic smoothness requirements after subsequent processing.
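The hardware-dependent choice of acquisition frequency can be sketched as below. The performance-score scale and the rate tiers are assumptions made for illustration; the patent only states that higher-performance controllers may use 60Hz or above and lower-performance ones 30Hz or below.

```java
// Illustrative sketch: pick a capture frame rate from a hypothetical
// controller performance score (0-100). Tiers are assumptions, chosen so
// that even the lowest tier keeps basic smoothness.
public class CaptureRateSelector {
    public static int captureRateHz(int performanceScore) {
        if (performanceScore >= 80) return 60; // high-end controller 250
        if (performanceScore >= 40) return 30; // mid-range controller 250
        return 24;                             // floor for basic smoothness
    }
}
```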
S2: and carrying out image quality processing on the video picture to be displayed through a video layer.
After the controller 250 obtains the video image to be displayed, it needs to perform image quality processing on the video image to be displayed in the video layer, so as to improve the display effect of the image. The image quality processing may include processing performed for a plurality of frame images and processing performed for a single frame image. The processing aiming at the multi-frame images can improve the dynamic effect of the video pictures. For example, for video content that needs to be motion compensated, after a video picture to be displayed is obtained, a motion compensation algorithm program built in the operating system may be run, and motion compensation hardware built in the display device 200 may be called, and a motion compensation frame may be inserted between multiple frames of a motion video, so as to improve a display frame rate of the motion video picture and achieve a motion compensation effect.
Processing performed on single frames can improve picture qualities of the video picture to be displayed such as color, definition, and resolution. For example, after the video picture to be displayed is obtained, each frame can be extracted separately, the color value of each pixel on each frame read, and the color values adjusted according to a preset color correction algorithm, so that the color tone is optimized for the usage scene of the picture.
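The per-pixel color adjustment described above can be sketched minimally. This is a hypothetical illustration of the per-pixel pass only: real picture-quality pipelines use calibrated correction curves rather than the simple channel gains shown here.

```java
// Minimal, hypothetical single-frame color correction: scale each RGB
// channel of a pixel by a gain and clamp the result to [0, 255].
public class ColorCorrector {
    public static int[] correctPixel(int r, int g, int b,
                                     double rGain, double gGain, double bGain) {
        return new int[] {
            clamp((int) Math.round(r * rGain)),
            clamp((int) Math.round(g * gGain)),
            clamp((int) Math.round(b * bGain))
        };
    }

    private static int clamp(int v) {
        return Math.max(0, Math.min(255, v));
    }
}
```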
S3: an intermediate image is generated.
After the video picture to be displayed has been processed, the controller 250 may call the relevant service to capture the picture-quality-processed output image, thereby generating an intermediate image. The intermediate image is therefore video frame data obtained by capturing the output image after the video picture to be displayed has undergone image quality processing. Capturing in this way preserves the adjusted image quality of the display picture and improves the picture display quality.
In practical applications, if the controller 250 has sufficient computing power, the output image may be captured frame by frame after image quality processing, that is, the capture frequency equals the frame rate of the processed video picture, and every output frame is captured. For example, after motion compensation, the frame rate of the video picture to be displayed is 60 Hz, and the capture frequency of the controller 250 is also 60 Hz, i.e., 60 frames are captured per second. This capture method preserves both the image quality of the video layer's output and the fluency of the processed video picture.
If the processing performance of the controller 250 is lower, the output image may instead be captured at intervals, although the capture frequency should not be set too low, so that the dynamic effect of the video is preserved. For example, if the frame rate of the original video to be displayed is 60 Hz but the controller 250 cannot capture 60 frames per second, the capture frequency can be reduced to 30 Hz, that is, every other frame is captured, 30 frames per second, and the video frame rate corresponding to the captured intermediate images becomes 30 Hz.
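The trade-off above, full-rate capture when the controller can keep up and interval capture otherwise, can be sketched as follows; the function name and signature are illustrative, not part of the patent:

```python
def choose_capture_rate(source_fps: int, max_capture_fps: int):
    """Pick the highest capture rate not exceeding the controller's
    capacity by capturing every Nth frame of the source video."""
    step = 1
    while source_fps / step > max_capture_fps:
        step += 1
    # (resulting capture rate in fps, capture every Nth frame)
    return source_fps // step, step
```

With a 60 Hz source and a controller limited to 30 captures per second, this yields every-other-frame capture at 30 Hz, matching the example above.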
It should be noted that the capture frequency can also be adjusted to suit the actual application scenario. For example, when the display device 200 plays media that require motion compensation, such as action movies and live sports events, the capture frequency can be increased so that the displayed video shows no obvious frame dropping or stuttering; when the display device 200 shows still images or plays media that do not require motion compensation, such as ordinary short videos, the capture frequency can be reduced appropriately to lighten the load on the controller 250.
S4: and rotating the intermediate image through an OSD layer to generate a rotated image.
To keep the rotation of the television from affecting the user's viewing, the displayed picture may be rotated simultaneously while the display 275 rotates. For example, if the display 275 is to rotate counterclockwise from the landscape state to the portrait state within 15 s, the displayed picture may be rotated clockwise over the same 15 s by the OSD-layer service, so that the picture shown on the display 275 always stays upright.
The rotation of the intermediate image may be determined according to a rotation instruction input by the user or according to the angle at which the display 275 currently sits. A rotation instruction is an instruction that controls the rotation of the rotating assembly and may include a trigger signal and a specified rotation angle. After obtaining the rotation instruction, the controller may calculate the angle through which the display 275 turns per unit time from the time required to rotate the display 275 into the specified state, and synchronously configure the rotation service of the OSD layer, so that the service can determine, from the per-unit-time angle and the elapsed time, how far the displayed picture must be rotated.
Because the display 275 passes through different angles as the rotation progresses, the displayed picture must at each moment be rotated by an angle matching the display's current rotation angle to keep the picture upright. For example, when the display 275 has rotated clockwise to 30 degrees, the display 275 is tilted, and the displayed picture must be rotated counterclockwise by 30 degrees so that it remains upright.
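As a sketch of this synchronization, assuming the display turns at a constant rate (the patent does not specify the motion profile), the counter-rotation at any moment can be computed as follows; the function name and sign convention (counterclockwise negative) are illustrative:

```python
def picture_angle(display_target_deg: float, duration_s: float,
                  elapsed_s: float) -> float:
    """Angle to rotate the OSD picture at time `elapsed_s`: equal in
    magnitude and opposite in sign to the display's current angle,
    assuming constant rotation speed over `duration_s` seconds."""
    elapsed_s = min(max(elapsed_s, 0.0), duration_s)  # clamp to the rotation window
    display_deg = display_target_deg * elapsed_s / duration_s
    return -display_deg  # opposite direction, equal magnitude
```

For a 90-degree counterclockwise (negative) display rotation over 15 s, the picture should sit 30 degrees clockwise at the 5 s mark.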
S5: displaying the rotated image frame by frame.
After the picture is rotated, the generated rotated images may form a video stream, which is then displayed on the display 275. Because the displayed image has been rotated, it adapts to the rotation of the display 275, so the displayed picture stays upright throughout the rotation and the user's viewing experience is not affected. Further, since the rotated image was captured from the image-quality-processed output, the presented video picture retains the effect of that processing; that is, image quality processing continues to apply to the displayed picture while the display 275 rotates.
In one implementation, as shown in fig. 8, the step of acquiring a video picture to be displayed further includes:
s101: creating a video layer service and a player;
s102: sending the handle of the video layer service to the player;
s103: and calling the player to decode the video picture to be displayed, and sending the decoded video picture to be displayed to the video layer service.
In practical applications, the services implementing the video picture rotation method can be created when the system boots and kept resident in memory afterward to monitor the video playing state. When the user issues an instruction to rotate the display device 200, or when the system automatically decides to rotate, the created services are invoked to execute the corresponding video picture rotation method.
To process the picture on different layers, a video layer service (SurfaceView) and a player can be created at boot, and the handle of the video layer service issued to the player. When playing a video, the player is first called to demultiplex and decode the video picture to be displayed, and the decoded picture is then sent to the video layer service through its handle.
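The handle-passing arrangement can be modeled with a toy sketch; the classes below are illustrative stand-ins, not the actual SurfaceView or player APIs:

```python
class VideoLayerService:
    """Stand-in for the video layer service that receives decoded frames."""
    def __init__(self):
        self.frames = []

    def handle(self):
        # Simplified: the "handle" is a direct reference to the service.
        return self

    def submit(self, frame):
        self.frames.append(frame)


class Player:
    """Stand-in player holding the video layer handle issued to it."""
    def __init__(self, video_layer_handle):
        self.sink = video_layer_handle

    def play(self, stream):
        for packet in stream:        # demultiplexing (simplified)
            frame = packet.upper()   # stand-in for decoding
            self.sink.submit(frame)  # send the decoded frame via the handle
```

Creating the service first and issuing its handle to the player mirrors the boot-time setup described above: the player never draws directly, it only feeds frames to the video layer.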
In one implementation, as shown in fig. 9, the step of generating an intermediate image includes:
s301: creating an interception service;
s302: acquiring an output image;
s303: and calling the interception service to intercept the output image to obtain the intermediate image.
The interception service is used to capture video frame data. It can be created when the system boots and initializes; when the displayed picture needs to be rotated, the service is invoked to capture video frame data in real time and pass it to the OSD layer for display. In this application, the video picture to be displayed forms an output image after image quality processing; that is, the output image is the image data obtained by performing image quality processing on the video picture to be displayed. The output image may therefore have a different frame rate and/or a different picture effect from the original picture to be displayed.
For example, after the video layer completes (or partially completes) the image quality processing of the video picture to be displayed, the created interception service may be invoked to acquire the processed output image from the video layer. The interception service may capture the output image frame by frame or at intervals, obtaining the intermediate image.
In one implementation, as shown in fig. 10, the step of generating a rotated image by rotating the intermediate image through an OSD layer further includes:
s41: acquiring display rotation information;
s42: setting a picture rotation direction and a picture rotation angle;
s43: and rotating the intermediate image according to the picture rotation direction and the picture rotation angle.
After acquiring the intermediate image, the controller 250 may determine the current display rotation information by reading an angle sensor on the rotating assembly 276 or a gravitational acceleration sensor built into the display 275. The display rotation information includes a display rotation direction and a display rotation angle. The display rotation direction may be obtained from the rotation instruction; for example, if the instruction specifies rotating the display 275 from the landscape state to the portrait state, the display rotation direction is counterclockwise. The display rotation angle may be obtained from the value detected by the angle sensor or the gravitational acceleration sensor; in practice it is the angle relative to the landscape state (the 0-degree state).
After acquiring the display rotation information, the controller 250 may set the rotation of the picture, i.e., the picture rotation direction and the picture rotation angle, accordingly. The picture rotation direction is opposite to the display rotation direction, and the picture rotation angle equals the display rotation angle. For example, if the display rotation information shows that the display 275 has rotated counterclockwise by 35 degrees, the picture rotation direction is set to clockwise and the picture rotation angle to 35 degrees. The controller 250 then rotates the intermediate image according to the picture rotation direction and angle; in this example, the intermediate image is held rotated clockwise by 35 degrees.
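A minimal sketch of step S42, the opposite-direction, equal-angle rule, with illustrative names:

```python
def picture_rotation_setting(display_direction: str, display_angle_deg: float):
    """Map display rotation info to the OSD picture rotation:
    opposite direction, equal angle."""
    opposite = {"clockwise": "counterclockwise",
                "counterclockwise": "clockwise"}
    return opposite[display_direction], display_angle_deg
```

Given the 35-degree counterclockwise example above, this returns a clockwise, 35-degree picture rotation.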
In one implementation, as shown in fig. 11, to perform a screen rotation operation, the method further includes:
s401: creating a rotation service;
s402: acquiring a rotation instruction input by a user;
s403: and if the rotation instruction is acquired, calling the rotation service to acquire display rotation information.
The rotation service is used to rotate the displayed picture. It operates on the OSD layer, may likewise be created automatically when the display device 200 starts, and is invoked and executed after a rotation instruction is obtained. The rotation instruction drives the rotating assembly 276 to rotate the display 275 into one of several rotation states.
Therefore, in this embodiment, after the rotation service is created, a rotation instruction input by the user may be awaited; once such an instruction is received, the rotation service is invoked to acquire the display rotation information. While no rotation instruction has been received, the rotation service may remain inactive to avoid occupying memory on the display device 200.
In one implementation, as shown in fig. 12, the step of obtaining display rotation information for rotating the intermediate image includes:
s411: monitoring an angle call-back interface of the rotating assembly;
s412: and acquiring the current rotation angle of the display through the angle callback interface, and generating the rotation information of the display.
An angle callback interface is provided on the rotating assembly 276. It is a unified interface that the operating system exposes to adapt to the display's rotation and is used to obtain the rotation angle of the rotating assembly 276 in real time. The angle callback interface can detect the rotation angle of the rotating assembly 276 in real time through an angle sensor, and thus determine the posture of the display 275 at any moment during rotation.
Thus, by listening to the angle callback interface of the rotating assembly 276, the current rotation angle of the display 275 can be determined and combined with the display rotation direction specified in the rotation instruction to generate the display rotation information.
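The listening step can be sketched as a toy monitor that combines the direction from the rotation instruction with angles reported through the callback interface; all names here are illustrative, not from the patent:

```python
class RotationMonitor:
    """Toy listener producing display rotation info (direction, angle)
    from instruction direction plus angle-callback readings."""
    def __init__(self, direction_from_instruction: str):
        self.direction = direction_from_instruction
        self.angle = 0.0  # degrees relative to the landscape (0-degree) state

    def on_angle_callback(self, angle_deg: float):
        # Invoked each time the rotating assembly reports a new angle.
        self.angle = angle_deg

    def display_rotation_info(self):
        return self.direction, self.angle
```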
In one implementation, after the step of rotating the intermediate image through the OSD layer to generate a rotated image, the method further includes: locating the current boundary position of the display 275 according to the display rotation information, and either cropping the rotated image according to the current boundary position or scaling the rotated image according to the current boundary position.
In practical applications, because the boundary of the display 275 moves once the display 275 rotates into a tilted state, a picture kept upright can no longer be shown in full on the tilted display 275. For example, as shown in fig. 13, areas in the upper right and lower left corners of the picture cannot be displayed, while blank areas appear along the other edges. Therefore, after the display rotation information at a given moment is acquired, the boundary position of the display 275 at that moment can be determined from it; the rotated image is then cropped to the current boundary and the blank areas are filled. For example, the upper right and lower left corners of the picture are cropped to the screen boundary of the display 275 so that the rest of the picture keeps its current state and is displayed, while the other edges are filled with a solid-color background to improve display quality.
Alternatively, after the rotated image is generated, it may be scaled according to the current boundary position, that is, scaled to the largest upright display area that the screen of the display 275 can accommodate in its current tilted state. Since the tilt of the screen changes throughout the rotation, the largest upright picture area it can accommodate changes as well, so the rotated image can be scaled gradually as the rotation progresses, producing a smooth transition in the display.
That is, in one implementation, as shown in fig. 14, the step of scaling the rotated image according to the current boundary position includes:
s441: positioning the current display window position and the target display window position according to the display rotation information;
s442: comparing the target display window position with the current display window position to generate a width and height deformation;
s443: generating a scaling step length;
s444: and adjusting the width and height values of the rotating image according to the scaling step.
The current display window position and the target display window position are the screen window positions before and after the rotation of the display 275, corresponding to its rotation states. For example, when the display 275 is to rotate from the landscape state to the portrait state, the position of the video playback window in the landscape state is the current display window position, and the position of the playback window in the portrait state after rotation is the target display window position.
Similarly, when the display 275 is to rotate from the portrait state to the landscape state, the position of the playback window in the portrait state is the current display window position, and the position of the playback window in the landscape state after rotation is the target display window position.
In this embodiment, so that the picture fills as much of the display area of the display 275 as possible throughout the rotation while the quality of the displayed picture is preserved, the displayed picture may be scaled proportionally as it rotates.
Specifically, when a media asset picture is played, the target display window position may be determined from the rotation angle and the screen size of the display 275. For example, if the display needs to switch from the landscape state to the portrait state, the extracted rotation angle is 90 degrees counterclockwise, and the target window position is determined to be the playback area in the portrait state.
After the target display window position is determined, the width and height deformation of the display window can be obtained by comparing it with the current display window position. For example, a 65-inch display measures 1440 x 810 (mm). When portrait media is shown on the display in the landscape state, the height of the picture is at most the height of the display 275, i.e., at most 810 mm; in the portrait state the picture height may be up to 1440 mm (the width of the display in the landscape state). The height deformation can therefore be determined as 1440 - 810 = 630 mm. Similarly, the width deformation can be calculated from the aspect ratio of the portrait media.
After the width and height deformations are determined, the width and height scaling per unit time, i.e., the scaling step, may be calculated from the deformation and the rotation speed of the rotating assembly 276. For example, if the display 275 rotates from landscape to portrait in 15 s, the height scaling step is 630 / 15 = 42 mm/s. Similarly, the width scaling step can be calculated from the width deformation.
After the scaling steps for the height and width of the picture are determined, the width and height of the picture window can be adjusted simultaneously according to these steps. When the display 275 reaches the target state, the picture exactly fills the display area of the display 275, improving the user experience.
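The scaling-step arithmetic above can be sketched in Python using the patent's 65-inch example; the function names are illustrative, and the 455.625 mm starting width below is an assumed value for a 9:16 portrait asset shown 810 mm tall:

```python
def scaling_steps(current_wh, target_wh, duration_s):
    """Per-second width/height adjustment that morphs the window from
    `current_wh` to `target_wh` over `duration_s` seconds."""
    (cw, ch), (tw, th) = current_wh, target_wh
    return (tw - cw) / duration_s, (th - ch) / duration_s


def window_size_at(current_wh, steps, elapsed_s):
    """Window size after `elapsed_s` seconds of constant-rate scaling."""
    (cw, ch), (sw, sh) = current_wh, steps
    return cw + sw * elapsed_s, ch + sh * elapsed_s
```

With a height growing from 810 mm to 1440 mm over 15 s, the height step is 42 mm/s, matching the worked example, and at 15 s the window has reached the target size.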
In order to obtain better display effect, in the step of simultaneously adjusting the width and height values of the display window according to the zoom step, the controller 250 may be further configured to: positioning a zoom base point coordinate in a current display window; and synchronously adjusting the width and height values of the display picture window according to the zooming step length by taking the zooming base point coordinate as a reference.
The zoom base point may be chosen according to the shape of the display window and the adjustment method. For example, the center point of the display window may be selected as the base point, so that its position stays fixed before and after the picture rotates; alternatively, one of the four vertices of the window boundary, such as the top-left vertex, may serve as the base point.
According to the above technical solution, the video picture rotation method performs image quality processing on the video layer after the video picture to be displayed is acquired, captures the processed output image to obtain an intermediate image, rotates the intermediate image through the OSD layer to generate a rotated image, and finally displays the rotated image frame by frame. The video picture can thus be rotated dynamically through any angle during playback while the corresponding image quality processing improves the displayed picture.
Based on the above video picture rotation method, the present application further provides a display device, including: a display 275, a rotating assembly 276, and a controller 250. The display 275 and the rotating assembly 276 are in data connection with the controller 250, so that the controller 250 controls the display process of the display 275 and the rotation process of the rotating assembly 276 respectively. The rotating assembly 276 is also configured to rotate the display 275 into one of a plurality of rotation states.
As shown in fig. 6, the controller 250 is configured to perform the following program steps:
s1: acquiring a video picture to be displayed;
s2: performing image quality processing on the video picture to be displayed through a video layer;
s3: generating an intermediate image; the intermediate image is video frame data obtained by capturing an output image produced by performing the image quality processing on the video picture to be displayed;
s4: rotating the intermediate image through an OSD layer to generate a rotated image;
s5: controlling the display to display the rotated image frame by frame.
In practice, the controller 250 may be configured as follows. The application creates the interception service, which captures frame data from the video layer in real time and passes it to the OSD layer; it then creates the SurfaceView (video layer) service and the player, issues the SurfaceView handle to the player, and creates a surface service for OSD layer output. Invoked by the application, the controller 250 performs demultiplexing, decoding, and similar operations and sends the decoded data to the video layer; the video layer performs image quality processing on the data; the interception service then captures the video layer data in real time and passes it to the OSD layer for display; and the application sets rotation angle information on the OSD layer through the rotation service to achieve video rotation and display.
As can be seen from the above technical solutions, the present application further provides a display device 200, including: a display 275, a rotating assembly 276, and a controller 250. The controller 250 may control the rotation assembly 276 to rotate to drive the display 275 to rotate by a predetermined angle. In addition, the rotation of the display 275 may be controlled in synchronization with the rotation of the display video screen and the image quality processing. The controller 250 may intercept frame data having been subjected to image quality processing from the video layer to the OSD layer for rotation processing, so as to implement image quality processing on a video image to be displayed and rotate the video at any angle.
The embodiments provided in the present application are only a few examples of the general concept of the present application, and do not limit the scope of the present application. Any other embodiments extended according to the scheme of the present application without inventive efforts will be within the scope of protection of the present application for a person skilled in the art.

Claims (9)

1. A video picture rotation method, comprising:
acquiring a video picture to be displayed;
performing image quality processing on the video picture to be displayed through a video layer;
generating an intermediate image; the intermediate image is video frame data obtained by capturing an output image produced by performing the image quality processing on the video picture to be displayed;
rotating the intermediate image through an OSD layer to generate a rotated image;
positioning the current display window position and the target display window position according to display rotation information; comparing the target display window position with the current display window position to generate a width and height deformation; generating a scaling step length, wherein the scaling step length is the width and height scaling per unit time calculated according to the width and height deformation and the rotating speed of the rotating assembly; adjusting the width and height values of the rotated image according to the scaling step length;
and displaying the adjusted rotated image frame by frame.
2. The video picture rotation method according to claim 1, wherein in the step of acquiring a video picture to be displayed, the method further comprises:
creating a video layer service and a player;
sending the handle of the video layer service to the player;
and calling the player to decode the video picture to be displayed, and sending the decoded video picture to be displayed to the video layer service.
3. The video picture rotation method according to claim 1, wherein the step of generating the intermediate image comprises:
creating an interception service, wherein the interception service is used for intercepting video frame data;
acquiring an output image, wherein the output image is image data obtained after the image quality of the video picture to be displayed is processed;
and calling the interception service to intercept the output image to obtain the intermediate image.
4. The video picture rotation method according to claim 1, wherein the step of generating a rotated image by rotating the intermediate image by an OSD layer further comprises:
acquiring display rotation information; the display rotation information comprises a display rotation direction and a display rotation angle;
setting a picture rotation direction and a picture rotation angle; the picture rotation direction is opposite to the display rotation direction, and the picture rotation angle is equal to the display rotation angle;
and rotating the intermediate image according to the picture rotation direction and the picture rotation angle.
5. The video picture rotation method according to claim 4, further comprising:
creating a rotation service for rotating a display screen;
acquiring a rotation instruction input by a user; the rotation instruction is used for driving the rotation assembly to drive the display to rotate so as to enable the display to be in one of multiple rotation states;
and if the rotation instruction is acquired, calling the rotation service to acquire display rotation information.
6. The video picture rotation method according to claim 4, wherein the step of acquiring display rotation information comprises:
monitoring an angle call-back interface of the rotating assembly;
and acquiring the current rotation angle of the display through the angle callback interface, and generating the rotation information of the display.
7. The video picture rotation method according to claim 4, wherein after the step of generating a rotated image by rotating the intermediate image by an OSD layer, the method further comprises:
and positioning the current boundary position of the display according to the display rotation information.
8. The method of claim 1, wherein the step of adjusting the width-to-height value of the rotated image according to the scaling step comprises:
positioning a zoom base point coordinate in a current display window;
and taking the coordinates of the zooming base point as a reference, and synchronously adjusting the width and height values of the display picture window according to the zooming step length.
9. A display device, comprising:
a display;
a rotation component configured to rotate the display to enable the display to be in one of a plurality of rotation states;
a controller configured to:
acquiring a video picture to be displayed;
performing image quality processing on the video picture to be displayed through a video layer;
generating an intermediate image; the intermediate image is video frame data obtained by capturing an output image produced by performing the image quality processing on the video picture to be displayed;
rotating the intermediate image through an OSD layer to generate a rotated image;
positioning the current display window position and the target display window position according to display rotation information; comparing the target display window position with the current display window position to generate a width and height deformation; generating a scaling step length, wherein the scaling step length is the width and height scaling per unit time calculated according to the width and height deformation and the rotating speed of the rotating assembly; adjusting the width and height values of the rotated image according to the scaling step length;
and controlling the display to display the adjusted rotated image frame by frame.
CN202010350580.2A 2020-03-13 2020-04-28 Video picture rotating method and display equipment Active CN113573118B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010350580.2A CN113573118B (en) 2020-04-28 2020-04-28 Video picture rotating method and display equipment
PCT/CN2021/080552 WO2021180223A1 (en) 2020-03-13 2021-03-12 Display method and display device
PCT/CN2021/080553 WO2021180224A1 (en) 2020-03-13 2021-03-12 Display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010350580.2A CN113573118B (en) 2020-04-28 2020-04-28 Video picture rotating method and display equipment

Publications (2)

Publication Number Publication Date
CN113573118A (en) 2021-10-29
CN113573118B (en) 2022-04-29

Family

ID=78158090

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010350580.2A Active CN113573118B (en) 2020-03-13 2020-04-28 Video picture rotating method and display equipment

Country Status (1)

Country Link
CN (1) CN113573118B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113900568B (en) * 2021-11-16 2024-09-10 创盛视联数码科技(北京)有限公司 Picture rotation method, electronic device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012220840A (en) * 2011-04-12 2012-11-12 Canon Inc Image display device and image display method
CN104040463A (en) * 2012-01-13 2014-09-10 索尼公司 Information processing device and information processing method, as well as computer program
CN104065999A (en) * 2014-06-11 2014-09-24 四川政企网络信息服务有限公司 Image processing assembly and method capable of achieving image rotation
CN104581405A (en) * 2013-10-21 2015-04-29 中国移动通信集团公司 Display content processing method and equipment
CN110933494A (en) * 2019-11-29 2020-03-27 维沃移动通信有限公司 Picture sharing method and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104244045B (en) * 2014-09-28 2016-02-24 小米科技有限责任公司 The method that control video pictures presents and device

Also Published As

Publication number Publication date
CN113573118A (en) 2021-10-29

Similar Documents

Publication Publication Date Title
CN113395558B (en) Display equipment and display picture rotation adaptation method
CN114827694A (en) Display equipment and UI (user interface) display method during rotation
CN112565839B (en) Display method and display device of screen projection image
CN112165644B (en) Display device and video playing method in vertical screen state
CN111787388B (en) Display device
CN113395562B (en) Display device and boot animation display method
CN111866593B (en) Display device and startup interface display method
CN111866590B (en) Display device
CN112565861A (en) Display device
CN113556591A (en) Display equipment and projection screen image rotation display method
CN113395554B (en) Display device
CN113556593B (en) Display device and screen projection method
CN113573118B (en) Video picture rotating method and display equipment
WO2021180223A1 (en) Display method and display device
CN113473192B (en) Display device and starting signal source display adaptation method
CN113542824B (en) Display equipment and display method of application interface
CN113556590B (en) Method for detecting effective resolution of screen-projected video stream and display equipment
CN111314739B (en) Image processing method, server and display device
CN112565915A (en) Display apparatus and display method
CN114501087A (en) Display device
CN113015006A (en) Display apparatus and display method
CN113497965B (en) Configuration method of rotary animation and display device
CN113542823B (en) Display equipment and application page display method
CN113497962B (en) Configuration method of rotary animation and display device
CN115697771A (en) Display device and display method of application interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant