US10115220B2 - Method and apparatus for changing 3D display based on rotation state

Info

Publication number
US10115220B2
Authority
US
United States
Prior art keywords
display apparatus
state
display
screen
objects
Prior art date
Legal status
Active, expires
Application number
US13/739,584
Other versions
US20130176301A1 (en)
Inventor
Su-jin Yeon
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. Assignment of assignors interest (see document for details). Assignors: YEON, SU-JIN
Publication of US20130176301A1
Application granted
Publication of US10115220B2

Classifications

    • B24B41/02 Frames; Beds; Carriages (component parts of machines for grinding or polishing)
    • B24B47/04 Drives or gearings for performing a reciprocating movement of carriages or work-tables by mechanical gearing only
    • G06T15/00 3D [Three Dimensional] image rendering
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G3/003 Control arrangements or circuits for visual indicators other than cathode-ray tubes, using specific devices to produce spatial visual effects
    • G09G2340/0492 Change of orientation of the displayed image, e.g. upside-down, mirrored
    • G09G2340/14 Solving problems related to the presentation of information to be displayed
    • G09G2354/00 Aspects of interface with display user
    • H04N13/0022
    • H04N13/007
    • H04N13/128 Adjusting depth or disparity
    • H04N13/183 On-screen display [OSD] information, e.g. subtitles or menus

Abstract

A Three-Dimensional (3D) display method is provided, and includes displaying a 3D screen including a plurality of objects having different depth perceptions, and displaying the 3D screen with a unified depth perception through adjustment of the depth perceptions of the plurality of objects to one depth perception, when a 3D display apparatus moves to a first state. Accordingly, it is possible to effectively control the operation of the display apparatus while viewing the 3D screen.

Description

PRIORITY
This application claims priority under 35 U.S.C. § 119(a) to Korean Patent Application No. 10-2012-0003352, which was filed in the Korean Intellectual Property Office on Jan. 11, 2012, the content of which is incorporated herein by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates generally to a Three-Dimensional (3D) display apparatus and a method thereof, and more particularly to a 3D display apparatus and a method that can change a screen display state depending on the rotating state of the 3D display apparatus.
2. Description of the Related Art
With the development of electronic technology, various types of electronic devices have been developed and spread. In particular, various types of display devices, such as a Television (TV), a mobile phone, a Personal Computer (PC), a notebook PC, and a Personal Data Assistant (PDA), have been widely used even in private homes.
As the use of display devices is increased, user needs for more diverse functions have increased. In order to meet such user needs, respective manufacturers have successively developed products having new functions.
Therefore, devices having 3D display functions have recently proliferated. Such devices include not only 3D TVs used in homes but also 3D television receivers, monitors, mobile phones, PDAs, set-top PCs, tablet PCs, digital photo frames, and kiosks. Further, 3D display technology may be used in diverse fields that require 3D imaging, such as science, medicine, design, education, advertisement, and computer games.
In a 3D display apparatus, a screen that includes a plurality of objects having different depth perceptions is displayed. A user perceives the 3D effect due to the difference in depth perception between the respective objects. However, when a user intends to control the operation of the 3D display apparatus, the 3D screen itself may hinder that control. That is, various types of menus, which are displayed on the screen to control the operation of the 3D display apparatus, may visually conflict with the objects that are displayed in 3D on the screen. Further, the menus may be hidden by the displayed objects, or the objects may be hidden by the menus displayed on the screen, making menu selection and operation difficult.
SUMMARY OF THE INVENTION
The present invention has been made to address at least the above problems and/or disadvantages and to provide at least the advantages described below.
Accordingly, an aspect of the present invention provides a 3D display apparatus and a method thereof, which can adjust the depth perceptions of objects displayed on a screen if the apparatus is rotated while a 3D display is performed.
According to one aspect of the present invention, a 3D display method in a 3D display apparatus includes displaying a 3D screen including a plurality of objects having different depth perceptions, and displaying the 3D screen with a unified depth perception through adjustment of the depth perceptions of the plurality of objects to one depth perception if the 3D display apparatus moves to a first state.
According to another aspect of the present invention, a 3D display apparatus includes a display unit displaying a 3D screen including a plurality of objects having different depth perceptions, a sensing unit sensing a movement state of the 3D display device, and a control unit controlling the display unit to display the 3D screen with a unified depth perception through adjustment of the depth perceptions of the plurality of objects to one depth perception if the 3D display apparatus moves to a first state.
According to embodiments of the present invention, the screen display state is changed depending on the rotating state of the 3D display apparatus, and thus the user can control the operation of the 3D display apparatus more conveniently and easily.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other aspects, features and advantages of the present invention will be more apparent from the following detailed description when taken in conjunction with the accompanying drawings, in which:
FIG. 1 illustrates the configuration of a 3D display apparatus according to an embodiment of the present invention;
FIGS. 2 to 6 illustrate a 3D display method in a 3D display apparatus according to embodiments of the present invention; and
FIGS. 7 and 8 illustrate a 3D display method according to diverse embodiments of the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
Various embodiments of the present invention will now be described in detail with reference to the accompanying drawings. In the following description, specific details such as detailed configuration and components are merely provided to assist the overall understanding of these embodiments of the present invention. Therefore, it should be apparent to those skilled in the art that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
FIG. 1 illustrates the configuration of a 3D display apparatus according to an embodiment of the present invention. A 3D display apparatus according to an embodiment of the present invention may be implemented by a device having mobility, such as a mobile phone, a PDA, a tablet PC, an electronic book, or a digital photo frame.
Referring to FIG. 1, the display unit 110 displays a 3D screen including a plurality of objects having different depth perceptions. The 3D screen means a screen that displays, in 3D, content provided from a storage device (not illustrated) that is provided in or connected to the 3D display apparatus 100, various types of recording medium reproduction devices (not illustrated) connected to the 3D display apparatus 100, or an external source (not illustrated) such as a broadcasting station or a web server. Examples of the 3D screen include a broadcasting program screen, a multimedia content reproduction screen, an application execution screen, and a web page screen. The application execution screen means a screen that is provided when an application installed in the 3D display apparatus 100 or an external device is executed.
The display unit 110 may be driven in manners depending on the 3D display method. That is, the 3D display method may be divided into a glasses type and a non-glasses type depending on whether 3D glasses are worn. The glasses type 3D display method may be further divided into a shutter glass type and a polarization type.
The shutter glass type 3D display method is a method in which a synchronization signal is transmitted to the 3D glasses so that a left-eye shutter glass and a right-eye shutter glass are alternately opened at the corresponding image output times while a left-eye image and a right-eye image are alternately displayed through the display unit 110. When the shutter glass type 3D display method is performed, the display unit 110 alternately displays the left-eye image and the right-eye image. The left-eye image and the right-eye image mean image frames configured so that the same objects are spaced apart from each other by disparities corresponding to the depth perceptions of the objects. For example, if object 1 displayed in the left-eye image and object 1 displayed in the right-eye image are spaced apart from each other by disparity 1, and if object 2 displayed in the left-eye image and object 2 displayed in the right-eye image are spaced apart from each other by disparity 2, the depth perceptions of object 1 and object 2 become different from each other.
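The per-object disparity relationship described above can be illustrated with a short sketch. This is not taken from the patent; the Object3D dataclass, the stereo_positions function, and the coordinate values are hypothetical and for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Object3D:
    name: str
    x: float          # horizontal center of the object in screen coordinates
    y: float          # vertical center of the object
    disparity: float  # horizontal offset between the two eye images; larger -> appears closer

def stereo_positions(obj: Object3D) -> tuple:
    """Return ((x_left, y), (x_right, y)) for the left-eye and right-eye frames."""
    half = obj.disparity / 2.0
    return (obj.x - half, obj.y), (obj.x + half, obj.y)

if __name__ == "__main__":
    # Object 1 and object 2 get different disparities, hence different perceived depths.
    for obj in (Object3D("Ob1", 100.0, 50.0, 0.0), Object3D("Ob2", 200.0, 80.0, 12.0)):
        left, right = stereo_positions(obj)
        print(obj.name, "left:", left, "right:", right)
```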
The polarization type 3D image display method is a method in which a left-eye image and a right-eye image are divided into lines, and the divided left-eye image lines and right-eye image lines are alternately arranged to generate and output at least one image frame. In this case, the display unit 110 causes the respective lines to have different polarization directions by using a polarizing film attached to the panel. The 3D glasses that a user wears have a left-eye glass and a right-eye glass that transmit light having different polarization directions. Accordingly, the left-eye image lines are recognized only by the left eye, and the right-eye image lines are recognized only by the right eye, so that a viewer can feel the 3D effect corresponding to the object disparity between the left-eye image and the right-eye image. In the case of the non-glasses type 3D image display method, the display unit 110 includes a lenticular lens array or a parallax barrier. The display unit 110 divides the left-eye image and the right-eye image into lines and alternately arranges the respective lines to form and display at least one image frame. The light from each line of the image frame is dispersed to a plurality of viewing areas by the lenticular lens array or the parallax barrier. The respective viewing areas may be formed at an interval of about 65 mm, which corresponds to the typical binocular (interpupillary) distance of a human.
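As a rough illustration of the line-by-line arrangement described above (not an implementation from the patent), the following sketch interleaves rows of a left-eye and a right-eye image; interleave_rows is a hypothetical helper and the 4x4 "images" are stand-ins.

```python
def interleave_rows(left_rows, right_rows):
    """Take even-numbered rows from the left-eye image and odd-numbered rows from the right-eye image."""
    assert len(left_rows) == len(right_rows)
    return [left_rows[i] if i % 2 == 0 else right_rows[i] for i in range(len(left_rows))]

if __name__ == "__main__":
    left = [["L"] * 4 for _ in range(4)]   # stand-in for a 4x4 left-eye image
    right = [["R"] * 4 for _ in range(4)]  # stand-in for a 4x4 right-eye image
    for row in interleave_rows(left, right):
        print("".join(row))  # prints LLLL / RRRR alternately
```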
The left-eye image and the right-eye image may form a 3D image of broadcasting content or multimedia reproduction content, or a 3D image that includes various types of User Interface (UI) windows or objects, such as widgets, images, and text.
The sensing unit 130 senses the motion state of the 3D display apparatus 100. The motion states include states to which the 3D display apparatus can move, such as an inclined state, a rotating state, and a movement state. The sensing unit 130 may include a geomagnetic sensor, a gyro sensor, and an acceleration sensor. Accordingly, the sensing unit 130 can sense whether the 3D display apparatus 100 is placed in the vertical direction or in the horizontal direction, or whether the 3D display apparatus 100 is in the horizontal state or in an inclined state, through measurement of a rotating angle, a pitch angle, a yaw angle, and a roll angle of the 3D display apparatus 100.
The sensing unit 130 may be implemented to include at least one of the types of sensors described above, and its sensing method may differ depending on the type of the sensor. For example, when the sensing unit is provided with a two-axis fluxgate geomagnetic sensor, the sensing unit measures the magnitude and direction of an external magnetic field by sensing the magnitude of the electrical signal of the two-axis fluxgate, which changes depending on the rotation state of the sensor. Since the output values detected from the respective fluxgates are affected by the inclination of the apparatus, the pitch angle, the roll angle, and the yaw angle of the 3D display apparatus 100 may be calculated using these output values. If the pitch angle and the roll angle become “0”, the sensing unit 130 determines that the 3D display apparatus 100 is placed in the horizontal state based on the ground surface, while if the pitch angle and the roll angle become 90 degrees, the sensing unit 130 determines that the 3D display apparatus 100 is put upright in the vertical or horizontal direction. The sensing unit 130 can also determine the rotating direction of the 3D display apparatus according to the size and sign of the yaw angle. These values may differ depending on the orientation of the sensor mounted in the 3D display apparatus 100.
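A minimal sketch of the orientation rule described above (pitch and roll near 0 degrees meaning the apparatus lies flat, near 90 degrees meaning it stands upright). The function name and the 10-degree tolerance are assumptions, not values taken from the patent.

```python
def classify_orientation(pitch_deg: float, roll_deg: float, tolerance: float = 10.0) -> str:
    """Classify the apparatus orientation from pitch and roll angles (in degrees)."""
    if abs(pitch_deg) <= tolerance and abs(roll_deg) <= tolerance:
        return "horizontal"  # lying flat on the ground surface
    if abs(abs(pitch_deg) - 90.0) <= tolerance or abs(abs(roll_deg) - 90.0) <= tolerance:
        return "upright"     # standing in the vertical or horizontal direction
    return "inclined"        # somewhere in between

print(classify_orientation(2.0, -3.5))   # horizontal
print(classify_orientation(88.0, 1.0))   # upright
print(classify_orientation(35.0, 5.0))   # inclined
```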
In addition, the sensing unit 130 may adopt various types of sensors that are known in the art, so detailed description and illustration thereof are omitted.
The control unit 120 controls the operation of the display unit 110 depending on the result of the sensing by the sensing unit 130. Specifically, if it is sensed that the 3D display apparatus 100 moves to the first state, the control unit 120 may control the display unit 110 to unify the depth perceptions of the respective objects that are being displayed through the display unit 110 into one depth perception.
The display unit 110 may unify the depth perceptions by shifting the positions of the objects displayed in the left-eye image and the right-eye image that form the 3D screen, thereby adjusting their disparities.
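The following sketch illustrates one way such unification could work, assuming each object is described by a screen-space center and a disparity. The unify_disparities helper and the dictionary layout are hypothetical, not the patent's implementation.

```python
def unify_disparities(objects, target_disparity):
    """objects: list of dicts with 'x' (screen center) and 'disparity'.
    Returns left-eye/right-eye x positions recomputed for one shared disparity."""
    half = target_disparity / 2.0
    return [
        {"x_left": obj["x"] - half, "x_right": obj["x"] + half, "disparity": target_disparity}
        for obj in objects
    ]

objects = [{"x": 100, "disparity": 0.0}, {"x": 200, "disparity": 12.0}, {"x": 300, "disparity": 24.0}]
# After the call, every object shares the same disparity (and hence the same perceived depth).
print(unify_disparities(objects, target_disparity=12.0))
```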
Further, according to an embodiment of the present invention, the first state may be defined in several manners. For example, the first state may be a horizontal state in which the 3D display apparatus 100 is horizontally put on the ground surface, a state in which the 3D display apparatus 100 is rotated in a horizontal or vertical direction, a state in which the 3D display apparatus 100 is inclined over a predetermined inclination, a state in which the 3D display apparatus 100 is rotated over a predetermined rotating angle, a state in which the 3D display apparatus 100 moves toward or away from the user side while maintaining the inclination, or a state in which the 3D display apparatus 100 is moving to form a specified pattern. In the description, it is assumed that the horizontal state is defined as the first state.
FIG. 2 illustrates the operation of a 3D display apparatus according to an embodiment of the present invention. Referring to FIG. 2, the 3D display apparatus 100 displays a 3D screen 10 in a state in which the 3D display apparatus 100 is inclined at a predetermined inclination relative to the horizontal surface. On the 3D screen 10, a plurality of objects Ob1, Ob2, and Ob3 having different depth perceptions are displayed. The depth perceptions of the respective objects Ob1, Ob2, and Ob3 are denoted by D0, D1, and D2. D0 denotes the depth perception which corresponds to the same plane as the screen of the 3D display apparatus 100, D1 denotes the depth perception which corresponds to a plane that projects by a predetermined distance from the screen in the user direction, and D2 denotes the depth perception which corresponds to a plane that projects further than D1.
If the 3D display apparatus 100 is rotated in a state in which the 3D screen 10 that includes the respective objects Ob1, Ob2, and Ob3 is displayed, the depth perceptions of the respective objects Ob1, Ob2, and Ob3 are adjusted to be unified into one depth perception. That is, as shown in FIG. 2, the depth perceptions of the objects are unified into D1. Although it is shown that the depth perceptions of the objects are unified into D1 in FIG. 2, the depth perceptions of the objects may also be unified into D2 or another depth perception through which the 3D display state can be recognized. Further, although it is shown that the depth perceptions of the objects are unified into one depth perception in FIG. 2, it is also possible to unify the depth perceptions of the objects into two depth perceptions. For example, when objects having six depth perceptions, such as D0 to D5, are displayed, the depth perceptions D0, D1, and D2 may be unified into D1, and the depth perceptions D3, D4, and D5 may be unified into D2 to be displayed.
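A tiny illustrative mapping for the D0-D5 example above; the cut-off between the two groups is an assumption made for the sketch, not prescribed by the patent.

```python
def group_depths(depth_level: int) -> str:
    # Lower levels (D0-D2) collapse to D1, higher levels (D3-D5) collapse to D2.
    return "D1" if depth_level <= 2 else "D2"

print({f"D{level}": group_depths(level) for level in range(6)})
# {'D0': 'D1', 'D1': 'D1', 'D2': 'D1', 'D3': 'D2', 'D4': 'D2', 'D5': 'D2'}
```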
Although it is shown that only the depth perceptions are adjusted depending on the motion of the 3D display apparatus 100 in FIG. 2, the screen configuration may be changed simultaneously with the adjustment of the depth perceptions.
Referring to FIG. 3, if the 3D display apparatus 100 is rotated to the horizontal state while displaying the 3D screen 10 that includes the respective objects Ob1, Ob2, and Ob3, the depth perceptions of the objects Ob1, Ob2, and Ob3 are unified into D1, and a menu 20 for the 3D screen 10 is displayed. The menu 20 includes selection menus 21 and 22 related to the 3D screen 10 currently displayed. A selection menu related to the 3D screen 10 indicates a menu that can be selected to perform control, such as usage, adjustment, sharing, and editing, with respect to content. For example, if the 3D screen 10 is a content reproduction screen, a reproduction or pause menu, a stop menu, a fast-forward menu, a rewind menu, and a reproduction time display menu may be included in the menu 20. As shown in FIG. 3, the selection menus 21 and 22 may be implemented as icons of applications related to the 3D screen 10.
FIG. 4 illustrates a 3D display method according to another embodiment of the present invention. Referring to FIG. 4, if the 3D display apparatus is rotated to the horizontal state while displaying the 3D screen 10 that includes the respective objects Ob1, Ob2, and Ob3 having the different depth perceptions D0, D1, and D2, the depth perceptions of the objects Ob1, Ob2, and Ob3 are unified into one depth value, and the layout of the 3D screen 10 is changed. The layout change may be performed in various manners according to embodiments of the present invention. That is, the respective objects Ob1, Ob2, and Ob3 that are dispersed at arbitrary positions may be regularly arranged for display. Further, the distances among the respective objects Ob1, Ob2, and Ob3, or their sizes or shapes, may be unified. FIG. 4 shows an instance in which the distances among the respective objects Ob1, Ob2, and Ob3 are unified. Before the 3D screen 10 is rotated, the respective objects Ob1, Ob2, and Ob3 have different distances s1, s2, and s3, but after the 3D screen 10 is rotated, the layout is changed so that the objects have the same distance s3.
Such layout change may be selectively performed depending on the types of content displayed on the 3D screen 10. That is, the distances for general objects, as shown in FIG. 4, may be adjusted or other feature elements, such as the shapes and sizes, may be adjusted. However, for objects included in multimedia content, such as movie or broadcasting content, and photo content, the layout change may not be performed.
The above-described layout change may be differently applied depending on the types of content. That is, if there are objects present that belong to the same type, such as a plurality of images or menu icons, the layout may be changed so that the distances, sizes, and shapes of the objects coincide with one another for each type. By contrast, if a plurality of objects having different types is present, the layout may be changed so that the objects are grouped by type.
When the distances among the objects are unified as shown in FIG. 4, the minimum distance among the distances s1, s2, and s3 of the respective objects may be set as the unified distance. Alternatively, another default value may be used as the unified distance.
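A sketch of the spacing rule above, using the minimum original gap as the unified distance. The unify_spacing helper operating on one-dimensional positions is a hypothetical simplification of the layout change.

```python
def unify_spacing(x_positions):
    """x_positions: horizontal object positions. Returns positions rearranged with equal gaps."""
    xs = sorted(x_positions)
    gaps = [b - a for a, b in zip(xs, xs[1:])]
    unified_gap = min(gaps)  # the minimum original distance becomes the unified distance
    return [xs[0] + i * unified_gap for i in range(len(xs))]

print(unify_spacing([10, 40, 120]))  # gaps 30 and 80 -> both become 30: [10, 40, 70]
```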
As shown in FIGS. 2 to 4, if the 3D display apparatus 100 moves to the first state so that the depth perceptions are adjusted, and then returns to the original state, the unified depth perception is readjusted to the original depth perceptions.
In particular, if the 3D display apparatus 100 moves to the original state after the menu display or the layout change is performed as shown in FIGS. 3 and 4, the menu display may be removed, and the layout may be changed to the original state.
In the embodiments illustrated in FIGS. 3 and 4, it is shown that the menu display or the layout change is performed together with the depth perception unification through the movement of the 3D display apparatus 100 to the first state. However, such operations may be performed through different motions separately from the depth perception unification operation. These embodiments will be described with reference to FIGS. 5 and 6.
FIG. 5 illustrates a 3D display method according to another embodiment of the present invention. Referring to FIG. 5, if the 3D display apparatus 100 is rotated to a second state in a state in which the 3D screen 10 that includes the plurality of objects Ob1, Ob2, and Ob3 having different depth perceptions D1, D2, and D3 is displayed, a menu 20 is displayed on the 3D screen 10 while the depth perceptions of the respective objects Ob1, Ob2, and Ob3 are maintained. Although FIG. 5 shows that the second state is a state in which the 3D display apparatus 100 is rotated counterclockwise, the second state is not limited thereto. That is, the second state may include a state in which the 3D display apparatus 100 is rotated clockwise and a state in which the 3D display apparatus 100 is inclined to the front side or the rear side. Further, although FIG. 5 shows that the 3D display apparatus 100 is used in the horizontal orientation and then turned to the vertical orientation, the menu 20 may also be displayed in the instance when the 3D display apparatus 100 is turned from the vertical orientation to the horizontal orientation according to an embodiment of the present invention.
If the 3D display apparatus 100 is rotated to the first state in a state in which the 3D display apparatus 100 has been rotated to the second state and the menu 20 is displayed, the depth perceptions of the respective objects Ob1, Ob2, and Ob3 are unified into one depth perception while the menu 20 remains displayed. FIG. 5 shows the state in which the depth perceptions are unified into D1. As described above, the user can operate the menu 20 without visual interference from the 3D screen.
FIG. 6 illustrates a 3D display method according to still another embodiment of the present invention. Referring to FIG. 6, if the 3D display apparatus 100 is rotated to the second state in a state in which the 3D screen 10 that includes the plurality of objects Ob1, Ob2, and Ob3 having different depth perceptions D1, D2, and D3 is displayed, the layout of the 3D screen 10 is changed while the depth perceptions of the respective objects Ob1, Ob2, and Ob3 are maintained. Accordingly, the respective objects Ob1, Ob2, and Ob3 having different distances s1 and s2 are arranged so that they have a constant distance such as s3.
If the 3D display apparatus 100 moves to the first state in the above-described state, the depth perceptions of the respective objects Ob1, Ob2, and Ob3 are unified into one depth perception in a state in which the layout of the 3D screen 10 is changed. FIG. 6 shows the state in which the depth perceptions are unified into D1.
Although FIGS. 5 and 6 show that the 3D display apparatus 100 moves to the second state and then moves to the first state, the 3D display apparatus 100 may first move to the first state and then move to the second state. In this case, the depth perception adjustment is performed first, and then the menu display or the layout change is performed when the 3D display apparatus 100 moves to the second state.
Further, even in the embodiments shown in FIGS. 5 and 6, if the 3D display apparatus 100 moves to the previous state or to the original state, the screen display state may return to the previous state or the original state.
The various types of 3D display methods as shown in FIGS. 2 to 6 may be performed through the 3D display apparatus 100 having the configuration shown in FIG. 1. Although FIG. 1 illustrates only the configuration essential to executing the various types of 3D display methods, an interface for connecting to an external server or device, a storage unit for storing various types of content, programs, and data, a keypad, and a touch screen may additionally be provided.
FIG. 7 illustrates a 3D display method according to an embodiment of the present invention.
Referring to FIG. 7, if it is sensed that the 3D display apparatus moves to the first state in step S720 in a state in which the 3D screen including at least one object is displayed in step S710, the depth perceptions of the objects that are displayed on the 3D screen are adjusted to be unified into one depth perception in step S730. Accordingly, the user can stably control the operation of the 3D display apparatus.
According to the embodiments of the present invention, the menu may be displayed along with the depth perception adjustment as shown in FIG. 3, or the layout may be changed as shown in FIG. 4.
FIG. 8 illustrates a 3D display method according to another embodiment of the present invention.
Referring to FIG. 8, if the 3D display apparatus moves to the first state in step S815 in a state in which the 3D screen including at least one object is displayed in step S810, the depth perceptions of the objects are adjusted to be unified into one depth perception in step S820.
If the 3D display apparatus moves in the opposite direction and returns to the original state in the above-described state in step S825, the depth perception is readjusted to the original state in step S830.
If the 3D display apparatus is rotated to the second state in step S835, the screen display state is changed in step S840. Specifically, the change of the screen display state may be a process of additionally displaying a menu on the 3D screen or a process of changing the layout through rearrangement of the objects on the 3D screen.
If the 3D display apparatus returns from the second state to the previous state in step S845, the screen display state is readjusted to the previous state in step S850. These steps are performed until the 3D display is finished in step S855. Although FIG. 8 illustrates that the rotation of the 3D display apparatus to the second state is performed after the 3D display apparatus is rotated to the first state, the order of performing the above-described steps is not limited.
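The flow of FIG. 8 can be summarized as a small state handler. This sketch is an interpretation, not the patent's implementation; the event names and the Screen class are assumptions, and the second-state change is shown here as a menu toggle, though it could equally be a layout change.

```python
class Screen:
    def __init__(self):
        self.unified_depth = False
        self.menu_shown = False

    def handle(self, event: str):
        if event == "rotate_to_first_state":
            self.unified_depth = True      # S815/S820: unify depth perceptions
        elif event == "return_from_first_state":
            self.unified_depth = False     # S825/S830: restore original depths
        elif event == "rotate_to_second_state":
            self.menu_shown = True         # S835/S840: change screen display state
        elif event == "return_from_second_state":
            self.menu_shown = False        # S845/S850: restore previous state

screen = Screen()
for event in ["rotate_to_first_state", "return_from_first_state", "rotate_to_second_state"]:
    screen.handle(event)
print(screen.unified_depth, screen.menu_shown)  # False True
```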
As described above, the first state and the second state may be defined in diverse manners, and the layout may also be changed in diverse manners.
A program for performing the methods according to embodiments of the present invention as described above may be stored in various types of recording media.
Specifically, such a program may be stored in various types of recording media that can be read by a terminal, such as a RAM (Random Access Memory), a flash memory, a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable and Programmable ROM), a register, a hard disk, a removable disk, a memory card, a Universal Serial Bus (USB) memory, and a CD-ROM.
While the present invention has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that changes in form and detail may be made therein without departing from the spirit and scope of the present invention, as defined by the appended claims.

Claims (20)

What is claimed is:
1. A method for displaying a plurality of objects on a three-dimensional (3D) display apparatus comprising:
displaying the plurality of objects with different depth perceptions in a first layout on a 3D screen of the 3D display apparatus, wherein, in the first layout, distances between the plurality of objects are different from one another;
detecting a rotation of the 3D display apparatus;
in response to detecting the rotation of the 3D display apparatus to a first state, changing the different depth perceptions to a same depth perception and displaying the plurality of objects with the same depth perception while the first layout is maintained; and
in response to detecting the rotation of the 3D display apparatus to a second state, changing the first layout to a second layout and displaying the plurality of objects in the second layout while the different depth perceptions are maintained,
wherein, in the second layout, the plurality of objects are spaced apart from one another by one of the distances.
2. The 3D display method as claimed in claim 1, further comprising displaying a menu corresponding to the 3D screen on the 3D screen, when the 3D display apparatus rotates to the first state.
3. The 3D display method as claimed in claim 1, further comprising displaying the 3D screen with the single depth perception that unifies the depth perceptions while changing the layout of the 3D screen so that the plurality of objects displayed on the 3D screen are arranged to be spaced apart from one another for a predetermined distance, when the 3D display apparatus rotates to the first state.
4. The 3D display method as claimed in claim 3, further comprising additionally displaying a menu corresponding to the 3D screen on the 3D screen, when the 3D display apparatus is rotated to the second state.
5. The 3D display method as claimed in claim 4, further comprising removing the menu displayed on the 3D screen, when the 3D display apparatus returns from the second state to a previous state of the rotation.
6. The 3D display method as claimed in claim 1, further comprising changing a layout of the 3D screen so that the plurality of objects displayed on the 3D screen are arranged to be spaced apart from one another for a predetermined distance, when the 3D display apparatus is rotated to the second state.
7. The 3D display method as claimed in claim 6, further comprising changing the layout of the 3D screen so that the plurality of objects are rearranged to the first layout, when the 3D display apparatus returns from the second state to a previous state of the rotation.
8. The 3D display method as claimed in claim 4, wherein in the second state, the display apparatus is rotated in a horizontal or vertical direction.
9. The 3D display method as claimed in claim 1, further comprising adjusting the depth perceptions of the plurality of objects to original different depth perceptions if the 3D display apparatus returns from the first state to an original state of the rotation.
10. The 3D display method as claimed in claim 9, wherein the first state is one of a horizontal state in which the 3D display apparatus is horizontally put on a ground surface, a state in which the 3D display apparatus is rotated in a horizontal or vertical direction, a state in which the 3D display apparatus is inclined over a predetermined inclination, a state in which the 3D display apparatus is rotated over a predetermined rotating angle, a state in which the 3D display apparatus rotates toward or away from a user side while maintaining the inclination, and a state in which the 3D display apparatus rotates to form a specified pattern.
11. A Three-Dimensional (3D) display apparatus comprising:
a display configured to display a plurality of objects with different depth perceptions in a first layout, wherein, in the first layout, distances between the plurality of objects are different from one another;
a sensor configured to detect a rotation of the 3D display apparatus; and
a controller configured to
in response to detecting the rotation of the 3D display apparatus to a first state, change the different depth perceptions to a same depth perception and control the display to display the plurality of objects with the same depth perception while the first layout is maintained,
in response to detecting the rotation of the 3D display apparatus to a second state, change the first layout to a second layout and control the display to display the plurality of objects in the second layout while the different depth perceptions are maintained,
wherein, in the second layout, the plurality of objects are spaced apart from one another by one of the distances.
12. The 3D display apparatus as claimed in claim 11, wherein the controller is configured to control the display to display a menu on a 3D screen, when the 3D display apparatus rotates to the first state.
13. The 3D display apparatus as claimed in claim 12, wherein the controller is configured to control the display to unify the depth perceptions of the plurality of objects while changing the layout of the 3D screen so that the plurality of objects displayed on the 3D screen are arranged to be spaced apart from one another for a distance, when the 3D display apparatus rotates to the first state.
14. The 3D display apparatus as claimed in claim 13, wherein the controller is configured to control the display to additionally display a menu corresponding to the 3D screen on the 3D screen, when the 3D display apparatus is rotated to the second state.
15. The 3D display apparatus as claimed in claim 14, wherein the controller is configured to control the display to remove the menu displayed on the 3D screen, when the 3D display apparatus returns from the second state to a previous state of the rotation.
16. The 3D display apparatus as claimed in claim 11, wherein the controller is configured to control the display to change a layout of the 3D screen so that the plurality of objects displayed on the 3D screen are arranged to be spaced apart from one another for a predetermined distance, when the 3D display apparatus is rotated to the second state.
17. The 3D display apparatus as claimed in claim 16, wherein the controller is configured to control the display to change the layout of the 3D screen so that the plurality of objects are rearranged to the first layout, when the 3D display apparatus returns from the second state to a previous state of the rotation.
18. The 3D display apparatus as claimed in claim 14, wherein in the second state, the display apparatus is configured to rotate in a horizontal or vertical direction.
19. The 3D display apparatus as claimed in claim 11, wherein the controller is configured to control the display to adjust the depth perceptions of the plurality of objects to original different depth perceptions, when the 3D display apparatus returns from the first state to an original state of the rotation.
20. The 3D display apparatus as claimed in claim 19, wherein the first state is one of a horizontal state in which the 3D display apparatus is horizontally put on a ground surface, a state in which the 3D display apparatus is rotated in a horizontal or vertical direction, a state in which the 3D display apparatus is inclined over a predetermined inclination, a state in which the 3D display apparatus is rotated over a predetermined rotating angle, a state in which the 3D display apparatus rotates toward or away from a user side while maintaining the inclination, and a state in which the 3D display apparatus rotates to form a specified pattern.
US13/739,584 2012-01-11 2013-01-11 Method and apparatus for changing 3D display based on rotation state Active 2033-05-07 US10115220B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020120003352A KR101899458B1 (en) 2012-01-11 2012-01-11 3d display apparatus and methods thereof
KR10-2012-0003352 2012-01-11

Publications (2)

Publication Number Publication Date
US20130176301A1 US20130176301A1 (en) 2013-07-11
US10115220B2 true US10115220B2 (en) 2018-10-30

Family

ID=48743598

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/739,584 Active 2033-05-07 US10115220B2 (en) 2012-01-11 2013-01-11 Method and apparatus for changing 3D display based on rotation state

Country Status (4)

Country Link
US (1) US10115220B2 (en)
EP (1) EP2803198B1 (en)
KR (1) KR101899458B1 (en)
WO (1) WO2013105794A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9930313B2 (en) * 2013-04-01 2018-03-27 Lg Electronics Inc. Image display device for providing function of changing screen display direction and method thereof
KR102201733B1 (en) * 2013-09-30 2021-01-12 엘지전자 주식회사 Apparatus and Method for Display Device
US11651749B2 (en) * 2020-11-02 2023-05-16 Panduit Corp. Display layout optimization of multiple media streams

Patent Citations (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006293878A (en) 2005-04-14 2006-10-26 Nippon Telegr & Teleph Corp <Ntt> Image display system, image display method, and image display program
US20070046630A1 (en) 2005-08-31 2007-03-01 Samsung Electronics Co., Ltd. Method and device for controlling display according to tilt of mobile terminal using geomagnetic sensor
WO2008030005A1 (en) 2006-09-08 2008-03-13 Eun A Song Variable liquid crystal panel for displaying three dimensional picture and apparatus using the liquid crystal panel
US20150189260A1 (en) 2009-01-27 2015-07-02 Ken Yoshino Electronic Device and Recording Medium
JP2010175643A (en) 2009-01-27 2010-08-12 Casio Hitachi Mobile Communications Co Ltd Electronic apparatus and program
US20100189413A1 (en) 2009-01-27 2010-07-29 Casio Hitachi Mobile Communications Co., Ltd. Electronic Device and Recording Medium
US9041779B2 (en) 2009-01-27 2015-05-26 Nec Corporation Electronic device and recording medium
US20150172643A1 (en) 2009-01-27 2015-06-18 Ken Yoshino Electronic Device and Recording Medium
US20100188503A1 (en) * 2009-01-28 2010-07-29 Apple Inc. Generating a three-dimensional model using a portable electronic device recording
US20100208040A1 (en) 2009-02-19 2010-08-19 Jean-Pierre Guillou Preventing interference between primary and secondary content in a stereoscopic display
US20100238196A1 (en) * 2009-03-17 2010-09-23 Harris Corporation Portable electronic devices with adjustable display orientation
JP2010257160A (en) 2009-04-23 2010-11-11 Nec Casio Mobile Communications Ltd Terminal equipment, display method, and program
KR20110026811A (en) 2009-09-08 2011-03-16 엘지전자 주식회사 Mobile terminal and operation method thereof
US20110093778A1 (en) 2009-10-20 2011-04-21 Lg Electronics Inc. Mobile terminal and controlling method thereof
KR20110054256A (en) 2009-11-17 2011-05-25 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US20110126160A1 (en) 2009-11-23 2011-05-26 Samsung Electronics Co., Ltd. Method of providing 3d image and 3d display apparatus using the same
EP2456214A2 (en) 2009-11-23 2012-05-23 Samsung Electronics Co., Ltd. Method of providing 3D image and 3D display apparatus using the same
EP2346264A2 (en) 2009-11-23 2011-07-20 Samsung Electronics Co., Ltd. Method of providing 3D image and 3D display apparatus using the same
KR20110056775A (en) 2009-11-23 2011-05-31 삼성전자주식회사 Gui providing method related to 3d image, and display apparatus and 3d image providing system using the same
EP2456216A2 (en) 2009-11-23 2012-05-23 Samsung Electronics Co., Ltd. Method of providing 3D image and 3D display apparatus using the same
US20110126159A1 (en) 2009-11-23 2011-05-26 Samsung Electronics Co., Ltd. Gui providing method, and display apparatus and 3d image providing system using the same
EP2456217A2 (en) 2009-11-23 2012-05-23 Samsung Electronics Co., Ltd. Method of providing 3D image and 3D display apparatus using the same
EP2346263A1 (en) 2009-11-23 2011-07-20 Samsung Electronics Co., Ltd. GUI providing method, and display apparatus and 3D image providing system using the same
EP2456218A2 (en) 2009-11-23 2012-05-23 Samsung Electronics Co., Ltd. Method of providing 3D image and 3D display apparatus using the same
EP2448277A2 (en) 2009-11-23 2012-05-02 Samsung Electronics Co., Ltd. GUI providing method, and display apparatus and 3D image providing system using the same
EP2448275A2 (en) 2009-11-23 2012-05-02 Samsung Electronics Co., Ltd. GUI providing method, and display apparatus and 3D image providing system using the same
EP2448276A2 (en) 2009-11-23 2012-05-02 Samsung Electronics Co., Ltd. GUI providing method, and display apparatus and 3D image providing system using the same
EP2456215A2 (en) 2009-11-23 2012-05-23 Samsung Electronics Co., Ltd. Method of providing 3D image and 3D display apparatus using the same
US20110254846A1 (en) * 2009-11-25 2011-10-20 Juhwan Lee User adaptive display device and method thereof
KR20110086415A (en) 2010-01-22 2011-07-28 엘지전자 주식회사 Image display device and operation controlling method for the same
US20110221777A1 (en) * 2010-03-10 2011-09-15 Hon Hai Precision Industry Co., Ltd. Electronic device with motion sensing function and method for executing functions based on movement of electronic device
WO2011123178A1 (en) 2010-04-01 2011-10-06 Thomson Licensing Subtitles in three-dimensional (3d) presentation
US20120001943A1 (en) * 2010-07-02 2012-01-05 Fujitsu Limited Electronic device, computer-readable medium storing control program, and control method
JP2011028263A (en) 2010-07-09 2011-02-10 Nec Casio Mobile Communications Ltd Electronic equipment and program
US8692853B2 (en) * 2010-09-01 2014-04-08 Lg Electronics Inc. Mobile terminal and method for controlling 3 dimension display thereof
US20120081359A1 (en) * 2010-10-04 2012-04-05 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20130113783A1 (en) * 2011-11-07 2013-05-09 Qualcomm Incorporated Orientation-based 3d image display

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
European Search Report dated Aug. 13, 2015 issued in counterpart application No. 13735681.2-1903, 9 pages.
European Search Report dated Jul. 17, 2018 issued in counterpart application No. 13735681.2-1209, 7 pages.
Korean Office Action dated Jan. 29, 2018 issued in counterpart application No. 10-2012-0003352, 7 pages.

Also Published As

Publication number Publication date
EP2803198B1 (en) 2021-04-28
KR20130082251A (en) 2013-07-19
EP2803198A4 (en) 2015-09-16
US20130176301A1 (en) 2013-07-11
EP2803198A1 (en) 2014-11-19
WO2013105794A1 (en) 2013-07-18
KR101899458B1 (en) 2018-09-18

Similar Documents

Publication Publication Date Title
US9826225B2 (en) 3D image display method and handheld terminal
CN105282539B (en) Curved surface multi-view image shows equipment and its control method
US10009603B2 (en) Method and system for adaptive viewport for a mobile device based on viewing angle
KR102059969B1 (en) Mobile display device
US20130009863A1 (en) Display control apparatus, display control method, and program
US20130222363A1 (en) Stereoscopic imaging system and method thereof
US9030467B2 (en) Electronic apparatus and method for displaying graphical user interface as 3D image
KR102044055B1 (en) Display unit for rotatably displaying an autostereoscopic presentation
EP3097690B1 (en) Multi-view display control
US20110149054A1 (en) 3d glasses, method for controlling 3d glasses, and method for controlling power applied thereto
CN103686133A (en) Image compensation device for open hole stereoscopic display and method thereof
CN103916655A (en) Display Apparatus And Display Method Thereof
WO2012134487A1 (en) Adaptive monoscopic and stereoscopic display using an integrated 3d sheet
CN1791230B (en) Three dimensional image display apparatus
CN104052987A (en) Image display device and image display method
US20120154559A1 (en) Generate Media
US10115220B2 (en) Method and apparatus for changing 3D display based on rotation state
CN103597824A (en) Image processing device and method thereof, and program
US20120120057A1 (en) Display Driver Circuit, Operating Method Thereof, and User Device Including the Same
TWI515457B (en) Multi-view three dimensional display system and control method thereof
US9525864B2 (en) Display apparatus and multi view providing method thereof
KR20070025221A (en) Display method by converting a 3-dimensional images to 2-dimensional images
US11145113B1 (en) Nested stereoscopic projections
KR20120071296A (en) Apparatus for cylindrical 3d image display and method thereof
KR20100044062A (en) Apparatus and method for autostereoscopic display, and personal terminal thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YEON, SU-JIN;REEL/FRAME:029686/0616

Effective date: 20130109

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4