WO2009020381A1 - Apparatus and method for three-dimensional panoramic image formation - Google Patents

Apparatus and method for three-dimensional panoramic image formation

Info

Publication number
WO2009020381A1
Authority
WO
WIPO (PCT)
Prior art keywords
generate, sensors, image, line, dimensional panoramic
Prior art date
Application number
PCT/MY2008/000084
Other languages
French (fr)
Inventor
Hock Woon Hon
Original Assignee
Mimos Berhad
Application filed by Mimos Berhad
Publication of WO2009020381A1

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/221 Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
    • H04N13/296 Synchronisation thereof; Control thereof
    • H04N2213/00 Details of stereoscopic systems
    • H04N2213/001 Constructional or mechanical details

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a method for generating a 3D panoramic image which includes the steps of: (a) capturing two video signals, one from each sensor (110, 111), that arrive in a line-by-line manner with video synchronization pulses, where the video signals are synchronized by the video synchronization pulse of each signal; (b) spatially integrating a 2D image collection on a line-by-line basis over a certain period of time depending on the rotation speed of a rotary stage (120); (c) temporally registering the video signal with reference to the first sensor (110) for a given convergence distance; (d) compensating the time difference by padding with plain image data in order to achieve a viewable disparity, the amount of padding being dependent upon the convergence distance; and (e) integrating both resultant perspective images (43, 44) to produce a single 3D panoramic image.

Description

APPARATUS AND METHOD FOR THREE-DIMENSIONAL PANORAMIC IMAGE FORMATION
The present invention relates to a method and apparatus for 3-dimensional (3D) panoramic image formation, more particularly to capturing and generating a 3D image using line-scan sensors with a single projection centre and without image stitching.
BACKGROUND TO THE INVENTION
Panoramic imaging technology has been used to merge multiple digital photographs into a single seamless 360° panoramic view of a particular scene. A camera is usually employed in such a way that a sequence of image inputs is obtained as the camera is rotated around the focal point of the camera lens, causing every two neighboring images to slightly overlap each other. The intensity values from the two neighboring images in the overlap region are weighted and then summed to form a smooth transition. The resultant panorama provides a two-dimensional (2D) description of the environment.
Generally, multiple cameras are utilized in order to obtain both intensity and 3D panoramas. There have been systems producing depth panoramic images; an example of such a system is found in U.S. Patent No. 6,023,588, which utilizes a side-by-side camera system to imitate a human viewer. Ideally, the rear nodal points of both cameras would lie on the rotation axis, but this is impossible with the side-by-side configuration. One solution displaces the cameras vertically such that the line between the rear nodal points of the cameras is aligned with the rotation axis.
In order to generate 3D images, a camera set swivels at the nodal point at a constant angular interval and produces intensity images. The 3D panorama is constructed by stitching neighboring 3D images together, just as the conventional 2D panorama is formed by stitching two neighboring intensity images together. However, problems arise when two adjacent 3D images in a sequence are merged. Distortion appears when a sequence of 3D images is used to describe the shape of an object.
U.S. Patent No. 7,010,158 describes a method of modeling and reconstructing a 3D scene wherein each 3D panoramic image is derived from a plurality of range images captured from a distinct spatial position. The 3D panoramic images are generated by positioning a camera at a first distinct spatial location; acquiring the plurality of range images of the scene by rotating the camera about a vertical axis relative to the scene, wherein there is an inter-overlap region between adjacent images; and forming a 3D panoramic image about the vertical axis from the plurality of acquired range images. A plurality of 3D panoramic images is created by repeating these steps at additional spatial positions in the scene. This approach simplifies the merging of the 3D panoramic images drastically compared to merging the entire set of individual range images. However, it still requires an image stitching process for 3D panoramic image generation.
SUMMARY OF THE INVENTION
The present invention is directed to overcoming one or more of the problems set forth above. Briefly summarized, the present invention relates to a method for generating a 3D panoramic image which comprises:
(a) capturing two video signals, one from each sensor, that arrive in a line-by-line manner with synchronization pulses, where the video signals are synchronized by the synchronization pulse of each signal;
(b) spatially integrating a 2D image collection on a line-by-line basis over a certain period of time depending on the rotation speed of a rotary stage;
(c) assigning the first video signal as a starting mark of temporal registration with reference to the first sensor for a given convergence distance;
(d) compensating the time difference by padding with plain image data in order to achieve a viewable disparity which does not exceed the maximum parallax allowable for human eyes; and
(e) integrating both the video signals to produce a single 3D panoramic image.
More specifically, the video signals are obtained by placing two rotating sensors side-by-side such that they share the same projection centre and centre of rotation. The image data are then extracted from the video signals, synchronized, and built up in system memory to form perspective two-dimensional images, which are generated after a full cycle of 360° rotation. The amount of padding with plain data is dependent upon the convergence distance.
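As a purely illustrative sketch, and not part of the patent disclosure, the following Python/NumPy fragment shows one way the column-by-column build-up and the convergence-dependent padding described above could be realised in software. The function names, the data layout, and the assumption that the number of padding columns is supplied by a separate convergence calibration are assumptions made here for illustration.

    import numpy as np

    def build_panorama(lines):
        # Stack successive line-scan readouts as columns of one perspective image.
        # lines: iterable of 1-D arrays (one per rotation step), all of length H.
        return np.stack(list(lines), axis=1)        # shape (H, lines_per_revolution)

    def pad_for_convergence(image, pad_columns, fill_value=0):
        # Compensate the temporal offset of the second sensor by prepending
        # "plain" (constant) columns; pad_columns is assumed to come from a
        # separate calibration for the chosen convergence distance.
        padding = np.full((image.shape[0], pad_columns), fill_value, dtype=image.dtype)
        return np.concatenate([padding, image], axis=1)

    def perspective_pair(lines_sensor1, lines_sensor2, pad_columns):
        left = build_panorama(lines_sensor1)        # reference view (sensor 110)
        right = pad_for_convergence(build_panorama(lines_sensor2), pad_columns)
        width = min(left.shape[1], right.shape[1])  # crop both views to a common width
        return left[:, :width], right[:, :width]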
An advantage of the present invention is that image stitching is not required to generate the 3D panoramic image, and only a single lens, i.e. one projection centre, is needed to acquire the images. A selectable coverage of the field of view is another advantage, achieved by using the tilt-able holder.
Another significant advantage of the setup that uses two line-scan sensors with a single lens is that calibration of the sensors in terms of their spatial position, roll, pitch and yaw with respect to each other is much easier, as only a single lens is used. The lens serves as a reference point for calibrating both line-scan sensors.
These and other aspects, objects, features and advantages of the present invention will be more clearly understood and appreciated from a review of the following detailed description of the preferred embodiment and appended claims, and by reference to the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
The specific features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Fig. 1 illustrates the image formation apparatus and its peripheral controllers;
Fig. 2 illustrates a field of view control of dual-sensor line-scan in action;
Fig. 3a, 3b and 3c illustrate perspective views of the tilting sensors with the goniometer;
Fig. 4 shows the simplified data processing flow and potential resultant images; and
Fig. 5 shows the time integration volume covered by sensor 1 and sensor 2.
DETAILED DESCRIPTION OF THE INVENTION
In the following description of the preferred embodiments of the present invention, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
Referring to Fig. 1, there is illustrated an embodiment of the present invention to capture a 3D panoramic image of interest. A single standard C-mount, CS-mount or F-mount lens 100 is used as a common projection centre for two high-speed line-scan sensors 110, 111 that are arranged side-by-side on the same plane, where the scene in object space is back-projected onto the sensors 110, 111 respectively. Both sensors 110, 111 are packaged into a single unit and mounted on a motorized control goniometer 135 and a rotary stage 120 (see Fig. 5), with the centre of rotation at the centre of the packaged unit.
The rotary stage 120 has two parts: the upper part is stationary and the lower part is a movable stage. A slip ring 170 is placed between the two stages, and the cables from the camera to the synchronizer are routed through the slip ring 170 to avoid entanglement.
A timing synchronizer 140 is used to synchronize the start times of the raw video signals from each sensor 110, 111 before they are captured by a signal-capturing device 150. The signal-capturing device 150 may be a frame grabber or a digital signal processor (DSP). The synchronization plays a vital role in determining the integrity of the image quality, as the video signals might not arrive at the image capture device at exactly the same time.
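For illustration only, the sketch below mimics in software what the hardware synchronizer does, assuming each captured line carries a timestamp derived from its synchronization pulse. The data layout and the half-line-period tolerance are assumptions, not details taken from the patent.

    def align_start(stream1, stream2, line_period):
        # stream1, stream2: lists of (timestamp, line_data) pairs from the two sensors.
        # Drop leading lines from whichever stream started earlier until the first
        # remaining lines of both streams lie within half a line period of each other.
        i, j = 0, 0
        while i < len(stream1) and j < len(stream2) and \
                abs(stream1[i][0] - stream2[j][0]) > line_period / 2:
            if stream1[i][0] < stream2[j][0]:
                i += 1
            else:
                j += 1
        return stream1[i:], stream2[j:]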
A laser point source 160 is used as a reference point. The laser point will be used to indicate the starting point of the scanning. This can ease the process of image registration.
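One plausible way to exploit the laser spot for registration is sketched below, under the assumption (not stated in the patent) that the spot appears as a strong intensity peak in a single line; the threshold value is hypothetical.

    import numpy as np

    def find_start_line(lines, threshold):
        # Return the index of the first line whose peak intensity exceeds the
        # threshold, i.e. the line that images the laser reference spot; this
        # index is then treated as column 0 of the panorama.
        for idx, line in enumerate(lines):
            if np.max(line) > threshold:
                return idx
        raise ValueError("laser reference spot not found")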
Fig. 2 shows that the separation of the two sensors 110, 111 can be changed by a motorized control 113 at the back of the sensors 110, 111. The motorized control 113 controls the overlapping field of view by opening up or closing the apertures of the two cameras to achieve an equal angle. With this capability, the convergence distance can be adjusted by the apparatus.
Fig. 3a shows the camera at the centre point of the goniometer 135. Figs. 3b and 3c further show the camera positioned at the upper and lower positions of the goniometer 135. The vertical field of view of both sensors 110, 111 can be controlled by the goniometer 135. The fields of view of the sensors 110, 111 can be changed by a tilt-able controller 130 (shown in Fig. 5) to capture different areas in the scene when and if necessary. The goniometer 135 is used to position the camera set so that it faces a certain direction for the desired field of view.
Referring to Fig. 4, two perspective images 40, 41 are produced at the end of a full rotation of each of the sensors 110, 111. The full resolution of the image in the x axis (the moving axis) represents the panoramic field of view of the surrounding scene. The resultant images 43, 44 show the view from 0° to 359° in object space.
Each of the resultant images 43, 44 is captured by the individual sensors 110, 111 on a line-by-line basis. The image data are stored temporarily in the memory of the signal-capturing device 150 (frame grabber) before being transferred to system memory.
The integration of the 3D panoramic images 40, 41 is carried out by interleaving the two sets of image data, either vertically or horizontally, depending on the 3D viewing device used. In the course of integrating the 3D information from the two perspective images 40, 41, calibration must be carried out to keep the parallax within the maximum allowable for human eyes, in order to reduce eyestrain caused by excessive parallax.
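The interleaving itself is straightforward. The sketch below is illustrative only, with assumed mode names; it shows row- and column-interleaving of two equally sized perspective images, the choice depending on the 3D viewing device as described above, while the parallax limit is assumed to have been handled by the padding calibration.

    import numpy as np

    def interleave(left, right, mode="row"):
        # Combine two equally sized perspective images into one frame.
        # mode="row"    : alternate horizontal lines (line-interlaced displays)
        # mode="column" : alternate vertical lines (e.g. some lenticular panels)
        if mode == "row":
            out = np.empty((left.shape[0] * 2, left.shape[1]), dtype=left.dtype)
            out[0::2], out[1::2] = left, right
        else:
            out = np.empty((left.shape[0], left.shape[1] * 2), dtype=left.dtype)
            out[:, 0::2], out[:, 1::2] = left, right
        return out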
Fig. 5 shows the time integration volume produced by both sensors 110, 111. The first sensor 110 creates a different time integration volume 200 from the second sensor 111, as the two sensors cover different perspectives.
Constant-speed rotation is achieved by the rotary stage 120; the rotation speed depends on the readout rate of the sensors 110, 111 and the resolution required. The relationship between the speed of the rotary stage 120 and the camera readout rate also determines the field of view covered, as well as the detail, the shape and the aspect ratio of the panoramic images.
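As a back-of-the-envelope illustration of that relationship (the numbers below are invented, not taken from the patent): at a constant line rate the panorama width equals the number of lines read out during one revolution, which in turn fixes the angular step per column and the aspect ratio.

    def panorama_geometry(line_rate_hz, revolution_time_s, sensor_pixels):
        # Assumed bookkeeping model: one readout line becomes one image column.
        width = int(line_rate_hz * revolution_time_s)   # columns per 360 degrees
        degrees_per_column = 360.0 / width
        aspect_ratio = width / sensor_pixels            # width : height
        return width, degrees_per_column, aspect_ratio

    # Example: a 10 kHz line rate, a 20 s revolution and a 2048-pixel sensor give
    # 200000 columns, 0.0018 degrees per column and an aspect ratio of about 98:1.
    print(panorama_geometry(10_000, 20, 2048))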
The tilt-able controller 130 serves as the main tool to tilt the sensors 110, 111, producing a different time integration volume 200 so as to generate the 3D view of interest. The images are captured digitally by high-speed line-scan CMOS or CCD (solid-state) sensors 110, 111.

Claims

1. A method to generate three-dimensional panoramic images comprising the steps of: capturing a first video signal of an object of interest by a first sensor (110) that arrives in a line-by-line manner with video synchronization pulses to differentiate two consecutive video lines; capturing a second video signal of an object of interest by a second sensor (111) that arrives in a line-by-line manner with video synchronization pulses to differentiate two consecutive video lines, which has a different perspective from the first sensor (110); spatially integrating a two-dimensional image on a line-by-line basis over a certain period of time; assigning the first video signal as a starting mark of temporal registration; detecting the offset disparity of the second sensor (111) in terms of time difference; compensating the time difference with padding image data to reduce the image disparity or parallax; and integrating both resultant perspective images (43, 44) to produce a single three-dimensional image of the scene of interest.
2. A method to generate three-dimensional panoramic images according to claim 1, wherein the first image (40) and second image (41) are visible light images if a standard visible light line-scan sensor is used as the capturing device.
3. A method to generate three-dimensional panoramic images according to claim 1, wherein the first image (40) and second image (41) are thermal images if a thermal sensor is used as the capturing device.
4. A method to generate three-dimensional panoramic images according to claim 1, wherein two perspective 360° images would be produced at the end of a full rotation of each of the sensors (110, 111).
5. An apparatus setup to generate three-dimensional panoramic images comprising: two line-scan sensors (110, 111) arranged side-by-side on the same plane to capture video signals of the object of interest; a motorized tilt-able controller (130) to alter the field of view of the sensors (110, 111) depending on the object of interest; a timing synchronizer (140) to synchronize the raw video signals from each sensor; a signal-capturing device (150); and a slip ring (170) to avoid cable entanglement during high speed rotation.
6. An apparatus setup to generate three-dimensional panoramic images according to claim 5, further comprising a rotary stage (120) for the sensors (110, 111) to capture images of the object of interest (40, 41).
7. An apparatus setup to generate three-dimensional panoramic images according to claim 5, wherein the tilt-able holder (130) can change the coverage of the field of view of the sensors (110, 111) concurrently as they are mounted on the same base plate.
8. An apparatus setup to generate three-dimensional panoramic images according to claim 5, wherein the tilt-able controller (130) is rotated by the goniometer (135) to change the angle of the sensors (110, 111) to cover a different field of view of the object of interest (40, 41).
9. An apparatus setup to generate three-dimensional panoramic images according to claim 5, wherein the motorized control (113) controls the overlapping field of view by altering the distance between the two sensors (110, 111) to adjust the convergence distance.
10. An apparatus setup to generate three-dimensional panoramic images according to claim 5, further comprising a single standard C-mount, CS-mount or F-mount lens (100) as a common projection center for both the sensors (110, 111).
11. An apparatus setup to generate three-dimensional panoramic images according to claim 5, wherein the signal capturing device (150) is a digital signal processor or frame grabber.
12. An apparatus setup to generate three-dimensional panoramic images according to claim 5, further comprising a laser pointer (160) to set a reference starting point for the images of interest (40, 41) on the resultant images (43, 44) to ease the image registration process.
PCT/MY2008/000084 2007-08-07 2008-08-07 Apparatus and method for three-dimensional panoramic image formation WO2009020381A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI20071300 MY152181A (en) 2007-08-07 2007-08-07 Apparatus and method for three dimensional panoramic image formation
MYPI20071300 2007-08-07

Publications (1)

Publication Number Publication Date
WO2009020381A1 (en) 2009-02-12

Family

ID=40341519

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2008/000084 WO2009020381A1 (en) 2007-08-07 2008-08-07 Apparatus and method for three-dimensional panoramic image formation

Country Status (2)

Country Link
MY (1) MY152181A (en)
WO (1) WO2009020381A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030062727A (en) * 2002-01-18 2003-07-28 고윤화 Camera with pan/tilt
US20050029458A1 (en) * 2003-08-04 2005-02-10 Z Jason Geng System and a method for a smart surveillance system
US20060215031A1 (en) * 2005-03-14 2006-09-28 Ge Security, Inc. Method and system for camera autocalibration
US20070115351A1 (en) * 2005-11-18 2007-05-24 Mccormack Kenneth Methods and systems for enhanced motion detection

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8605783B2 (en) 2009-05-22 2013-12-10 Microsoft Corporation Composite video generation
CN107093210A (en) * 2017-04-20 2017-08-25 北京图森未来科技有限公司 A kind of laser point cloud mask method and device
US11455772B2 (en) 2017-04-20 2022-09-27 Beijing Tusen Zhitu Technology Co., Ltd. Method and device of labeling laser point cloud
CN108507466A (en) * 2018-03-29 2018-09-07 大连理工大学 The method that three-dimensional precise information is obtained using two-dimentional line laser scanner
CN108507466B (en) * 2018-03-29 2019-06-21 大连理工大学 The method that three-dimensional precise information is obtained using two-dimentional line laser scanner
CN112104859A (en) * 2019-06-17 2020-12-18 苏州思善齐自动化科技有限公司 Three-dimensional vision sensor and use method thereof
CN111818320A (en) * 2020-04-23 2020-10-23 湖南傲英创视信息科技有限公司 Week-sweeping stereoscopic panoramic video acquisition system for VR live broadcast
CN111836032A (en) * 2020-04-23 2020-10-27 湖南傲英创视信息科技有限公司 Three-dimensional panoramic video acquisition system is swept in week

Also Published As

Publication number Publication date
MY152181A (en) 2014-08-29

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08793801

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08793801

Country of ref document: EP

Kind code of ref document: A1