TWI730482B - Plane dynamic detection system and detection method - Google Patents
Plane dynamic detection system and detection method
- Publication number
- TWI730482B (application TW108139370A)
- Authority
- TW
- Taiwan
- Prior art keywords
- plane
- depth
- continuously
- inertial sensor
- camera
- Prior art date
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The present invention discloses a plane dynamic detection system and detection method. An inertial sensor continuously acquires inertial data, and a depth camera continuously captures a depth image of a physical object (such as a plane or the ground) within its viewing range. A computing device is configured to continuously determine whether the acceleration and angular velocity acquired by the inertial sensor exceed a threshold, so as to judge the motion state of the inertial sensor itself or of the device carrying it. Based on the acceleration, the depth image coordinates, the depth values, and the intrinsic parameter matrix, the computing device initializes or continuously updates a plane equation of the physical object in the camera coordinate system while the inertial sensor is in a stable state; it can also obtain the pose of the depth camera through a VIO algorithm, so as to continuously correct the plane equation while the inertial sensor is moving rapidly.
Description
The present invention relates to computer vision technology, and in particular to a plane dynamic detection system and detection method that can refer to depth images, color images, and inertial data to accurately detect a plane and dynamically update the relative position of the plane in three-dimensional space.
To provide more realistic interaction in applications that require 3D information (such as AR/VR services), detecting planes in the real world is critical. If the goal is to detect the plane that belongs to the ground, the ground can be detected by: (a) assuming that the ground is the largest plane and using the RANSAC (Random Sample Consensus) algorithm, or the Hough Transform algorithm, to find the largest plane in three-dimensional space and defining it as the ground; or (b) assuming that the ground has the largest Z value on each scan line of the image and, after correcting the camera pose (roll rotation), defining the set of pixels with the largest Z values that fit a curve C as the ground.
However, in many cases the largest plane assumed by method (a) is not the ground (for example, the largest plane in the image may be the wall of a corridor), so the RANSAC or Hough Transform algorithm may reach a wrong conclusion. In addition, the RANSAC algorithm requires that inliers account for at least 50% of the data, and the Hough Transform algorithm is quite time-consuming. Method (b) may likewise select a set of pixels with the largest Z values that fit curve C but do not actually belong to the ground.
Furthermore, regardless of which method is used to detect a plane in the image, after a depth sensor (such as a depth camera) captures a depth image, the conventional practice of the Point Cloud Library (PCL) is to multiply each pixel acquired by the depth sensor with an inverse camera projection matrix and a depth value, so as to convert it into a three-dimensional coordinate in the point cloud coordinate system, as shown in the following relation: P = Z·K⁻¹·(u, v, 1)ᵀ, where P is the three-dimensional coordinate in the point cloud coordinate system, Z is the depth value, K⁻¹ is the inverse camera projection matrix, K is usually an intrinsic parameter (an intrinsic parameter is an inherent property of the depth sensor, mainly describing the transformation between camera coordinates and image coordinates), and (u, v) is the image coordinate of each pixel of the depth image (in the image coordinate system). The set of feature points given by these three-dimensional coordinates is then presented as a point cloud, after which a method such as (a) or (b) above is used to detect planes in the point cloud. However, performing a matrix multiplication for every pixel involves a very large amount of computation and therefore poor computational performance.
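By way of illustration only (and not as part of the present invention), the conventional per-pixel back-projection described above may be sketched in Python/NumPy as follows; the intrinsic values and image size are assumptions introduced for this example:

```python
import numpy as np

def depth_to_point_cloud(depth, K):
    """Conventional PCL-style conversion: back-project every pixel as P = Z * K^-1 * (u, v, 1)^T."""
    K_inv = np.linalg.inv(K)
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pixels = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T   # 3 x (h*w)
    return ((K_inv @ pixels) * depth.reshape(-1)).T                        # (h*w) x 3 point cloud

K = np.array([[525.0, 0.0, 320.0],                      # assumed intrinsic parameter matrix
              [0.0, 525.0, 240.0],
              [0.0,   0.0,   1.0]])
depth = np.random.uniform(0.5, 4.0, size=(480, 640))    # synthetic depth image
cloud = depth_to_point_cloud(depth, K)
```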
In summary, conventional approaches for detecting planes in three-dimensional space must first make strong assumptions for different plane types (such as the ground or walls), which may lead to misjudging the plane type, and they also suffer from poor computational performance. Accordingly, how to propose a plane detection system and detection method that detects planes more accurately while consuming fewer computing resources is a problem that remains to be solved.
To achieve the above objective, the present invention proposes a plane dynamic detection system comprising an inertial sensor, a depth camera, and a computing device. The inertial sensor includes an accelerometer and a gyroscope. The depth camera continuously captures a depth image, so as to continuously provide a depth image coordinate and a depth value of one or more physical objects within the viewing range of the depth camera. The computing device is coupled to the inertial sensor and the depth camera and has a motion state determination unit and a plane detection unit. The motion state determination unit continuously determines whether the acceleration information and the angular velocity information obtained by the inertial sensor exceed a threshold. If the threshold is not exceeded, the plane detection unit calculates a normal vector and a distance constant from the acceleration information, the depth image coordinates, the depth values, and an intrinsic parameter matrix, and uses the normal vector and the distance constant to initialize or continuously update a plane equation of the physical object in a camera coordinate system while the inertial sensor is in a stable state. Conversely, if the threshold is exceeded, the plane detection unit executes a visual inertial odometry algorithm based on the gravitational acceleration in the acceleration information to obtain the pose of the depth camera, and continuously corrects the plane equation during rapid movement of the inertial sensor based on a rotation matrix and a translation of that pose. The meaning of a plane equation is that any point on a plane together with the normal perpendicular to that plane uniquely defines the plane in three-dimensional space.
To achieve the above objective, the present invention also proposes a plane dynamic detection method, comprising:
(1) a step of detecting inertial data: an inertial sensor continuously obtains inertial data such as acceleration information and angular velocity information;
(2) a step of determining the motion state: a computing device continuously determines whether the acceleration information and the angular velocity information obtained by the inertial sensor exceed a threshold, so as to determine the motion state of the inertial sensor;
(3) a first plane-equation updating step: if the threshold is not exceeded, the computing device calculates a normal vector and a distance constant from the acceleration information, the depth image coordinates, the depth values, and an intrinsic parameter matrix, and uses the normal vector and the distance constant to initialize or continuously update a plane equation of the physical object in a camera coordinate system while the inertial sensor is in a stable state; and
(4) a second plane-equation updating step: if the threshold is exceeded, the computing device executes a visual inertial odometry algorithm based on the gravitational acceleration in the acceleration information to obtain the pose of the depth camera, and continuously corrects the plane equation during rapid movement of the inertial sensor based on a rotation matrix and a translation of that pose.
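By way of illustration only, the decision flow of steps (1) to (4) may be sketched as follows; the threshold values, the simplified update rules, and all variable names are assumptions introduced for this sketch and do not represent the claimed implementation:

```python
import numpy as np

GRAVITY = 9.8
ACC_THRESHOLD, GYRO_THRESHOLD = 0.5, 0.2       # assumed example thresholds

def detect_plane_once(acc, gyro, ground_points, plane, vio_pose):
    """One pass of steps (2)-(4). plane = (n, d) for n . P + d = 0 in camera coordinates;
    ground_points is an (N, 3) array of candidate ground points from the depth image;
    vio_pose = (R, t) is the relative camera pose estimated by VIO."""
    n, d = plane
    stable = (abs(np.linalg.norm(acc) - GRAVITY) < ACC_THRESHOLD
              and np.linalg.norm(gyro) < GYRO_THRESHOLD)            # step (2)
    if stable:                                                      # step (3): stable state
        n = -acc / np.linalg.norm(acc)                              # normal opposite to gravity
        d = (-ground_points @ n).min()                              # offset of the farthest candidate plane
    else:                                                           # step (4): rapid movement
        R, t = vio_pose
        n = R @ n
        d = d - n @ t
    return n, d
```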
In order that the Examiner may clearly understand the purpose, technical features, and effects of the present invention after implementation, the following description is provided together with the drawings; please refer to them.
Please refer to Figure 1, which is a system architecture diagram of the present invention. The present invention proposes a plane dynamic detection system 1, which mainly comprises an inertial sensor 10, a depth camera 20, and a computing device 30, wherein:
(1) The inertial sensor (Inertial Measurement Unit, IMU) 10 includes an accelerometer (G-sensor) 101 and a gyroscope 102, and continuously obtains acceleration information and angular velocity information;
(2) The depth camera 20 continuously captures a depth image, so as to continuously provide a depth image coordinate and a depth value of one or more physical objects within the viewing range of the depth camera 20. The depth camera 20 may be configured as a depth sensor that measures the depth of the aforementioned physical objects using a time-of-flight (ToF) scheme, a structured light scheme, or a stereo vision scheme. In the time-of-flight scheme, the depth camera 20 acts as a ToF camera and emits infrared light with a light-emitting diode (LED) or a laser diode (LD); when the light striking the surface of the physical object is reflected back, since the speed of light is known, an infrared image sensor can measure the time it takes for light to return from positions of the physical object at different depths, from which the depth of the physical object at different positions and the depth image of the physical object can be deduced. In the structured light scheme, the depth camera 20 uses a laser diode (LD) or a digital light processor (DLP) to project different light patterns, which are diffracted through a specific grating onto the surface of the physical object to form a speckle pattern; because the speckle patterns reflected from positions of the physical object at different depths are distorted, the three-dimensional structure of the physical object and its depth image can be inferred once the reflected light enters the infrared image sensor. In the stereo vision scheme, the depth camera 20 acts as a stereo camera and captures the physical object with at least two lenses; from the resulting disparity, the three-dimensional information (depth image) of the physical object is measured through the principle of triangulation;
(3) The computing device 30 is coupled to the inertial sensor 10 and the depth camera 20, and has a motion state determination unit 301 and a plane detection unit 302 that are communicatively connected. The motion state determination unit 301 is configured to continuously determine whether the acceleration information and the angular velocity information obtained by the inertial sensor 10 exceed a threshold, so as to determine the motion state of the inertial sensor 10 itself or of the device carrying it. It is worth noting that the computing device 30 may have at least one processor (not shown, for example a CPU or MCU) for running the computing device 30, providing functions such as logic operations, temporarily storing operation results, and keeping track of the positions of instructions being executed. In addition, the motion state determination unit 301 and the plane detection unit 302 may run on the computing device 30 of a plane dynamic device (not shown, for example a head-mounted display, which may be a VR helmet, an MR helmet, or another head-mounted display), a host, a physical server, or a virtualized server (VM), but none of these is limiting;
(4) Continuing from the above, if the threshold is not currently exceeded, the plane detection unit 302 is configured to calculate a normal vector and a distance constant (d value) from the acceleration information, the depth image coordinates (pixel domain), the depth values, and an intrinsic parameter matrix, and to use the normal vector and the distance constant (which correspond to the image coordinate system) to initialize or continuously update a plane equation (3D plane equation) of the physical object in a camera coordinate system while the inertial sensor 10 is in a stable state. The meaning of a plane equation is that any point on a plane together with the normal perpendicular to that plane uniquely defines the plane in three-dimensional space, as shown in the sketch after this list;
(5) Conversely, if the threshold is currently exceeded, the plane detection unit 302 is configured to execute a filter-based or optimization-based visual inertial odometry (VIO) algorithm based on the gravitational acceleration in the acceleration information to obtain the pose of the depth camera 20, and to continuously correct the plane equation during rapid movement of the inertial sensor 10 based on a rotation matrix (orientation matrix) and a translation of that pose;
(6) In addition, the aforementioned image coordinates are introduced to describe the projection relationship of a physical object from the camera coordinate system to the image coordinate system during imaging; they form the coordinate system, in units of pixels, of the image actually read from the depth camera 20. The aforementioned camera coordinates form the coordinate system established with the depth camera 20 as the origin, defined in order to describe the position of objects from the viewpoint of the depth camera 20.
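By way of illustration of item (4) above (values are fabricated for the example), a plane equation may be constructed from a point and a normal vector as follows:

```python
import numpy as np

def plane_from_point_normal(point, normal):
    """Return (n, d) for the plane n . P + d = 0 through `point` with unit normal `normal`."""
    n = normal / np.linalg.norm(normal)
    d = -n @ point
    return n, d

n, d = plane_from_point_normal(point=np.array([0.0, -1.5, 2.0]),
                               normal=np.array([0.0, 1.0, 0.0]))
print(n, d)
print(n @ np.array([3.0, -1.5, 5.0]) + d)   # ~0 for another point on the same plane
```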
Please refer to "Figure 1", which is a system architecture diagram of the present invention. The present invention proposes a planar
Please continue to refer to Figure 1. In a preferred embodiment of the present invention, the plane detection unit 302 of the computing device 30 may also perform an inner-product operation on the depth image coordinates and the depth values of the physical object, so as to continuously generate three-dimensional coordinates of the physical object in an image coordinate system, and compute the plane equation from these three-dimensional coordinates and the intrinsic parameter matrix.
Please continue to refer to Figure 1. In a preferred embodiment of the present invention, the plane detection unit 302 of the computing device 30 may also apply an iterative optimization algorithm or a Gauss-Newton algorithm to the aforementioned normal vector to obtain an optimal normal vector and its corresponding distance constant (d value), and compute a more precise plane equation with the optimal normal vector in place of the aforementioned normal vector.
Please refer to Figures 2 and 3, which are flowcharts (1) and (2) of the plane dynamic detection method of the present invention, together with Figure 1. The present invention proposes a plane dynamic detection method S, which may include the following steps:
(1) an image capturing step (step S10): a depth camera 20 continuously captures a depth image, so as to continuously provide a depth image coordinate and a depth value of one or more physical objects within the viewing range of the depth camera 20;
(2) an inertial data detecting step (step S20): an inertial sensor 10 continuously obtains inertial data such as acceleration information and angular velocity information;
(3) a motion state determining step (step S30): a computing device 30 continuously determines whether the acceleration information and the angular velocity information obtained by the inertial sensor 10 exceed a threshold, so as to determine the motion state of the inertial sensor 10 itself or of the device carrying it;
(4) a first plane-equation updating step (step S40): following step S30, if the threshold is not exceeded, the computing device 30 calculates a normal vector and a distance constant (corresponding to the image coordinate system) from the acceleration information, the depth image coordinates, the depth values, and an intrinsic parameter matrix, and uses the normal vector and the distance constant to initialize or continuously update a plane equation of the physical object in a camera coordinate system while the inertial sensor 10 is in a stable state;
(5) a second plane-equation updating step (step S50): following step S30, if the threshold is exceeded, the computing device 30 executes a visual inertial odometry algorithm based on the gravitational acceleration in the acceleration information to obtain the pose of the depth camera 20, and continuously corrects the plane equation during rapid movement of the inertial sensor 10 based on a rotation matrix and a translation of that pose.
Continuing, please refer to Figures 2 and 3 together with Figure 1. When step S40 is executed, taking the ground as the plane type to be detected as an example, if the inertial data of the inertial sensor 10 does not exceed the threshold, that is, if the inertial sensor 10 itself or the device carrying it is in a stable state (for example, stationary), then the inertial sensor 10 reads only the static acceleration g (the gravity force direction), and the direction opposite to it is the normal vector n of the plane equation of the physical object in camera coordinates. The relations may be expressed as follows:
(1) static acceleration of the inertial sensor 10: ‖g‖ = 9.8 m/s² (or approximately 10 m/s²);
(2) normal vector of the plane equation in camera coordinates: n = −g/‖g‖, the unit vector opposite to the measured gravity direction;
(3) accordingly, the normal vector n_uv of the physical object (ground) of the depth image, expressed in the image coordinate system, can be written as n_uv = K⁻ᵀ·n, where K is the intrinsic parameter matrix of the depth camera 20 (this is consistent with the relation n = Kᵀ·n_uv derived below).
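A brief numerical sketch of the above relations (the intrinsic values and accelerometer reading are assumed for the example):

```python
import numpy as np

K = np.array([[525.0, 0.0, 320.0],      # assumed intrinsic parameter matrix
              [0.0, 525.0, 240.0],
              [0.0,   0.0,   1.0]])

g = np.array([0.05, -9.81, 0.10])       # accelerometer reading in the stable state
n = -g / np.linalg.norm(g)              # (2): camera-coordinate normal of the ground plane
n_uv = np.linalg.inv(K).T @ n           # (3): the same plane's normal in the image coordinate system
print(n, n_uv)
```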
Continuing, please refer to Figures 2 and 3 together with Figure 1. When step S50 is executed, again taking the ground as the plane type to be detected as an example, when the inertial sensor 10 is in violent or rapid motion, the normal vector of the plane equation can no longer be estimated from the readings of the accelerometer 101. Therefore, when the aforementioned step S50 is executed, a filter-based or optimization-based VIO algorithm, for example, can be used to update the plane equation of the physical object (ground). Assume that the relative pose motion of the depth camera 20 estimated by VIO is (R, t), where R is a rotation matrix and t a translation mapping a point P in the previous camera frame to P' = R·P + t in the current camera frame, and assume that the plane equation before the update is nᵀ·P + d = 0; the plane equation afterwards is then updated according to the following relations, these being examples only and not limiting: n' = R·n and d' = d − n'ᵀ·t.
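A minimal sketch of this update rule (the pose values are fabricated for the example):

```python
import numpy as np

def transform_plane(n, d, R, t):
    """Update the plane n . P + d = 0 under the relative pose P' = R P + t."""
    n_new = R @ n
    d_new = d - n_new @ t
    return n_new, d_new

# Fabricated example: the camera rotates 10 degrees about its x axis and moves 0.2 m along z
theta = np.deg2rad(10.0)
R = np.array([[1, 0, 0],
              [0, np.cos(theta), -np.sin(theta)],
              [0, np.sin(theta),  np.cos(theta)]])
t = np.array([0.0, 0.0, 0.2])
n, d = np.array([0.0, -1.0, 0.0]), 1.5      # example plane parameters: n . P + d = 0
print(transform_plane(n, d, R, t))
```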
In addition, please continue to refer to Figures 2 and 3 together with Figure 1. In a preferred embodiment of the present invention, if the system targets the ground as the plane type to be detected, then even if the motion state determination unit 301 judges that the inertial sensor 10 is in a stable state when step S40 is executed, the inertial sensor 10 itself or the device carrying it may not be completely stationary, and the physical object (the ground itself) may also be somewhat tilted. Therefore, during the aforementioned step S40, the computing device 30 may further apply an iterative optimization algorithm or a Gauss-Newton algorithm (for example, Gauss-Newton least squares) to the normal vector, so as to obtain an optimal normal vector n* and its corresponding distance constant (d value), and compute the plane equation with the optimal normal vector n* in place of the normal vector n. More specifically, the formulas by which the plane detection unit 302 of the computing device 30 computes the optimal normal vector n* may be as follows, these being examples only and not limiting:
(1) First, pixels of the depth image whose depth value exceeds a certain value are excluded; then, using the aforementioned normal vector (here temporarily denoted n_uv, which corresponds to the image coordinate system) and the remaining depth image coordinates, the corresponding d values are computed according to d_i = −n_uvᵀ·(u_i·Z_i, v_i·Z_i, Z_i)ᵀ.
(2) Next, the d value of the physical object (ground) is assumed to be the smallest among all physical objects (other planes) of the depth image whose normal vector is n_uv, because the ground should be the plane farthest from the depth camera 20; the d value corresponding to the plane farthest from the depth camera 20 is therefore obtained as d = min_i d_i.
(3) Thereafter, the plane detection unit 302 further applies an iterative optimization algorithm or a Gauss-Newton algorithm to the normal vector to obtain the optimal normal vector n* that minimizes an error function (also called an evaluation function); before doing so, an error function E(n_uv) and a threshold ε are first defined. A sketch of one such refinement is given after this paragraph.
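The following is one possible sketch of such a refinement, under the assumption that the error function is the sum of squared point-to-plane residuals over points within the threshold; here each iteration solves the least-squares step with an SVD plane fit rather than an explicit Gauss-Newton update, and the threshold value and the pre-filtering of candidate ground points are assumptions introduced for the example:

```python
import numpy as np

def refine_ground_plane(points, n0, eps=0.05, iters=5):
    """Refine a ground-plane estimate from candidate ground points.
    points: (N, 3) array of 3D coordinates (already filtered by depth);
    n0: initial normal (e.g. opposite to gravity); eps: assumed inlier threshold."""
    n = n0 / np.linalg.norm(n0)
    d = (-points @ n).min()                     # farthest plane along n (smallest d, per the text above)
    for _ in range(iters):
        residuals = points @ n + d
        inliers = points[np.abs(residuals) < eps]
        if len(inliers) < 3:
            break
        centroid = inliers.mean(axis=0)
        # Least-squares plane fit: normal = right singular vector of the smallest singular value
        _, _, vt = np.linalg.svd(inliers - centroid)
        n_new = vt[-1]
        if n_new @ n < 0:                       # keep the orientation consistent with n0
            n_new = -n_new
        n = n_new
        d = -centroid @ n
    return n, d
```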
Please continue to refer to Figures 2 and 3 together with Figure 1. Taking the ground as the plane type to be detected as an example, the formulas by which the plane detection unit 302 of the computing device 30 computes the aforementioned normal vector may be as follows, it being stated in advance that they are not limiting:
A. Assume that N pixels in the depth image belong to the ground;
B. Assume that a pixel in the depth image has coordinates (u_i, v_i) and depth value Z_i, so that its three-dimensional coordinate in the image coordinate system is P_uv,i = (u_i·Z_i, v_i·Z_i, Z_i) and its three-dimensional coordinate in the camera coordinate system is P_c,i = (X_i, Y_i, Z_i); then:
C. The Z value of the i-th point is the same in the two aforementioned coordinate systems, and the two three-dimensional coordinates are related between the camera coordinate system and the image coordinate system by Z_i·(u_i, v_i, 1)ᵀ = K·(X_i, Y_i, Z_i)ᵀ;
D. The three-dimensional coordinates in the camera coordinate system and in the image coordinate system can therefore be related through the intrinsic parameter matrix K of the depth camera 20; writing K = [[f_x, 0, c_x], [0, f_y, c_y], [0, 0, 1]] and expanding the above formula, the x and y values of the depth image coordinate of the i-th point in the image coordinate system are u_i·Z_i = f_x·X_i + c_x·Z_i and v_i·Z_i = f_y·Y_i + c_y·Z_i, respectively;
E. By the definition of a plane equation, and assuming that the plane on which the physical object lies contains the aforementioned i-th point, the plane equation computed with P_c,i in the camera coordinate system is n_x·X_i + n_y·Y_i + n_z·Z_i + d = 0;
F. Accordingly, the normal vector in the camera coordinate system is n = (n_x, n_y, n_z);
G. By the definition of a plane equation, and assuming that the plane on which the physical object lies contains the aforementioned i-th point, the plane equation computed with P_uv,i in the image coordinate system is a₁·u_i·Z_i + a₂·v_i·Z_i + a₃·Z_i + e = 0;
H. Accordingly, the normal vector in the image coordinate system is n_uv = (a₁, a₂, a₃);
I. Next, to obtain the normal vector of the physical object (plane) in the camera coordinate system, assume that two points P_uv,1 and P_uv,2 both lie on that plane; they then satisfy the plane equation of item G, and substituting them gives a₁·u_1·Z_1 + a₂·v_1·Z_1 + a₃·Z_1 + e = 0 and a₁·u_2·Z_2 + a₂·v_2·Z_2 + a₃·Z_2 + e = 0, respectively;
J. Subtracting the two plane equations gives a₁·(u_1·Z_1 − u_2·Z_2) + a₂·(v_1·Z_1 − v_2·Z_2) + a₃·(Z_1 − Z_2) = 0;
K. Next, substituting the x and y values of item D (the depth image coordinates of the i-th point in the image coordinate system) into the equation of item J gives a₁·f_x·(X_1 − X_2) + a₂·f_y·(Y_1 − Y_2) + (a₁·c_x + a₂·c_y + a₃)·(Z_1 − Z_2) = 0;
L. Therefore, the normal vector of the plane equation of the physical object in the camera coordinate system is n = Kᵀ·n_uv = (a₁·f_x, a₂·f_y, a₁·c_x + a₂·c_y + a₃).
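A quick numerical check of the relation of item L (assumed intrinsic values, for illustration only):

```python
import numpy as np

fx, fy, cx, cy = 525.0, 525.0, 320.0, 240.0            # assumed intrinsic values
K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1.0]])

n_uv = np.array([0.01, -0.9, 0.3])                     # a normal in the image coordinate system
n_cam = K.T @ n_uv                                     # item L: n = K^T . n_uv
P_cam = np.array([0.4, 1.2, 2.5])                      # an arbitrary camera-coordinate point
P_uv = K @ P_cam                                       # its image-coordinate form (u*Z, v*Z, Z)
print(np.isclose(n_uv @ P_uv, n_cam @ P_cam))          # True: the two plane expressions agree
```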
Continuing, please refer to Figures 2 and 3 together with Figure 1. After the computing device 30 has computed the normal vector n of the plane equation of the physical object in the camera coordinate system, the formulas for the subsequent computation of the d value may be as follows, it being stated in advance that they are not limiting:
M. First, a constant c is defined (for example, c = ‖Kᵀ·n_uv‖, the norm of the camera-coordinate normal obtained in item L, which normalizes that normal to unit length);
N. Substituting a pixel point P_uv,i = (u_i·Z_i, v_i·Z_i, Z_i) of the image coordinate system into the plane equation of item G gives a₁·u_i·Z_i + a₂·v_i·Z_i + a₃·Z_i + e = 0;
O. Substituting the x and y values of item D (the depth image coordinates of the i-th point in the image coordinate system) into the plane equation of item N gives a₁·f_x·X_i + a₂·f_y·Y_i + (a₁·c_x + a₂·c_y + a₃)·Z_i + e = 0, that is, (Kᵀ·n_uv)ᵀ·P_c,i + e = 0;
P. Dividing both sides of the plane equation of item O by c gives (Kᵀ·n_uv/c)ᵀ·P_c,i + e/c = 0, in which the coefficient vector is the unit normal of the plane;
Q. Accordingly, the d value of the plane equation of the physical object in camera coordinates is d = e/c.
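A short numerical sketch of items M to Q, continuing the assumed intrinsic values above (the definition of the constant c as a normalizing norm is an assumption, as noted in item M):

```python
import numpy as np

fx, fy, cx, cy = 525.0, 525.0, 320.0, 240.0
K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1.0]])

n_uv = np.array([0.01, -0.9, 0.3])    # image-coordinate normal (a1, a2, a3)
e = 0.8                               # image-coordinate offset from the plane fit
c = np.linalg.norm(K.T @ n_uv)        # item M (assumed definition): normalizing constant
n_hat = (K.T @ n_uv) / c              # unit normal of the plane in camera coordinates
d = e / c                             # item Q: d value in camera coordinates
print(n_hat, d)
```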
In addition, please continue to refer to Figures 2 and 3 together with Figure 1. In a preferred embodiment of the present invention, before the aforementioned step S30 is executed, a step of obtaining three-dimensional coordinates (step S25) may first be executed: the computing device 30 performs an inner-product operation on the depth image coordinates and the depth values of the physical object, so as to continuously generate three-dimensional coordinates of the physical object in an image coordinate system. Accordingly, when step S40 or step S50 is executed, the aforementioned normal vector and distance constant can be computed from these three-dimensional coordinates, the intrinsic parameter matrix, and the acceleration information, and the plane equation of the physical object can then be computed. More specifically, the formula for generating the aforementioned three-dimensional coordinates may be P_uv = (u·Z, v·Z, Z), where P_uv is the three-dimensional coordinate in the image coordinate system, Z is the depth value, and (u, v) is the depth image coordinate (in the image coordinate system). In this way, compared with the conventional Point Cloud Library (PCL) practice of multiplying every pixel obtained by the depth camera 20 with an inverse camera projection matrix (namely the inverse K⁻¹ of the aforementioned K) and a depth value to convert it into three-dimensional coordinates in the point cloud coordinate system, this embodiment can omit the matrix operations between the pixels, the depth values, and the inverse camera projection matrix and detect the physical object (plane) directly with the aforementioned three-dimensional coordinates, thereby achieving the beneficial effect of reducing the amount of computation while also saving the time of converting the depth image into a point cloud.
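A minimal sketch of step S25 (array shapes and values are assumed for the example):

```python
import numpy as np

def image_coordinate_points(depth):
    """Step S25 sketch: build (u*Z, v*Z, Z) for every pixel without multiplying by K^-1."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    return np.stack([u * depth, v * depth, depth], axis=-1)   # shape (h, w, 3)

depth = np.random.uniform(0.5, 4.0, size=(480, 640))          # synthetic depth image
pts_uv = image_coordinate_points(depth)
```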
Please refer to Figure 4, which is a system architecture diagram of another preferred embodiment of the present invention. This embodiment is similar to the technology disclosed in Figures 1 to 3; the main difference is that the plane dynamic detection system 1 of this embodiment may further include a color camera 40 (for example, an RGB camera) coupled to the depth camera 20 and the computing device 30 and used to continuously capture a color image of the physical object, so that when step S10 (the image capturing step) is executed, the computing device 30 can establish the correspondence between the depth image coordinates and the color image coordinates of the physical object to improve the accuracy of plane detection. The color camera 40 of this embodiment may also form an RGB-D camera together with the depth camera 20, as shown in this figure, and the depth camera 20 of this embodiment may be a stereo camera, but neither is limiting.
In summary, once the present invention is implemented, it solves the problem that conventional detection of planes in three-dimensional space requires strong assumptions about different plane types and may therefore misjudge planes, and it also improves on the poor computational performance of conventional plane detection methods, thereby achieving the beneficial effects of more accurate plane detection and lower consumption of computing resources.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of implementation of the present invention; any equivalent changes and modifications made by those skilled in the art without departing from the spirit and scope of the present invention shall be covered by the patent scope of the present invention.
To sum up, the present invention satisfies the patentability requirements of industrial applicability, novelty, and inventive step; the applicant therefore files this application for an invention patent with your Office in accordance with the provisions of the Patent Act.
1 Plane dynamic detection system
10 Inertial sensor 101 Accelerometer
102 Gyroscope
20 Depth camera
30 Computing device 301 Motion state determination unit
302 Plane detection unit
40 Color camera
S Plane dynamic detection method
S10 Image capturing step
S20 Inertial data detecting step
S25 Three-dimensional coordinate obtaining step
S30 Motion state determining step
S40 First plane-equation updating step
S50 Second plane-equation updating step
Figure 1 is a system architecture diagram of the present invention.
Figure 2 is a flowchart (1) of the plane detection method of the present invention.
Figure 3 is a flowchart (2) of the plane detection method of the present invention.
Figure 4 is a system architecture diagram of another preferred embodiment of the present invention.
1 Plane dynamic detection system
10 Inertial sensor 101 Accelerometer
102 Gyroscope
20 Depth camera
30 Computing device 301 Motion state determination unit
302 Plane detection unit
Claims (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW108139370A TWI730482B (en) | 2019-10-31 | 2019-10-31 | Plane dynamic detection system and detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
TW202119359A TW202119359A (en) | 2021-05-16 |
TWI730482B true TWI730482B (en) | 2021-06-11 |
Family
ID=77020967
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW108139370A TWI730482B (en) | Plane dynamic detection system and detection method | 2019-10-31 | 2019-10-31 |
Country Status (1)
Country | Link |
---|---|
TW (1) | TWI730482B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230394782A1 (en) * | 2022-06-07 | 2023-12-07 | Htc Corporation | Method for determining floor plane and host |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015138822A1 (en) * | 2014-03-14 | 2015-09-17 | Qualcomm Incorporated | Sensor-based camera motion detection for unconstrained slam |
TW201915445A (en) * | 2017-10-13 | 2019-04-16 | 緯創資通股份有限公司 | Locating method, locator, and locating system for head-mounted display |
US20190187783A1 (en) * | 2017-12-18 | 2019-06-20 | Alt Llc | Method and system for optical-inertial tracking of a moving object |
CN110246177A (en) * | 2019-06-25 | 2019-09-17 | 上海大学 | Automatic wave measuring method based on vision |
- 2019-10-31 TW TW108139370A patent/TWI730482B/en active
Also Published As
Publication number | Publication date |
---|---|
TW202119359A (en) | 2021-05-16 |