TWI553531B - Optical touch device and method for calculating coordinate of touch point - Google Patents
- Publication number: TWI553531B
- Application number: TW102143861A
- Authority
- TW
- Taiwan
Landscapes
- Position Input By Displaying (AREA)
Description
The present invention relates to an optical touch device and a method for calculating the coordinates of a touch point, and more particularly to an optical touch device and touch-point coordinate calculation method capable of preventing a touch trajectory from drifting.
Since current consumer electronic products are all designed to be light, thin, short, and small, there is no room on such products for traditional input tools such as a mouse or a keyboard. With the advance of optical touch technology, optical touch devices have been widely adopted as data input tools in various consumer electronic products, such as displays, all-in-one computers, mobile phones, and personal digital assistants (PDAs). At present, compared with other touch technologies such as resistive, capacitive, ultrasonic, or projected-image types, optical touch devices offer lower cost and are easier to realize, especially in the field of large-sized touch displays.
A conventional optical touch device uses two oppositely disposed image sensing units to sense a touch point indicated by a touch object (e.g., a finger or a stylus) on an indication plane. When the image sensing units sense the touch object on the indication plane, a processing unit of the optical touch device calculates the coordinates of the touch point indicated by the touch object. However, limited by the resolving capability of the image sensing units, the optical touch device cannot be made very large.
Please refer to FIG. 1, which is a schematic diagram of an optical touch device 5 of the prior art. As shown in FIG. 1, the optical touch device 5 has four image sensing units 52, 54, 56, 58 disposed along the middle of an indication plane 50, wherein the image sensing units 52, 58 sense the touch area of the right half of the indication plane 50, and the image sensing units 54, 56 sense the touch area of the left half of the indication plane 50. In this manner, a large-sized optical touch application can be realized.
As shown in FIG. 1, a central touch area 500 is defined between the image sensing units 52, 54, 56, 58, wherein the line connecting the image sensing units 52, 58 forms a left boundary 502 of the central touch area 500, and the line connecting the image sensing units 54, 56 forms a right boundary 504 of the central touch area 500. In general, a touch operation in the central touch area 500 can be sensed either by the image sensing units 52, 58 or by the image sensing units 54, 56. As shown in FIG. 1, a touch track 60 is sensed by the image sensing units 54, 56, and a touch track 62 is sensed by the image sensing units 52, 58. For the image sensing units 54, 56, the touch track 60 starts to drift as it approaches the right boundary 504; for the image sensing units 52, 58, the touch track 62 starts to drift as it approaches the left boundary 502. Consequently, the transition and joining of the touch tracks 60 and 62 within the central touch area 500 is not smooth, which degrades the user experience.
The present invention provides an optical touch device and a touch-point coordinate calculation method capable of preventing a touch trajectory from drifting, so as to solve the above problems.
The claims of the present invention disclose an optical touch device comprising: an indication plane having a first side and a second side, the first side being opposite to the second side; a first image sensing unit and a second image sensing unit disposed at an interval on the first side; a third image sensing unit and a fourth image sensing unit disposed at an interval on the second side, the first image sensing unit being opposite to the fourth image sensing unit, the second image sensing unit being opposite to the third image sensing unit, and a central touch area being defined between the first, second, third, and fourth image sensing units; and a processing unit electrically connected to the first, second, third, and fourth image sensing units. When a touch gesture is performed on the central touch area, the first image sensing unit senses a first image, the second image sensing unit senses a second image, the third image sensing unit senses a third image, and the fourth image sensing unit senses a fourth image. The processing unit calculates a first coordinate of a touch point according to the first image and the fourth image, calculates a second coordinate of the touch point according to the second image and the third image, and integrates the first coordinate and the second coordinate with a weight to calculate an output coordinate of the touch point.
The claims of the present invention further disclose that the processing unit calculates the output coordinate of the touch point by the following formulas: XT = X1 × W + X2 × (1 − W); and YT = Y1 × W + Y2 × (1 − W); wherein (XT, YT) denotes the output coordinate, (X1, Y1) denotes the first coordinate, (X2, Y2) denotes the second coordinate, and W denotes the weight.
The claims of the present invention further disclose that the line connecting the first image sensing unit and the fourth image sensing unit forms a first boundary of the central touch area, and the line connecting the second image sensing unit and the third image sensing unit forms a second boundary of the central touch area. A first critical line and a second critical line are defined in the central touch area, the first critical line being separated from the first boundary by a threshold distance, and the second critical line being separated from the second boundary by the same threshold distance. When the touch point is located between the first boundary and the first critical line, the weight equals 0; when the touch point is located between the second boundary and the second critical line, the weight equals 1; when the touch point is located between the first critical line and the second critical line, the weight equals (d − T) / (D − 2T), wherein d denotes the distance between the touch point and the first boundary, T denotes the threshold distance, and D denotes the distance between the first boundary and the second boundary.
The claims of the present invention further disclose that, when the processing unit calculates N touch points according to the first image and the fourth image and calculates M touch points according to the second image and the third image, the processing unit determines whether N is greater than M, N and M both being positive integers. When N is greater than M, the processing unit calculates and outputs the coordinates of the N touch points; when N is less than M, the processing unit calculates and outputs the coordinates of the M touch points.
The claims of the present invention further disclose that, when N equals M, the processing unit pairs the N touch points with the M touch points to obtain N pairs of touch points, and integrates the coordinates of the N pairs of touch points with the weight to calculate N output coordinates of the N pairs of touch points.
The claims of the present invention further disclose a method for calculating the coordinates of a touch point, applicable to an optical touch device, the optical touch device comprising an indication plane, a first image sensing unit, a second image sensing unit, a third image sensing unit, and a fourth image sensing unit, the indication plane having a first side and a second side, the first side being opposite to the second side, the first image sensing unit and the second image sensing unit being disposed at an interval on the first side, the third image sensing unit and the fourth image sensing unit being disposed at an interval on the second side, the first image sensing unit being opposite to the fourth image sensing unit, the second image sensing unit being opposite to the third image sensing unit, and a central touch area being defined between the first, second, third, and fourth image sensing units. The method comprises: when a touch gesture is performed on the central touch area, sensing a first image by the first image sensing unit, sensing a second image by the second image sensing unit, sensing a third image by the third image sensing unit, and sensing a fourth image by the fourth image sensing unit; calculating a first coordinate of a touch point according to the first image and the fourth image; calculating a second coordinate of the touch point according to the second image and the third image; and integrating the first coordinate and the second coordinate with a weight to calculate an output coordinate of the touch point.
The claims of the present invention further disclose that the output coordinate of the touch point is calculated by the following formulas: XT = X1 × W + X2 × (1 − W); and YT = Y1 × W + Y2 × (1 − W); wherein (XT, YT) denotes the output coordinate, (X1, Y1) denotes the first coordinate, (X2, Y2) denotes the second coordinate, and W denotes the weight.
The claims of the present invention further disclose that the line connecting the first image sensing unit and the fourth image sensing unit forms a first boundary of the central touch area, and the line connecting the second image sensing unit and the third image sensing unit forms a second boundary of the central touch area. A first critical line and a second critical line are defined in the central touch area, the first critical line being separated from the first boundary by a threshold distance, and the second critical line being separated from the second boundary by the same threshold distance. When the touch point is located between the first boundary and the first critical line, the weight equals 0; when the touch point is located between the second boundary and the second critical line, the weight equals 1; when the touch point is located between the first critical line and the second critical line, the weight equals (d − T) / (D − 2T), wherein d denotes the distance between the touch point and the first boundary, T denotes the threshold distance, and D denotes the distance between the first boundary and the second boundary.
The claims of the present invention further disclose that the method for calculating the coordinates of the touch point further comprises: calculating N touch points according to the first image and the fourth image, and calculating M touch points according to the second image and the third image; determining whether N is greater than M, N and M both being positive integers; when N is greater than M, calculating and outputting the coordinates of the N touch points; and when N is less than M, calculating and outputting the coordinates of the M touch points.
The claims of the present invention further disclose that the method further comprises: when N equals M, pairing the N touch points with the M touch points to obtain N pairs of touch points; and integrating the coordinates of the N pairs of touch points with the weight to calculate N output coordinates of the N pairs of touch points.
In summary, the present invention uses a weight to integrate the two sets of coordinates of a touch point sensed by the two groups of image sensing units, so as to calculate the output coordinate of the touch point acting on the central touch area. Thereby, the present invention effectively prevents the touch track from drifting in the central touch area, so that the touch track remains smooth in the central touch area.
The advantages and spirit of the present invention can be further understood from the following detailed description and the accompanying drawings.
1, 5‧‧‧optical touch device
10, 50‧‧‧indication plane
12‧‧‧first image sensing unit
14‧‧‧second image sensing unit
16‧‧‧third image sensing unit
18‧‧‧fourth image sensing unit
20‧‧‧processing unit
30, 32, 34, 36, 38, 40, 42, 42a, 42b, 44, 44a, 44b‧‧‧touch point
52, 54, 56, 58‧‧‧image sensing unit
60, 62‧‧‧touch track
100‧‧‧first side
102‧‧‧second side
104, 500‧‧‧central touch area
106‧‧‧first boundary
108‧‧‧second boundary
110‧‧‧first critical line
112‧‧‧second critical line
114‧‧‧left touch area
116‧‧‧right touch area
502‧‧‧left boundary
504‧‧‧right boundary
I1‧‧‧first image
I2‧‧‧second image
I3‧‧‧third image
I4‧‧‧fourth image
D, d, H‧‧‧distance
L‧‧‧length
W‧‧‧width
O‧‧‧coordinate origin
T‧‧‧threshold distance
X-Y‧‧‧rectangular coordinate system
θA, θB, θC, θD‧‧‧angle
Δθ‧‧‧angle threshold
S10-S14, S20-S30‧‧‧steps
FIG. 1 is a schematic diagram of an optical touch device of the prior art.
FIG. 2 is a schematic diagram of an optical touch device according to an embodiment of the present invention.
FIG. 3 is a functional block diagram of the optical touch device in FIG. 2.
FIG. 4 is a schematic diagram of one touch point generated on the central touch area.
FIG. 5 is a schematic diagram illustrating how to set the threshold distance, using the second image sensing unit and the third image sensing unit as an example.
FIG. 6 is a schematic diagram of two touch points generated on the central touch area.
FIG. 7 is a schematic diagram of two touch points generated on the central touch area.
FIG. 8 is a schematic diagram of two touch points generated on the central touch area.
FIG. 9 is a flowchart of a method for calculating the coordinates of a touch point according to an embodiment of the present invention.
FIG. 10 is a flowchart of a method for calculating the coordinates of a touch point according to another embodiment of the present invention.
Please refer to FIG. 2 and FIG. 3. FIG. 2 is a schematic diagram of an optical touch device 1 according to an embodiment of the present invention, and FIG. 3 is a functional block diagram of the optical touch device 1 in FIG. 2. As shown in FIG. 2 and FIG. 3, the optical touch device 1 comprises an indication plane 10, a first image sensing unit 12, a second image sensing unit 14, a third image sensing unit 16, a fourth image sensing unit 18, and a processing unit 20, wherein the processing unit 20 is electrically connected to the first image sensing unit 12, the second image sensing unit 14, the third image sensing unit 16, and the fourth image sensing unit 18.
In practical applications, the indication plane 10 may be a display panel (e.g., a liquid crystal display panel), a whiteboard, a blackboard, a projection screen, or another plane on which a user performs touch operations; the first image sensing unit 12, the second image sensing unit 14, the third image sensing unit 16, and the fourth image sensing unit 18 may be, but are not limited to, charge-coupled device (CCD) sensors or complementary metal-oxide-semiconductor (CMOS) sensors; and the processing unit 20 may be a processor or a controller with data computation/processing capability. In practical applications, light emitting units (e.g., light emitting diodes) may be disposed beside the first image sensing unit 12, the second image sensing unit 14, the third image sensing unit 16, and the fourth image sensing unit 18, or a light bar may be disposed around the indication plane 10, to provide the light required for general touch operations. When light emitting units are disposed beside the first, second, third, and fourth image sensing units, a reflective frame or a light-absorbing frame may be disposed around the indication plane 10, depending on the practical application.
The indication plane 10 has a first side 100 and a second side 102, wherein the first side 100 is opposite to the second side 102. The first image sensing unit 12 and the second image sensing unit 14 are disposed at an interval on the first side 100, and the third image sensing unit 16 and the fourth image sensing unit 18 are disposed at an interval on the second side 102, wherein the first image sensing unit 12 is opposite to the fourth image sensing unit 18, the second image sensing unit 14 is opposite to the third image sensing unit 16, and a central touch area 104 is defined between the first image sensing unit 12, the second image sensing unit 14, the third image sensing unit 16, and the fourth image sensing unit 18. Furthermore, the line connecting the first image sensing unit 12 and the fourth image sensing unit 18 forms a first boundary 106 of the central touch area 104, and the line connecting the second image sensing unit 14 and the third image sensing unit 16 forms a second boundary 108 of the central touch area 104.
In the present invention, an X-Y rectangular coordinate system and its coordinate origin O may be set as shown in FIG. 2, where L is the length of the indication plane 10 and W is the width of the indication plane 10. The coordinates of the first image sensing unit 12 can be expressed as (XA, YA), the coordinates of the second image sensing unit 14 as (XB, YB), the coordinates of the third image sensing unit 16 as (XC, YC), and the coordinates of the fourth image sensing unit 18 as (XD, YD). As shown in FIG. 2, when a touch gesture is performed on the central touch area 104 and generates a touch point 30, the first image sensing unit 12 senses an angle θA with respect to the touch point 30, the second image sensing unit 14 senses an angle θB, the third image sensing unit 16 senses an angle θC, and the fourth image sensing unit 18 senses an angle θD. It should be noted that the angles θA, θB, θC, θD can be easily calculated by those skilled in the art of optical touch technology, and are not detailed here. The first image sensing unit 12 and the fourth image sensing unit 18 can then be used to perform triangulation to calculate the coordinates (XE, YE) of the touch point 30 by Formula 1 below, or the second image sensing unit 14 and the third image sensing unit 16 can be used to perform triangulation to calculate the coordinates (XE, YE) of the touch point 30 by Formula 2 below.
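The bodies of Formula 1 and Formula 2 did not survive extraction. As an illustrative stand-in only, not the patent's actual formulas, the following sketch shows a generic two-sensor triangulation, assuming two sensors separated by a distance H along one side, each reporting the angle between that side and the ray toward the touch point:

```python
import math

def triangulate(theta1, theta2, H):
    """Locate a touch point from two sensor angles (in radians).

    Assumed geometry (an illustration, not the patent's Formula 1/2):
    sensor 1 at (0, 0) and sensor 2 at (0, H) on the same side;
    each angle is measured from the baseline joining the sensors
    toward the interior, so tan(theta1) = x / y and
    tan(theta2) = x / (H - y) for a touch point at (x, y).
    """
    t1, t2 = math.tan(theta1), math.tan(theta2)
    # Eliminate y from the two tangent relations to solve for x first.
    x = H * t1 * t2 / (t1 + t2)
    y = x / t1
    return x, y
```

For example, a point at (30, 40) with H = 120 yields sensed angles atan(30/40) and atan(30/80), from which the sketch recovers (30, 40).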
Please refer to FIG. 4, which is a schematic diagram of a touch point 32 generated on the central touch area 104. As shown in FIG. 4, a first critical line 110 and a second critical line 112 are defined in the central touch area 104, wherein the first critical line 110 is separated from the first boundary 106 by a threshold distance T, and the second critical line 112 is likewise separated from the second boundary 108 by the threshold distance T.
As shown in FIG. 4, when a touch gesture is performed on the central touch area 104 and generates a touch point 32, the first image sensing unit 12 senses a first image I1, the second image sensing unit 14 senses a second image I2, the third image sensing unit 16 senses a third image I3, and the fourth image sensing unit 18 senses a fourth image I4. Since only one touch point 32 is generated on the central touch area 104, only one corresponding light-interruption signal is sensed in each of the first image I1, the second image I2, the third image I3, and the fourth image I4. The processing unit 20 then calculates a first coordinate (X1, Y1) of the touch point 32 according to the first image I1 and the fourth image I4 using Formula 1 above, calculates a second coordinate (X2, Y2) of the touch point 32 according to the second image I2 and the third image I3 using Formula 2 above, and integrates the first coordinate (X1, Y1) and the second coordinate (X2, Y2) with a weight W to calculate an output coordinate (XT, YT) of the touch point 32.
In this embodiment, the processing unit 20 can calculate the output coordinate (XT, YT) of the touch point 32 by Formula 3 below: XT = X1 × W + X2 × (1 − W); and YT = Y1 × W + Y2 × (1 − W).
Furthermore, the weight W can be set in the following manner. When the touch point 32 is located between the first boundary 106 and the first critical line 110, the weight W is set equal to 0; when the touch point 32 is located between the second boundary 108 and the second critical line 112, the weight W is set equal to 1; and when the touch point 32 is located between the first critical line 110 and the second critical line 112, the weight W is set equal to (d − T) / (D − 2T), where d denotes the distance between the touch point 32 and the first boundary 106, T denotes the threshold distance described above, and D denotes the distance between the first boundary 106 and the second boundary 108.
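The weighting scheme above can be sketched as follows. Note that the body of the weight formula was lost in extraction; the linear interpolation (d − T)/(D − 2T) is a reconstruction inferred from the stated boundary conditions (W = 0 at d = T and W = 1 at d = D − T):

```python
def weight(d, T, D):
    """Weight W for a touch point at distance d from the first boundary.

    W = 0 near the first boundary (output falls back to the second coordinate),
    W = 1 near the second boundary (output falls back to the first coordinate),
    and a linear ramp in between (reconstructed interpolation).
    """
    if d <= T:
        return 0.0
    if d >= D - T:
        return 1.0
    return (d - T) / (D - 2 * T)

def output_coordinate(p1, p2, W):
    """Formula 3: blend the first coordinate p1 and the second coordinate p2."""
    return (p1[0] * W + p2[0] * (1 - W),
            p1[1] * W + p2[1] * (1 - W))
```

At the midpoint d = D/2 the two coordinates contribute equally (W = 0.5), and W varies continuously across both critical lines, which is what keeps the track transition smooth.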
When the touch point 32 is located between the first boundary 106 and the first critical line 110, the touch point 32 is too close to the first boundary 106; therefore, the second coordinate (X2, Y2) calculated from the second image I2 and the third image I3 can serve as the output coordinate (XT, YT) of the touch point 32, so as to prevent the touch track from drifting between the first boundary 106 and the first critical line 110. Similarly, when the touch point 32 is located between the second boundary 108 and the second critical line 112, the touch point 32 is too close to the second boundary 108; therefore, the first coordinate (X1, Y1) calculated from the first image I1 and the fourth image I4 can serve as the output coordinate (XT, YT) of the touch point 32, so as to prevent the touch track from drifting between the second boundary 108 and the second critical line 112.
In the embodiment shown in FIG. 4, the touch point 32 is located between the first critical line 110 and the second critical line 112. Therefore, the weight W = (d − T) / (D − 2T) can be substituted into Formula 3 above to integrate the first coordinate (X1, Y1) and the second coordinate (X2, Y2), so as to calculate the output coordinate (XT, YT) of the touch point 32. Thereby, the present invention effectively prevents the touch track from drifting in the central touch area 104, so that the touch track remains smooth in the central touch area 104.
It should be noted that, when a touch point is generated in the left touch area 114 to the left of the central touch area 104, the second image sensing unit 14 and the third image sensing unit 16 can be used to calculate the number and coordinates of touch points according to general optical touch principles; when a touch point is generated in the right touch area 116 to the right of the central touch area 104, the first image sensing unit 12 and the fourth image sensing unit 18 can be used to calculate the number and coordinates of touch points according to general optical touch principles.
Please refer to FIG. 5, which illustrates how to set the threshold distance T, using the second image sensing unit 14 and the third image sensing unit 16 as an example. As shown in FIG. 5, the second image sensing unit 14 and the third image sensing unit 16 can be used to perform triangulation to calculate all available touch points, wherein the sensing angle of each of the second image sensing unit 14 and the third image sensing unit 16 ranges from 0 to 90 degrees in steps of 1 degree, but is not limited thereto. In the triangular region on the right, the available touch points are sparsely distributed, and they spread out to the right in a fan shape. Therefore, the touch-point offset in this triangular region is severe. According to the actual touch-accuracy requirement, an angle threshold Δθ can be chosen, and, assuming the distance between the second image sensing unit 14 and the third image sensing unit 16 is H, the threshold distance T can be set to (H/2) × tan Δθ. For example, if the distance H between the second image sensing unit 14 and the third image sensing unit 16 is 120 cm and the angle threshold Δθ is chosen to be 10 degrees, the threshold distance T can be set to about 10.57 cm.
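The coefficient in front of tan Δθ was lost in extraction; the factor H/2 used below is inferred from the worked example (H = 120 cm, Δθ = 10° giving about 10.57 cm), so treat it as a reconstruction. A minimal sketch under that assumption:

```python
import math

def threshold_distance(H, delta_theta_deg):
    """Threshold distance T = (H / 2) * tan(delta_theta).

    The H/2 coefficient is inferred from the patent's numeric
    example, not stated explicitly in the surviving text.
    """
    return (H / 2) * math.tan(math.radians(delta_theta_deg))
```

With H = 120 and Δθ = 10, this gives approximately 10.58 cm, matching the 10.57 cm in the text up to rounding.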
In this embodiment, in the case where all touch points are located in the central touch area 104, when the processing unit 20 calculates N touch points according to the first image I1 and the fourth image I4 and calculates M touch points according to the second image I2 and the third image I3, the processing unit 20 first determines whether N is greater than M, where N and M are both positive integers. When N is greater than M, the processing unit 20 uses Formula 1 above to calculate and output the coordinates of the N touch points. When N is less than M, the processing unit 20 uses Formula 2 above to calculate and output the coordinates of the M touch points. When N equals M, the processing unit 20 first pairs the N touch points with the M touch points to obtain N pairs of touch points, and then integrates the coordinates of the N pairs of touch points with the weight W using Formula 3 above, so as to calculate N output coordinates of the N pairs of touch points.
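The count comparison and pairing logic can be sketched as follows. The patent does not specify how the N pairs are matched when N equals M, so pairing by sorted order below is an illustrative assumption, and `weight_fn` is a hypothetical callback standing in for the per-point weight W:

```python
def resolve_touch_points(points_14, points_23, weight_fn):
    """Arbitrate between the two sensor groups' touch-point lists.

    points_14: points triangulated from images I1/I4 (N points)
    points_23: points triangulated from images I2/I3 (M points)
    weight_fn: maps a point to its weight W for the Formula 3 blend
    """
    N, M = len(points_14), len(points_23)
    if N > M:   # group I2/I3 saw overlapping points; trust I1/I4
        return list(points_14)
    if N < M:   # group I1/I4 saw overlapping points; trust I2/I3
        return list(points_23)
    # N == M: pair the points (sorted order is an assumed pairing strategy)
    out = []
    for p1, p2 in zip(sorted(points_14), sorted(points_23)):
        W = weight_fn(p1)
        out.append((p1[0] * W + p2[0] * (1 - W),
                    p1[1] * W + p2[1] * (1 - W)))
    return out
```

This mirrors the overlap handling of FIG. 6 and FIG. 7: whichever group resolves more distinct points wins outright, and only equal counts fall through to the weighted blend.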
Please refer to FIG. 6, which is a schematic diagram of two touch points 34, 36 generated on the central touch area 104. As shown in FIG. 6, the two touch points 34, 36 overlap each other in both the first image I1 and the fourth image I4, so the processing unit 20 calculates only one touch point from the first image I1 and the fourth image I4 (that is, N = 1). In addition, the two touch points 34, 36 do not overlap in either the second image I2 or the third image I3, so the processing unit 20 calculates two touch points from the second image I2 and the third image I3 (that is, M = 2). In this case, the processing unit 20 can directly calculate and output the coordinates of the two touch points 34, 36 using Equation 2 above, thereby avoiding misjudgment caused by overlapping touch points.
Please refer to FIG. 7, which is a schematic diagram of two touch points 38, 40 generated on the central touch area 104. As shown in FIG. 7, the two touch points 38, 40 overlap each other in both the second image I2 and the third image I3, so the processing unit 20 calculates only one touch point from the second image I2 and the third image I3 (that is, M = 1). In addition, the two touch points 38, 40 do not overlap in either the first image I1 or the fourth image I4, so the processing unit 20 calculates two touch points from the first image I1 and the fourth image I4 (that is, N = 2). In this case, the processing unit 20 can directly calculate and output the coordinates of the two touch points 38, 40 using Equation 1 above, thereby avoiding misjudgment caused by overlapping touch points.
Please refer to FIG. 8, which is a schematic diagram of two touch points 42, 44 generated on the central touch area 104. As shown in FIG. 8, the two touch points 42, 44 do not overlap in any of the first image I1, the second image I2, the third image I3, and the fourth image I4. The processing unit 20 therefore calculates two touch points 42a, 44a from the first image I1 and the fourth image I4 (that is, N = 2), and two touch points 42b, 44b from the second image I2 and the third image I3 (that is, M = 2). The processing unit 20 can then pair the touch points 42a, 44a, 42b, 44b according to the shortest distance between each two touch points, to obtain two pairs of touch points. In the embodiment shown in FIG. 8, the touch points 42a, 42b form one pair, and the touch points 44a, 44b form the other pair. The processing unit 20 then integrates the coordinates of the two pairs of touch points with the weight W using Equation 3 above to calculate two output coordinates, which are the output coordinates of the two touch points 42, 44.
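The pairing-and-blending step described above can be sketched as follows. This is an illustrative reading, not the patent's implementation: the point values are made up, the brute-force assignment simply minimizes the total pairwise distance, and a plain convex combination with weight W stands in for Equation 3, which is not reproduced in this excerpt.

```python
from itertools import permutations
from math import hypot

def pair_touch_points(group_a, group_b):
    """Pair each point of group_a with one of group_b so that the total
    pairwise distance is minimal (brute force; touch-point counts are small)."""
    best = None
    for perm in permutations(range(len(group_b))):
        cost = sum(hypot(group_a[i][0] - group_b[j][0],
                         group_a[i][1] - group_b[j][1])
                   for i, j in enumerate(perm))
        if best is None or cost < best[0]:
            best = (cost, perm)
    return [(group_a[i], group_b[j]) for i, j in enumerate(best[1])]

def integrate(pair, w):
    """Convex combination with weight w -- a stand-in for Equation 3."""
    (x1, y1), (x2, y2) = pair
    return (w * x1 + (1 - w) * x2, w * y1 + (1 - w) * y2)

# Hypothetical coordinates: 42a/44a from images I1+I4, 42b/44b from I2+I3
n_points = [(30.0, 40.0), (80.0, 55.0)]
m_points = [(81.0, 54.0), (31.0, 41.0)]
pairs = pair_touch_points(n_points, m_points)
outputs = [integrate(p, 0.5) for p in pairs]
print(outputs)  # [(30.5, 40.5), (80.5, 54.5)]
```

Note how the shortest-distance criterion correctly matches (30, 40) with (31, 41) rather than with the far point, which is what prevents the two estimates of the same physical touch from being blended with the wrong partner.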
Please refer to FIG. 9, which is a flowchart of a method for calculating the coordinate of a touch point according to an embodiment of the invention. The method of FIG. 9 is applicable to the optical touch device 1 described above, and its control logic can be implemented by circuit design or software design. First, in step S10, when a touch gesture is performed on the central touch area 104, the first image sensing unit 12 senses the first image I1, the second image sensing unit 14 senses the second image I2, the third image sensing unit 16 senses the third image I3, and the fourth image sensing unit 18 senses the fourth image I4. Then, in step S12, a first coordinate of the touch point is calculated according to the first image I1 and the fourth image I4, and a second coordinate of the touch point is calculated according to the second image I2 and the third image I3. Finally, in step S14, the first coordinate and the second coordinate are integrated with a weight to calculate the output coordinate of the touch point.
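Steps S10–S14 amount to triangulating one coordinate from each pair of opposing sensors and blending the two results with a weight. The sketch below is a hedged illustration: the sensor placement at the two ends of one edge, the angle convention, and the weight value are all assumptions, since the excerpt does not spell them out.

```python
from math import tan, radians

def triangulate(angle_a_deg, angle_b_deg, width):
    """Intersect the sight rays from two sensors at (0, 0) and (width, 0).

    Angles are measured from the edge joining the two sensors (an assumed
    convention). Ray A: y = x * tan(a); ray B: y = (width - x) * tan(b).
    """
    ta, tb = tan(radians(angle_a_deg)), tan(radians(angle_b_deg))
    x = width * tb / (ta + tb)
    return (x, ta * x)

# First coordinate from one sensor pair, second from the other (step S12)
first = triangulate(45.0, 45.0, 100.0)   # -> (50.0, 50.0) up to rounding
second = triangulate(44.0, 46.0, 100.0)  # slightly different angle readings
w = 0.5  # weight of step S14 (value chosen purely for illustration)
output = (w * first[0] + (1 - w) * second[0],
          w * first[1] + (1 - w) * second[1])  # step S14
print(output)
```

Because the two sensor pairs view the touch from different baselines, their triangulated coordinates differ slightly; the weighted blend of step S14 is what keeps the reported track continuous across the central touch area.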
Please refer to FIG. 10, which is a flowchart of a method for calculating the coordinates of touch points according to another embodiment of the invention. The method of FIG. 10 is applicable to the optical touch device 1 described above, and its control logic can be implemented by circuit design or software design. First, in step S20, N touch points are calculated according to the first image I1 and the fourth image I4, and M touch points are calculated according to the second image I2 and the third image I3, where N and M are both positive integers. Then, in step S22, it is determined whether N is greater than M. When N is greater than M, step S24 is performed to calculate and output the coordinates of the N touch points. When N is less than M, step S26 is performed to calculate and output the coordinates of the M touch points. When N equals M, step S28 is performed to pair the N touch points with the M touch points to obtain N pairs of touch points, and step S30 is then performed to integrate the coordinates of the N pairs of touch points with a weight to calculate the N output coordinates of the N pairs of touch points.
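The branch structure of steps S20–S30 can be sketched as follows. This is illustrative only: greedy nearest-distance pairing and a plain convex combination are stand-ins for the patent's pairing rule and Equations 1–3, which are not reproduced in this excerpt.

```python
from math import hypot

def output_coordinates(n_coords, m_coords, w=0.5):
    """Decision flow of steps S20-S30 (illustrative sketch).

    When the two sensor pairs disagree on the number of touch points, the
    larger set is output (steps S24/S26), since overlap in one sensor pair's
    view merges points. When they agree (step S22 falls through), points are
    paired by nearest distance (step S28) and blended with weight w (S30).
    """
    if len(n_coords) > len(m_coords):
        return list(n_coords)                 # step S24
    if len(n_coords) < len(m_coords):
        return list(m_coords)                 # step S26
    out, remaining = [], list(m_coords)
    for (x1, y1) in n_coords:                 # step S28: greedy pairing
        x2, y2 = min(remaining, key=lambda p: hypot(p[0] - x1, p[1] - y1))
        remaining.remove((x2, y2))
        out.append((w * x1 + (1 - w) * x2, w * y1 + (1 - w) * y2))  # step S30
    return out

# One merged point seen by I1/I4 but two distinct points seen by I2/I3
print(output_coordinates([(10.0, 10.0)], [(40.0, 40.0), (10.0, 12.0)]))
# -> the two-point set is output (step S26)
```

Trusting whichever sensor pair resolved more points is what recovers the situations of FIG. 6 and FIG. 7, where one baseline sees the two touches merged into a single blob.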
It should be noted that the other operating principles of the method for calculating the coordinate of a touch point according to the invention are as described above and are not repeated here.
In summary, the invention uses a weight to integrate the two sets of coordinates of the touch points sensed by the two sets of image sensing units, so as to calculate the output coordinates of the touch points acting on the central touch area. The invention thereby effectively prevents the touch track from being offset in the central touch area, so that the touch track does not become unsmooth there.
The above are merely preferred embodiments of the invention, and all equivalent changes and modifications made within the scope of the appended claims should fall within the scope of the invention.
1‧‧‧Optical touch device
10‧‧‧Indication plane
12‧‧‧First image sensing unit
14‧‧‧Second image sensing unit
16‧‧‧Third image sensing unit
18‧‧‧Fourth image sensing unit
32‧‧‧Touch point
100‧‧‧First side
102‧‧‧Second side
104‧‧‧Central touch area
106‧‧‧First boundary
108‧‧‧Second boundary
110‧‧‧First critical line
112‧‧‧Second critical line
114‧‧‧Left touch area
116‧‧‧Right touch area
I1‧‧‧First image
I2‧‧‧Second image
I3‧‧‧Third image
I4‧‧‧Fourth image
D, d‧‧‧Distance
L‧‧‧Length
W‧‧‧Width
O‧‧‧Coordinate origin
T‧‧‧Threshold distance
X-Y‧‧‧Rectangular coordinate system
Publications: TW201520861A (published 2015-06-01); TWI553531B (granted 2016-10-11).