WO2002023122A1 - Systeme de detection de la position d'un corps mobile - Google Patents

Systeme de detection de la position d'un corps mobile

Info

Publication number
WO2002023122A1
WO2002023122A1 PCT/JP2001/007877 JP0107877W WO0223122A1
Authority
WO
WIPO (PCT)
Prior art keywords
light emitting
movable object
light
pattern
emitting means
Prior art date
Application number
PCT/JP2001/007877
Other languages
English (en)
Japanese (ja)
Inventor
Kunikatsu Takase
Yoshiro Hada
Original Assignee
Kunikatsu Takase
Yoshiro Hada
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kunikatsu Takase, Yoshiro Hada filed Critical Kunikatsu Takase
Priority to AU2001284519A priority Critical patent/AU2001284519A1/en
Priority to JP2002527723A priority patent/JPWO2002023122A1/ja
Publication of WO2002023122A1 publication Critical patent/WO2002023122A1/fr

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163Determination of attitude
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/028Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
    • G05D1/0282Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal generated in a local control room

Definitions

  • The present invention relates to a movable object position detection system that recognizes and controls the position of a movable object, and more particularly to a movable object position detection system suitable for easily recognizing and controlling the position and orientation of a movable object such as a mobile robot. Background Art
  • An autonomous robot is basically a robot that performs autonomous work without the help of humans.
  • Autonomous robots respond to environmental changes by repeating the cycle of planning and executing actions in real time.
  • The environment in which a robot works contains objects such as floors, partitions, desks, chairs, and books.
  • The basis for describing such an environment is to identify these objects and indicate their positions and orientations. In more detail, the state of an open book left in the room or the accumulation of dust, and in an extreme case even the arrangement of atoms and molecules, could be used to describe the environment.
  • Based on this premise, Japanese Unexamined Patent Publication No. H10-149435 proposes attaching a mark in advance only to those objects that require one. A bar code is added to this mark so that the type of object can be identified by reading it.
  • Figure 18 shows an example of a mark attached to an object.
  • A mark 102 is attached to the object 101.
  • The mark 102 is made up of landmarks 103 and 104 readable by visible light and a bar code 105.
  • The two landmarks 103 and 104 have concentric patterns so that recognition does not depend on direction, and the bar code 105 arranged between the landmarks 103 and 104 differs for each object.
  • Figure 19 shows an overview of the proposed system for recognizing the positions of multiple objects.
  • In the room 121, a plurality of objects 101-1 to 101-5 are present, and marks 102-1 to 102-5 are attached to these objects.
  • A television camera 122 is mounted on the ceiling.
  • The video signal 123 output from the television camera 122 is input to the environment recognition system 124, which recognizes these objects 101-1 to 101-5.
  • A control signal 125 is fed back to the television camera 122 so that it is controlled to produce a magnified (zoomed) image for reading the bar code 105 shown in Fig. 18.
  • In some cases, however, the bar code cannot be read.
  • Moreover, when the objects 101 to be recognized include a plurality of movable objects such as robots, it was practically impossible to determine the positions and postures of these objects in real time and to control their movements.
  • It is an object of the present invention to provide a movable object position detection system capable of easily recognizing the positions and orientations of movable objects such as robots even when a plurality of movable objects are present, and to provide a movable object position detection system that enables movement control based on this recognition. Disclosure of the Invention
  • A movable object position detection system according to the first aspect of the present invention comprises: (a) one or a plurality of light emitting means attached to a movable object, each outputting a unique light emission pattern; (b) imaging means arranged in a predetermined space for receiving the light emission patterns output from the light emitting means present in that space; (c) identification means for identifying each light emitting means present in the space from the light emission patterns received by the imaging means; and (d) notification means for notifying a predetermined one of the movable objects of position information indicating the position in the space of each light emitting means identified by the identification means.
  • In this system, the unique light emission patterns output from the one or more light emitting means attached to each movable object are received by the imaging means, and the identification means identifies which light emitting means is emitting. The notification means then notifies a predetermined movable object of position information indicating the position in the space of each light emitting means identified by the identification means.
  • Notification is made to a predetermined movable object because, if a movable object does not have a function of receiving the notification, there is no need to notify it, and even a movable object that does have such a function may not need a particular notification, for example one concerning a movable object it is not involved with.
  • Since the imaging means side identifies each light emitting means and notifies the movable object side of information indicating their positions, the movable object side can grasp not only its own position and the position and orientation of the light emitting means attached to it, but also positional information about its course and about other movable objects in its surroundings, so the burden of these processes is reduced.
  • Moreover, if the imaging means is installed at a position from which the movable objects are easily observed, such as the ceiling of the space in which they move, each movable object is easier to grasp than when the movable object itself captures images and determines the position of each object.
  • A movable object position detection system according to the second aspect of the present invention comprises: (a) one or more light emitting means attached to a movable object, each outputting a unique light emission pattern; (b) imaging means arranged in a predetermined space for receiving the light emission patterns output from the light emitting means present in that space; (c) identification means for identifying each light emitting means present in the predetermined space from the light emission patterns received by the imaging means; and (d) notification means for notifying a predetermined movable object of position information indicating the position in the space of each light emitting means identified by the identification means and the posture, as the spatial arrangement, of the movable object to which the light emitting means are attached.
  • In this aspect as well, the unique light emission patterns output from the one or more light emitting means attached to each movable object are received by the imaging means, and the identification means identifies which light emitting means is emitting.
  • As before, notification is made to a predetermined movable object because a movable object without a receiving function need not be notified, and even a movable object with such a function may not need a notification that does not concern it.
  • The position information indicates the position of the light emitting means in the space or the posture of the movable object.
  • Since the imaging means side identifies the respective light emitting means and notifies the movable object side of information indicating their positions, the movable object side can grasp not only its own position and the position and orientation of the light emitting means attached to it, but also the positions and orientations of other movable objects on its course and in its surroundings, so the burden of these processes is reduced. Moreover, if the imaging means is installed at a position from which the movable objects are easily observed, such as the ceiling of the space in which they move, each movable object is easier to grasp than when the movable object itself captures images and determines the position of each object.
  • In one embodiment, the light emitting means is a means for outputting infrared light, and the imaging means is a means for selectively receiving infrared light.
  • Since the light emitting means outputs infrared light outside the visible region, humans do not notice the presence of these light emitting means, so no disturbance is caused. Further, since the imaging means selectively receives infrared light, the influence of noise due to visible light or ultraviolet light can be removed, and the processing reliability is improved accordingly.
  • In another embodiment, the light emitting means has a two-dimensional arrangement structure in which light emitting elements are arranged in a matrix, and an on/off pattern indicating which elements emit light is exclusively assigned to each light emitting means, so that each outputs a unique light emission pattern.
  • Since the light emitting means has a two-dimensional arrangement structure in which the light emitting elements are arranged in a matrix, different light emission patterns can be obtained using identical light emitting means. The light emitting means can therefore be used as a common component, which reduces cost and simplifies component management.
  • Further, since the light emission pattern output by one light emitting means can be changed sequentially over time, it becomes possible to return several responses in sequence, such as a response to a general call or a response to an individual call.
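  • As an illustration of how such an exclusively assigned on/off matrix can serve as an identifier, the following Python sketch maps a 3×3 grid of lit/unlit elements to a numeric ID and back. The 3×3 size and the row-major bit order are assumptions chosen for the example; the invention does not prescribe a particular matrix size or encoding.

```python
# Illustrative sketch only: one way to treat the matrix-shaped on/off pattern
# described above as a numeric ID. Grid size (3x3) and row-major bit order
# are assumptions for this example, not requirements of the system.

def pattern_to_id(grid):
    """grid: 3x3 list of booleans (True = element lit). Returns a 9-bit ID."""
    bits = [cell for row in grid for cell in row]          # row-major flatten
    return sum(1 << i for i, on in enumerate(bits) if on)

def id_to_pattern(code):
    """Inverse mapping: 9-bit ID back to a 3x3 on/off grid."""
    return [[bool(code >> (r * 3 + c) & 1) for c in range(3)] for r in range(3)]

grid = [[True, False, True],
        [False, True, False],
        [True, False, False]]
code = pattern_to_id(grid)
assert id_to_pattern(code) == grid
print("exclusively assigned on/off pattern encodes ID", code)   # -> 85
```

  • With a 3×3 matrix this gives up to 2^9 = 512 distinct patterns, and a temporal sequence of such patterns can carry the call-and-response information mentioned above.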
  • In another embodiment, the light emitting means has a one-dimensional arrangement structure in which light emitting elements are arranged at regular intervals on the X axis.
  • The first light emitting element arranged at one end corresponds to the position of the origin of the coordinates, and the distance between this first light emitting element and the light emitting element arranged at the other end is predetermined.
  • A unique light emission pattern is output by exclusively assigning to each light emitting means an on/off pattern indicating which of the predetermined number of light emitting elements located between these two ends emit light.
  • Since the light emitting means has a one-dimensional arrangement structure in which the light emitting elements are arranged at regular intervals on the X axis, it can be attached even to an elongated object.
  • Since the distance between the first light emitting element and the light emitting element arranged at the other end is predetermined, the positions of the predetermined number of light emitting elements lying between them are determined once the total number of light emitting elements is known. Information such as ID information can therefore be identified from the on/off pattern indicating which of these light emitting elements emit light.
  • Further, if the first and second light emitting elements are always on (emitting), the last light emitting element is on, and the element immediately before it is off, the on/off pattern can be analysed even without knowing the total number of light emitting elements in advance.
  • In another embodiment, the light emitting elements each have a unit structure formed as an aggregate of a plurality of light emitting diodes.
  • This increases the amount of light and the size of the light emitting region, so that the pattern can be recognized reliably even when the distance between the light emitting diodes and the imaging means is large.
  • In another embodiment, the notification means notifies not only the position information of the light emitting means of the notification destination but also the position information of the other light emitting means.
  • Since the notification means notifies not only the position information of the light emitting means of the notification destination but also that of the other light emitting means, the side receiving the notification is freed from the image processing needed to obtain such information. Further, since the imaging means reports position information acquired from an easily viewable position such as the ceiling, this information is likely to be more reliable than information acquired by the movable object itself.
  • In another embodiment, a tracking means is provided which, on the assumption that a light emitting means identified by the identification means moves only a relatively short distance per unit time, tracks the same object without re-collating its light emission pattern, by repeatedly following on the image the light emitting means existing near the previously determined position.
  • That is, when a light emitting means identified by the identification means moves only within a relatively short distance per unit time, the tracking means repeatedly follows, on the image, the light emitting means existing near the previously determined position.
  • In another embodiment, when tracking by the tracking means is not possible, the corresponding object is identified from the light emission pattern received by the imaging means using the identification means.
  • In another embodiment, the extent of the area near the previously determined position that is searched by the tracking means is determined according to the moving speed of the movable object.
  • Efficient tracking is thus enabled by setting the width of the search area for the light emitting means to be tracked according to the moving speed of the movable object.
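  • A minimal sketch of this tracking idea, under the assumption of 2D image coordinates, a fixed frame interval, and a known upper bound on the objects' speed (all names and numeric values below are invented), re-associates each identified emitter with the nearest detection inside a search radius derived from the moving speed.

```python
# Sketch of the speed-dependent search area described above. All names and
# numeric values are illustrative assumptions, not taken from the patent.

import math

def track(previous, detections, max_speed, dt, margin=1.2):
    """previous: {object_id: (x, y)} from the last frame.
    detections: list of (x, y) positions detected in the current frame.
    max_speed: assumed maximum speed of the movable objects (units per second).
    Returns {object_id: (x, y)} for emitters found inside their search radius."""
    radius = max_speed * dt * margin          # search area widens with moving speed
    updated = {}
    for obj, (px, py) in previous.items():
        best, best_dist = None, radius
        for (x, y) in detections:
            dist = math.hypot(x - px, y - py)
            if dist <= best_dist:
                best, best_dist = (x, y), dist
        if best is not None:
            updated[obj] = best               # tracked without re-reading its pattern
    return updated

prev = {"robot_1": (10.0, 10.0), "robot_2": (50.0, 20.0)}
now = [(11.0, 10.5), (49.0, 21.0), (80.0, 80.0)]
print(track(prev, now, max_speed=2.0, dt=1.0))
```

  • An emitter for which no detection falls inside its radius would be treated as lost and handed back to the pattern-collating identification means, as described below.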
  • In another embodiment, a light emission selecting means is provided for instructing only a specific light emitting means to emit light.
  • In another embodiment, only the movable object to which a specific light emitting means is attached is moved or stopped during a specific period, and the light emitting means that is moving or stopped during that period is identified by the identification means as the specific light emitting means.
  • In other words, by moving or stopping only the movable object to which the specific light emitting means is attached during the specific period, and identifying the light emitting means that moves or stops during this period as the specific light emitting means, the specific target can be identified easily.
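  • The following sketch illustrates this idea with invented data: tracked emitter positions are sampled at the start and end of a window during which only the target object was commanded to move, and the emitter that moved the most (beyond an arbitrary example threshold) is taken to be the target. The converse case, stopping only the target, works the same way with the minimum instead of the maximum.

```python
# Hedged sketch of the "move only the target during a specific period" idea.
# Labels, coordinates, and the threshold are invented for illustration.

import math

def identify_moving_emitter(before, after, threshold=1.0):
    """before/after: {emitter_label: (x, y)} at the start/end of the period."""
    moved = {label: math.hypot(after[label][0] - before[label][0],
                               after[label][1] - before[label][1])
             for label in before}
    label, distance = max(moved.items(), key=lambda kv: kv[1])
    return label if distance >= threshold else None

before = {"A": (10.0, 10.0), "B": (50.0, 20.0), "C": (30.0, 40.0)}
after  = {"A": (10.1, 10.0), "B": (55.0, 20.0), "C": (30.0, 40.1)}
print(identify_moving_emitter(before, after))   # -> "B": only B moved appreciably
```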
  • According to the invention of claim 13, a light emitting means that can no longer be tracked by the tracking means can easily be re-identified by instructing only that light emitting means to emit light.
  • A movable object position detection system according to the invention of claim 14 comprises: (a) one or a plurality of light emitting means attached to the movable object, each outputting a unique light emission pattern; (b) imaging means disposed in a predetermined space for receiving the light emission patterns output from the light emitting means present in this space; rotation angle determining means for determining the rotation angle of the movable object; and (e) notifying means for notifying a predetermined movable object of position information indicating at least the rotation angle determined by the rotation angle determining means.
  • Each movable object has its own coordinate system, and by observing with the imaging means in the reference coordinate system, it can be determined in which rotational direction the movable object is oriented.
  • In-plane movement of the movable object in a specific direction in its object coordinate system is achieved by an instruction from the specific direction movement instructing means. If only one light emitting means were attached to a movable object, its rotation angle could not normally be determined on the reference coordinate system side, so a plurality of light emitting means would have to be arranged on one movable object. With the specific direction movement instructing means, however, the rotation angle can be determined even when only one light emitting means is attached to the movable object, because the direction of the observed movement reveals its orientation.
  • The present invention also has the advantage that the rotation angle can be specified even in a situation where some of the light emitting means arranged on the movable object cannot be captured by the imaging means because of an obstacle or the like.
  • FIG. 1 is a schematic configuration diagram illustrating an outline of a configuration of a movable object position detection system according to a first embodiment of the present invention.
  • FIG. 2 is an explanatory diagram showing an example of an arrangement pattern of the first to third light emitting elements in the present embodiment.
  • FIG. 3 is a plan view showing a first example of a possible overall surface pattern when three set patterns are arranged on one surface of an object.
  • FIG. 4 is a plan view showing a second example of a possible overall surface pattern when three set patterns are arranged on one surface of an object.
  • FIG. 5 is a plan view showing a third example of a possible overall surface pattern when three set patterns are arranged on one surface of an object.
  • FIG. 6 is a plan view showing a fourth example of a possible overall surface pattern when three set patterns are arranged on one surface of an object.
  • FIG. 7 is a plan view showing a fifth example of a possible overall surface pattern when three set patterns are arranged on one surface of an object.
  • FIG. 8 is a plan view showing a sixth example of a possible overall surface pattern when three set patterns are arranged on one surface of an object.
  • FIG. 9 is a schematic configuration diagram illustrating an outline of a configuration of a movable object position detection system according to a second example of the present invention.
  • FIG. 10 shows the configuration of a two-dimensional bit pattern in the second embodiment.
  • FIG. 11 is a flowchart illustrating a flow of a process of identifying each object by the image processing apparatus according to the second embodiment.
  • FIG. 12 is a plan view illustrating an example of an initial light emission pattern.
  • FIG. 13 is a plan view showing a two-dimensional bit pattern of a first object in the second embodiment.
  • FIG. 14 is a plan view showing a two-dimensional bit pattern of a second object in the second embodiment.
  • FIG. 15 is an explanatory diagram showing a one-dimensional bit pattern that can be used instead of the two-dimensional bit pattern in one of the modified embodiments of the present invention.
  • FIG. 16 is an explanatory diagram showing the configuration of a light emitting unit suitable for use in an environment where the distance between the television camera and the object is considerably large.
  • FIG. 17 is a timing chart showing the relationship between the shooting cycle of the television camera and the light emission switching cycle of the light emitting elements in one modification of the present invention.
  • FIG. 18 is a plan view showing an example of a mark pasted on a conventionally proposed object.
  • FIG. 19 is a schematic configuration diagram of a system for recognizing the positions of a plurality of objects according to this proposal.
  • FIG. 1 shows an outline of a configuration of a movable object position detection system according to a first embodiment of the present invention.
  • One TV camera 203 is arranged on the ceiling 202 of the room 201.
  • the TV camera 203 is equipped with an infrared light transmission filter 204 for blocking visible light and transmitting only infrared light.
  • An object 206 as a moving object is movably arranged on the floor 205.
  • First to third light emitting elements 207 to 209, each comprising an infrared light emitting diode, are attached to the object 206 at one or a plurality of locations. Each of these light emitting elements 207 to 209 emits infrared light over a wide range of directions (effectively without directivity).
  • Infrared light cannot be recognized by the naked eye.
  • Although the TV camera 203 of this embodiment is sensitive to a wide range from visible light to infrared light, it is equipped with the infrared light transmission filter 204 so that it selectively receives only infrared light.
  • The video signal 211 output from the television camera 203 is transmitted through the video transmission cable 212 and input to the image processing device 213.
  • The image processing device 213 processes the video signal 211 to obtain position information of each light emitting element with respect to the camera coordinate system fixed on the imaging plane of the camera. This is then converted into position information in a reference coordinate system (world coordinate system) set on the environment side, using conversion parameters obtained in advance by camera calibration.
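  • The patent does not specify the form of these conversion parameters; one common choice, valid because the emitters move on a plane of known height, is a planar homography obtained during calibration. The sketch below uses a made-up matrix H to show the conversion step; a real H would come from the camera calibration mentioned above.

```python
# Minimal sketch of converting a camera-plane detection into world coordinates
# with a pre-computed planar homography. The matrix H is invented for the
# example; in practice it results from camera calibration.

import numpy as np

H = np.array([[0.01, 0.00, -3.20],    # image (u, v, 1) -> world (X, Y) on the floor plane
              [0.00, 0.01, -2.40],
              [0.00, 0.00,  1.00]])

def image_to_world(u, v, homography=H):
    """Map a pixel coordinate (u, v) to floor-plane world coordinates (metres)."""
    X, Y, w = homography @ np.array([u, v, 1.0])
    return X / w, Y / w

print(image_to_world(320, 240))   # a pixel near the image centre -> (0.0, 0.0)
```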
  • When the television camera 203 is a single camera, the position of a light emitting element is determined on the assumption that its height above the floor surface 205 (Z-direction coordinate) is known.
  • When the television camera 203 is a stereo camera, position information of a light emitting element at an arbitrary position within the field of view can be obtained.
  • In this way, the X and Y coordinate positions of the object 206 with respect to the reference coordinate system (world coordinate system) and its rotation angle (posture) are determined.
  • Information representing the X and Y coordinate positions and the orientation obtained as the result of this determination (hereinafter referred to as position information) is transmitted from the first antenna 215.
  • The object 206 is provided with a second antenna 217 and receives this position information.
  • The communication path between the television camera 203 and the image processing device 213 is not limited to the cable 212 and may be a wireless channel.
  • FIG. 2 shows an example of an arrangement pattern of the first to third light emitting elements in the present embodiment.
  • The first light emitting element 207 is used as the origin of the coordinates on the object 206 side.
  • A line segment 221 connecting the first light emitting element 207 to the second light emitting element 208 is set as the X axis, and a line segment 222 connecting the first light emitting element 207 to the third light emitting element 209 is defined as the Y axis. That is, the first to third light emitting elements 207 to 209 are arranged in a predetermined region on the surface of the object 206 so that the two line segments 221 and 222 always intersect at a right angle.
  • The length L1 of the X-axis segment 221 is a known fixed length, whereas the length L2 of the Y-axis segment 222 is a length specific to each combination of light emitting elements 207 to 209. For example, the length L1 is always 2 cm, while the length L2 takes a different value for each combination, such as 1 cm, 2 cm, or 3 cm.
  • The object 206 moves while keeping the surface on which the first to third light emitting elements 207 to 209 are arranged parallel to the X-Y plane.
  • Therefore, the television camera 203 side can identify the surface on which the light emitting elements 207 to 209 are arranged, or the object itself, from the ratio of the length L1 to the length L2.
  • If a plurality of such sets of first to third light emitting elements 207 to 209 are arranged at different places on one object 206, such as a robot arm, and these are analysed by the image processing device 213, movement in the Z-axis direction and posture changes other than the rotation angle of the object 206 can also be determined.
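  • As a concrete illustration (with invented coordinates and an example L2 table following the 1 cm / 2 cm / 3 cm values above), once the three emitters of a set pattern have been located in world coordinates, the position, rotation angle, and pattern identity can be computed as follows.

```python
# Sketch of interpreting one set pattern: element 207 gives the position, the
# 207->208 segment (fixed length L1) gives the rotation angle, and the length
# L2 of the 207->209 segment identifies the pattern. Values are examples only.

import math

L2_TABLE = {0.01: "set pattern A", 0.02: "set pattern B", 0.03: "set pattern C"}  # metres

def interpret_set_pattern(p207, p208, p209, tol=0.002):
    x, y = p207                                           # origin of the object-side coordinates
    theta = math.atan2(p208[1] - y, p208[0] - x)          # rotation angle about the Z axis
    l2 = math.hypot(p209[0] - x, p209[1] - y)
    label = next((name for length, name in L2_TABLE.items()
                  if abs(length - l2) < tol), "unknown")
    return x, y, round(math.degrees(theta), 1), label

print(interpret_set_pattern((1.00, 2.00), (1.00, 2.02), (0.97, 2.00)))
# -> (1.0, 2.0, 90.0, 'set pattern C')
```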
  • FIGS. 3 to 8 show some examples of the overall patterns (hereinafter referred to as overall surface patterns) that can be taken when three patterns, each consisting of the first to third light emitting elements (hereinafter referred to as set patterns), are arranged on one surface of an object.
  • In FIG. 3, three set patterns A, B, and C are arranged on the surface 231 in the positional relationship shown in the figure.
  • In the set pattern A the length L2 is 1 cm, in the set pattern B the length L2 is 2 cm, and in the set pattern C the length L2 is 3 cm.
  • In FIG. 4, three set patterns A, B, and D are arranged on the surface 232 in the positional relationship shown in the figure.
  • In the set pattern D, the length L2 is 4 cm.
  • Although the set patterns A and B are common to both surfaces, the overall surface patterns differ because of the set patterns C and D, which are not common. Individual identification of these objects therefore becomes possible.
  • In another example, the set patterns A, B, and C are all the same as on the surface 231, but the places where these set patterns are arranged are interchanged, so the overall surface pattern is different.
  • The positions at which the set patterns are arranged are likewise different on the surface 234 shown in the corresponding figure.
  • Alternatively, the number of set patterns A may be increased from one to several, which also makes the overall surface pattern different. In this way, by arranging a plurality of set patterns on one surface, the overall surface patterns can be made different from each other without making all of the set patterns themselves different.
  • FIG. 9 shows an outline of a configuration of a movable object position detecting system according to a second embodiment of the present invention.
  • In FIG. 9, the same parts as those in FIG. 1 are denoted by the same reference numerals, and their description will be omitted as appropriate.
  • A plurality of television cameras 203-1 to 203-N are arranged on the ceiling 202 of the room 201. Each of these television cameras 203-1 to 203-N is fitted with an infrared light transmission filter 204 that blocks visible light and transmits only infrared light. On the floor 205, a plurality of objects 206-1 to 206-M are movably arranged as moving objects, and two-dimensional bit patterns 301-1 to 301-M composed of light emitting elements are attached to them. Arranging a plurality of television cameras 203-1 to 203-N widens the area in which the objects 206 can be recognized.
  • Each light-emitting element constituting these two-dimensional bit patterns 301 emits infrared light having a wide range of directivity (no directivity).
  • Although the television cameras 203-1 to 203-N of this embodiment are sensitive to a wide range from visible light to infrared light, they selectively receive only infrared light because the infrared light transmission filter 204 is mounted on each of them.
  • The video signals 211-1 to 211-N output from the television cameras 203-1 to 203-N are transmitted through the video transmission cables 212 and input to the image processing device 213A.
  • The image processing device 213A processes the video signals 211-1 to 211-N to calculate the coordinate position of each object 206-1 to 206-M in the X-Y plane, that is, the plane parallel to the floor surface 205.
  • In addition, the rotation (posture) of the part to which each two-dimensional bit pattern 301-1 to 301-M is attached is determined.
  • Information representing the X and Y coordinate positions and the orientation obtained as the result of this determination (position information) is transmitted from the first antenna 215.
  • Each object 206-1 to 206-M is provided with a second antenna and receives this position information.
  • The configuration of the image processing device 213A is the same as that of the first embodiment, except for a slight difference in the control program stored in a storage medium (not shown).
  • The communication path between the television cameras 203-1 to 203-N and the image processing device 213A is not limited to the cables 212 and may be a wireless connection.
  • FIG. 10 shows the configuration of a two-dimensional bit pattern.
  • A line connecting the first light emitting element 241-1 to the third light emitting element 241-3 defines the X-axis direction, and a line connecting the third light emitting element 241-3 to the ninth light emitting element 241-9 defines the Y-axis direction.
  • FIG. 11 illustrates a flow of a process of identifying each object by the image processing apparatus according to the second embodiment.
  • Such an object identification process is repeated at a required period, for example, once a second.
  • The CPU (not shown) of the image processing device 213A selects the objects 206 that require identification at each point in time and instructs all of them to emit light with the initial light emission pattern (step S321).
  • This instruction is transmitted wirelessly from the first antenna 215 to each object 206. That is, light emission is instructed only to the relevant target objects 206 in order to minimize the possibility of noise.
  • FIG. 12 shows an example of an initial light emission pattern in which each object selectively emits light using nine light emitting elements based on this instruction.
  • The initial light emission pattern is achieved by turning on all of the light emitting elements 241-1 to 241-9 except the third light emitting element 241-3, that is, the light emitting elements 241-1, 241-2 and 241-4 to 241-9.
  • A process of complementing the video signal 211 captured by one TV camera 203 with that captured by another also becomes possible.
  • Even when an object 206 is present at a distance, the overall amount of light emitted by its two-dimensional bit pattern 301 is large in this initial pattern, which facilitates initial position confirmation.
  • Next, based on the received light pattern of each object 206 that emitted light in the initial light emission pattern, the CPU grasps the position of each object and its posture as a rotation state with respect to the reference coordinates (world coordinates) of the room 201 (step S322). However, when light emission is instructed to a plurality of objects at the same time with the same initial light emission pattern, the objects cannot yet be individually identified at this stage. The CPU therefore instructs each object 206 whose arrangement has been grasped to emit light, this time in its unique two-dimensional bit pattern 301 (step S323). This instruction is transmitted wirelessly to the corresponding object 206 as before.
  • FIG. 13 shows a two-dimensional bit pattern of the first object as an example
  • FIG. 14 similarly shows a two-dimensional bit pattern of the second object.
  • As shown in this example, these two-dimensional bit patterns 301-1 and 301-2 differ from each other, and each pattern is registered in advance in the storage medium on the image processing device 213A side. The CPU can therefore identify these objects 206 by pattern matching (step S324).
  • In this way, the image processing device 213A can determine the position and rotation state of each object 206 at each point in time. Instead of performing such a two-step process, it is also possible to call each object 206 individually at a different time and to identify, as the called object, the object that lights up in response to the call or that starts lighting in a predetermined cycle in response to it. Of course, each object 206 may also simply be made to emit light in its own unique two-dimensional bit pattern 301.
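  • A hedged sketch of the pattern-matching step S324 alone is shown below: an observed on/off grid, assumed to have already been de-rotated using the posture obtained in step S322, is compared against two-dimensional bit patterns registered in advance. The registry contents are invented for illustration.

```python
# Sketch of step S324: match an observed (already de-rotated) 3x3 on/off grid
# against the registered two-dimensional bit patterns. Registry is invented.

REGISTRY = {
    "object_206_1": ((1, 0, 1), (0, 1, 0), (1, 0, 0)),
    "object_206_2": ((1, 1, 0), (0, 0, 1), (0, 1, 1)),
}

def match_bit_pattern(observed, registry=REGISTRY):
    """observed: 3x3 tuple-of-tuples of 0/1 read from the camera image."""
    for name, pattern in registry.items():
        if observed == pattern:
            return name
    return None          # unknown pattern: fall back to calling objects individually

observed = ((1, 1, 0), (0, 0, 1), (0, 1, 1))
print(match_bit_pattern(observed))   # -> "object_206_2"
```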
  • The video signals 211-1 to 211-N from the television cameras 203-1 to 203-N can also be processed jointly to determine the three-dimensional positions or orientations of a plurality of locations in chronological order.
  • When the present invention is applied to a robot system used for nursing care, the bit patterns can be attached not only to the body and arms of the robot, but also to the patient's clothing, bedding, and other objects such as the chopsticks used for eating.
  • By arranging the two-dimensional bit patterns 301 in this way, positional information on the robot and on each part of the patient can be obtained, and fine position control for nursing care becomes possible. In places where on/off control of the light emission of a two-dimensional bit pattern 301 is difficult, a light emitting component that constantly emits infrared light in a specified two-dimensional bit pattern 301 may be installed where necessary. For example, a tape-shaped component that absorbs energy such as visible light and emits light in a specific two-dimensional bit pattern 301 can be attached to the required location.
  • FIG. 15 shows a one-dimensional bit pattern that can be used in place of the two-dimensional bit pattern described above.
  • This one-dimensional bit pattern 340 is formed by a plurality of light emitting elements 341-1 to 341-K arranged linearly at equal intervals in the X-axis direction of the object 206 shown in FIG. 9.
  • The first light emitting element 341-1 is a start bit located at the origin of the coordinates, and the K-th light emitting element 341-K is an end bit located at the final position of the bit string in the X-axis direction.
  • The distance L0 between the first light emitting element 341-1 and the K-th light emitting element 341-K is a predetermined constant length.
  • The first light emitting element 341-1 and the second light emitting element 341-2 are always on (lit), and the (K-1)-th light emitting element 341-(K-1) is always off.
  • From the positions of the first light emitting element 341-1 and the second light emitting element 341-2, the interval at which the light emitting elements are arranged can be determined. Then, by reading the bit string 342 (on/off information) formed by the third to (K-2)-th light emitting elements 341-3 to 341-(K-2), which are used for coding, the object 206 (see FIG. 9) on which the one-dimensional bit pattern 340 is emitting can be specified, and other information output from the object 206 can be read.
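  • A sketch of decoding such a one-dimensional pattern is given below, under the assumptions that the total number of elements K and the end-to-end distance L0 are known and that detected lit positions can be snapped to the nearest element index; the numeric values are illustrative only.

```python
# Sketch, under stated assumptions, of reading the one-dimensional bit pattern:
# elements 1, 2 and K frame the pattern, element K-1 is always off, and
# elements 3 .. K-2 carry the coded bits. K, L0, and positions are examples.

def decode_1d_pattern(lit_positions, k=10, l0=0.09):
    """lit_positions: distances (metres) of lit emitters from the start bit."""
    pitch = l0 / (k - 1)                                   # equal spacing on the X axis
    lit = {round(p / pitch) + 1 for p in lit_positions}    # 1-based element indices
    assert {1, 2, k} <= lit and (k - 1) not in lit, "framing bits are fixed"
    return [1 if i in lit else 0 for i in range(3, k - 1)]  # bits of elements 3 .. K-2

# Elements 1, 2 and 10 lit (framing) plus data elements 4 and 6 lit:
positions = [0.00, 0.01, 0.03, 0.05, 0.09]
print(decode_1d_pattern(positions))   # -> [0, 1, 0, 1, 0, 0]
```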
  • Figure 16 shows a light emitting unit suitable for use in an environment where the distance between the television camera and the object is considerably large.
  • This light emitting unit 361 has a structure in which light emitting elements 362, which are infrared light emitting diodes, are densely arranged within a circle 363 of a predetermined diameter.
  • If each bit of the two-dimensional bit patterns 301-1 to 301-M shown in FIG. 9 is formed by one such light emitting unit 361 instead of a single light emitting element, sufficient recognition can be ensured even when the light emission amount of an individual element would be insufficient because of the distance or the like.
  • Each object 206 may also change its pattern, such as the two-dimensional bit pattern, at a predetermined cycle.
  • FIG. 17 shows the relationship between the shooting cycle of the television camera and the switching cycle of the light emission of the light emitting element.
  • Part (A) of FIG. 17 shows the shooting cycle of the television camera 203 (see FIG. 1), repeated at a period Tc. As shown in part (B), the on/off bit pattern formed by the light emitting elements is switched at periods Tp1, Tp2, and so on, each several times longer than the period Tc. This allows the TV camera 203 to reliably read the individual patterns, so that not only can the object 206 be identified, but other information can also be transmitted from the object 206 to the image processing device 213.
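  • The effect of holding each pattern for several camera periods can be sketched as follows: if every transmitted pattern persists for at least a minimum number of frames, the sequence can be recovered by collapsing runs of identical frame-level readings and discarding runs that are too short (for example a frame caught mid-switch). The frame data and threshold below are invented.

```python
# Small sketch of why the switching period Tp is several times the shooting
# cycle Tc: recover the transmitted pattern sequence from per-frame readings
# by collapsing runs and rejecting runs shorter than `min_frames`.

def decode_switched_patterns(frame_readings, min_frames=3):
    runs, current, count = [], None, 0
    for reading in frame_readings:
        if reading == current:
            count += 1
        else:
            if current is not None and count >= min_frames:
                runs.append(current)
            current, count = reading, 1
    if current is not None and count >= min_frames:
        runs.append(current)
    return runs

frames = ["ID_5", "ID_5", "ID_5", "ID_5", "ID_9", "ID_9", "ID_9",
          "ID_2", "ID_2", "ID_2", "ID_2"]
print(decode_switched_patterns(frames))   # -> ['ID_5', 'ID_9', 'ID_2']
```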
  • In the embodiments described above, one or more television cameras 203 are fixed to the ceiling 202, but they may instead be moved while estimating the position to which the object 206 will move.
  • The image processing device 213 need only calculate the coordinate position of each object 206 and inform the object 206 of that position.
  • Alternatively, the image processing device 213 may use the position information to plan the operation of each movable object and send information to each of these movable objects in the form of an operation command.
  • In the embodiments described above, the information from the object 206 is acquired using infrared light emitting diodes, but a similar movable object position detection system can of course be configured using the visible light region.
  • In the embodiments described above, the movable object is specified from the light emission pattern of the light emitting means. However, when only a short time has elapsed since the previous imaging, the light emitting means will still be located near its previous position, and tracking on the image may be based on this fact. This exploits two characteristics: the processing speed of the image processing device is high, and the moving speed of the movable object is limited. By tracking on the image the light emitting means associated with a movable object that has once been identified, the object can be identified continuously without processing such as pattern matching, and a movable object that has been lost can be identified again by processing such as pattern matching.
  • Temporary stops during the movement of the movable objects have not been mentioned in the embodiments, but actively using them makes it easy to specify a desired movable object. For example, by stopping the other movable objects, moving only the movable object to be identified, and detecting the light emitting means that is moving during that time, the movable object can easily be identified. Conversely, by detecting the light emitting means that remains stopped, the movable object on which that light emitting means is arranged can be specified.
  • The posture may also be obtained in another way. For example, by instructing a movable object to move in a specific direction in its own coordinate system and observing that movement, the number of light emitting means installed on the movable object can be reduced to one, unlike in the embodiments. Even if part of the object is hidden by another object, a movable object can then be identified and its posture obtained as long as a single light emitting element is visible.
  • In the embodiments, the coordinate system of the TV camera 203 is described as coinciding with the reference coordinate system (world coordinate system) of the environment. However, just as the object 206 has its own coordinate system (object coordinate system), each TV camera 203 may be given its own unique coordinate system. In that case, a unique coordinate system is assigned to each TV camera 203 in order to control its pan and tilt angles, and that coordinate system is converted into the world coordinate system as necessary. Industrial Applicability
  • As described above, according to the present invention, the imaging means side identifies each light emitting means and notifies the movable object side of information indicating their positions. The movable object side can therefore grasp not only its own position and the position and orientation of the light emitting means attached to it, but also positional information about its course and surroundings, so the burden of these processes is reduced. Moreover, if the imaging means is installed at a position from which the movable objects are easily observed, such as the ceiling of the room in which they move, each movable object is easier to grasp than when the movable object itself captures images and determines the position of each object.
  • Further, when the notification means notifies a predetermined movable object of position information indicating the position in the space of each light emitting means identified by the identification means and the posture, as the spatial arrangement, of the movable object, the movable object side can grasp not only its own position and the position and orientation of the light emitting means attached to it, but also the positions and orientations of other movable objects on its course and in its surroundings, and the burden of these processes is reduced.
  • When the light emitting means outputs infrared light outside the visible region, humans do not notice the presence of these light emitting means. When the imaging means selectively receives infrared light, the influence of noise due to visible light or ultraviolet light can be removed, and the processing reliability is increased accordingly.
  • When the light emitting means has a two-dimensional arrangement structure in which light emitting elements are arranged in a matrix, different light emission patterns can be obtained using identical light emitting means, so the light emitting means can be used as a common component, which reduces cost and simplifies component management.
  • Since the light emission pattern output by one light emitting means can be changed sequentially over time, it becomes possible to return several responses in sequence, such as a response to a general call or a response to an individual call.
  • When the light emitting means has a one-dimensional arrangement structure in which the light emitting elements are arranged at regular intervals on the X axis, it can be attached even to an elongated object. Since this structure is simpler than the two-dimensional arrangement structure, the cost can also be reduced.
  • Since the distance between the first light emitting element and the light emitting element arranged at the other end is predetermined, information such as ID information can be identified, once the total number of light emitting elements is known, from the on/off pattern indicating which of the predetermined number of light emitting elements between them emit light.
  • It is also possible to analyse the on/off patterns of the light emitting elements without knowing the total number of light emitting elements in advance.
  • When each light emitting element is an aggregate of light emitting diodes, the amount of light and the size of the light emitting area are increased, so that pattern recognition can be performed reliably even when the distance between the light emitting diodes and the imaging means is large.
  • When the notification means notifies not only the position information of the light emitting means of the notification destination but also that of the other light emitting means, the side receiving the notification is freed from the image processing needed to obtain such information. Further, since the imaging means reports position information acquired from an easily viewable position such as the ceiling, this information is likely to be more reliable than information acquired by the movable object itself.
  • When the tracking means is provided, collation of the light emission pattern by the identification means does not need to be performed every time, and identification can be carried out easily by the tracking means, so that even multiple movable objects can be identified without imposing an excessive load.
  • When tracking by the tracking means is not possible, the corresponding object is identified from the light emission pattern received by the imaging means using the identification means, so that efficient processing can be performed.
  • According to the invention of claim 10, efficient tracking is enabled by determining the size of the search area of the light emitting means to be tracked by the tracking means according to the moving speed of the movable object.
  • By providing a light emission selecting means that instructs only a specific light emitting means to emit light, a light emitting means or movable object that has been temporarily lost because it was hidden by an obstacle or the like, or a light emitting means or movable object required for a specific process, can easily be specified.
  • Further, by moving or stopping only the movable object to which the specific light emitting means is attached during a specific period and identifying the light emitting means that moves or stops during that period as the specific light emitting means, a specific object can be identified easily.
  • Finally, each movable object has its own coordinate system, and by observing with the imaging means in the reference coordinate system, the rotational orientation of the movable object can be determined. In-plane movement of the movable object in a specific direction in its object coordinate system is achieved by an instruction from the specific direction movement instructing means, so it is not necessary to arrange a plurality of light emitting means on one movable object in order to determine the rotation angle. Furthermore, the rotation angle can be determined even in a situation where some of a plurality of light emitting means arranged on a movable object cannot be captured by the imaging means because of an obstacle or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention concerns a movable body position detection system capable of easily recognizing the positions and postures of movable bodies such as robots. A camera (203) fitted with an infrared transmission filter (204) is placed on the ceiling (202) of a room (201). An object (206) that can move freely and is provided with light emitting elements (207-209) emitting infrared rays is placed on the floor (205). A video signal (211) generated by the camera (203) is transferred to an image processor (213), from which position information concerning the X and Y coordinate position and the posture of the object is sent via a first antenna (215). The object (206) receives this position information via a second antenna (217). The object (206) itself therefore does not need to produce detection information, and the infrared rays emitted by it are not harmful to human beings.
PCT/JP2001/007877 2000-09-11 2001-09-11 Systeme de detection de la position d'un corps mobile WO2002023122A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2001284519A AU2001284519A1 (en) 2000-09-11 2001-09-11 Mobile body position detecting system
JP2002527723A JPWO2002023122A1 (ja) 2000-09-11 2001-09-11 可動物体位置検出システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2000-275318 2000-09-11
JP2000275318 2000-09-11

Publications (1)

Publication Number Publication Date
WO2002023122A1 true WO2002023122A1 (fr) 2002-03-21

Family

ID=18760961

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2001/007877 WO2002023122A1 (fr) 2000-09-11 2001-09-11 Systeme de detection de la position d'un corps mobile

Country Status (3)

Country Link
JP (1) JPWO2002023122A1 (fr)
AU (1) AU2001284519A1 (fr)
WO (1) WO2002023122A1 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007071660A (ja) * 2005-09-06 2007-03-22 Toshiba Corp 遠隔検査における作業位置計測方法およびその装置
JP2008058204A (ja) * 2006-08-31 2008-03-13 Fuji Xerox Co Ltd 位置計測システム
JP2008209189A (ja) * 2007-02-26 2008-09-11 Hitachi Ltd 水中移動装置の位置測定システム
US7750969B2 (en) 2004-07-21 2010-07-06 Japan Science And Technology Agency Camera calibration system and three-dimensional measuring system
JP2013038489A (ja) * 2011-08-04 2013-02-21 Hitachi-Ge Nuclear Energy Ltd 情報伝送システム、移動体位置検知装置、発信装置、および受信装置
JP2013532451A (ja) * 2010-07-01 2013-08-15 サーブ アクティエボラーグ 倉庫内の物体の位置を特定するための方法及び装置
JP2015531478A (ja) * 2012-09-07 2015-11-02 ライカ ジオシステムズ アクチエンゲゼルシャフトLeica Geosystems AG 測定範囲を拡大するためのハイブリッド結像方法を用いるレーザトラッカ
JP2016526673A (ja) * 2013-06-19 2016-09-05 ザ・ボーイング・カンパニーThe Boeing Company 移動可能な対象物体の場所を追跡するためのシステム及び方法
SE1951505A1 (en) * 2019-12-19 2021-06-20 Husqvarna Ab A calibration device for a floor surfacing system
US11762394B2 (en) 2018-11-01 2023-09-19 Nec Corporation Position detection apparatus, position detection system, remote control apparatus, remote control system, position detection method, and program
EP4129052A4 (fr) * 2020-03-23 2024-04-03 FJ Dynamics Technology Co., Ltd Système de navigation intérieure et procédé de navigation intérieure basés sur la vision

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0213939A2 (fr) * 1985-08-30 1987-03-11 Texas Instruments Incorporated Contrôleur pour un véhicule mobile utilisant les dates absolues retardées de position pour le guidage et la navigation
JPS63111402A (ja) * 1986-10-29 1988-05-16 Yaskawa Electric Mfg Co Ltd 位置計測装置
JPH01112490A (ja) * 1987-10-27 1989-05-01 Kenro Motoda 可動体の信号伝送方式及び位置検出・作動制御方式
JP2511082B2 (ja) * 1987-12-23 1996-06-26 株式会社日立製作所 移動物体の三次元位置測定装置
JPH10149435A (ja) * 1996-11-19 1998-06-02 Kunikatsu Takase 環境認識システム並びに該システムに用いるマーク
JP2000055657A (ja) * 1998-08-05 2000-02-25 Clarion Co Ltd 測位装置
JP6080403B2 (ja) * 2011-06-30 2017-02-15 株式会社前川製作所 枝肉の分割方法及び装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0213939A2 (fr) * 1985-08-30 1987-03-11 Texas Instruments Incorporated Contrôleur pour un véhicule mobile utilisant les dates absolues retardées de position pour le guidage et la navigation
JPS63111402A (ja) * 1986-10-29 1988-05-16 Yaskawa Electric Mfg Co Ltd 位置計測装置
JPH01112490A (ja) * 1987-10-27 1989-05-01 Kenro Motoda 可動体の信号伝送方式及び位置検出・作動制御方式
JP2511082B2 (ja) * 1987-12-23 1996-06-26 株式会社日立製作所 移動物体の三次元位置測定装置
JPH10149435A (ja) * 1996-11-19 1998-06-02 Kunikatsu Takase 環境認識システム並びに該システムに用いるマーク
JP2000055657A (ja) * 1998-08-05 2000-02-25 Clarion Co Ltd 測位装置
JP6080403B2 (ja) * 2011-06-30 2017-02-15 株式会社前川製作所 枝肉の分割方法及び装置

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7750969B2 (en) 2004-07-21 2010-07-06 Japan Science And Technology Agency Camera calibration system and three-dimensional measuring system
JP2007071660A (ja) * 2005-09-06 2007-03-22 Toshiba Corp 遠隔検査における作業位置計測方法およびその装置
US8085296B2 (en) 2005-09-06 2011-12-27 Kabushiki Kaisha Toshiba Method and apparatus for measuring an operating position in a remote inspection
JP2008058204A (ja) * 2006-08-31 2008-03-13 Fuji Xerox Co Ltd 位置計測システム
JP2008209189A (ja) * 2007-02-26 2008-09-11 Hitachi Ltd 水中移動装置の位置測定システム
JP2013532451A (ja) * 2010-07-01 2013-08-15 サーブ アクティエボラーグ 倉庫内の物体の位置を特定するための方法及び装置
JP2013038489A (ja) * 2011-08-04 2013-02-21 Hitachi-Ge Nuclear Energy Ltd 情報伝送システム、移動体位置検知装置、発信装置、および受信装置
JP2015531478A (ja) * 2012-09-07 2015-11-02 ライカ ジオシステムズ アクチエンゲゼルシャフトLeica Geosystems AG 測定範囲を拡大するためのハイブリッド結像方法を用いるレーザトラッカ
US9864062B2 (en) 2012-09-07 2018-01-09 Leica Geosystems Ag Laser tracker with hybrid imaging method for extending the measuring range
JP2016526673A (ja) * 2013-06-19 2016-09-05 ザ・ボーイング・カンパニーThe Boeing Company 移動可能な対象物体の場所を追跡するためのシステム及び方法
US11762394B2 (en) 2018-11-01 2023-09-19 Nec Corporation Position detection apparatus, position detection system, remote control apparatus, remote control system, position detection method, and program
SE1951505A1 (en) * 2019-12-19 2021-06-20 Husqvarna Ab A calibration device for a floor surfacing system
WO2021126037A1 (fr) * 2019-12-19 2021-06-24 Husqvarna Ab Dispositif d'étalonnage pour machine de surfaçage de sol
SE543841C2 (en) * 2019-12-19 2021-08-10 Husqvarna Ab A calibration device for a floor surfacing system
EP4129052A4 (fr) * 2020-03-23 2024-04-03 FJ Dynamics Technology Co., Ltd Système de navigation intérieure et procédé de navigation intérieure basés sur la vision

Also Published As

Publication number Publication date
AU2001284519A1 (en) 2002-03-26
JPWO2002023122A1 (ja) 2004-01-22

Similar Documents

Publication Publication Date Title
CN101661098B (zh) 机器人餐厅多机器人自动定位系统
US7845560B2 (en) Method and apparatus for determining position and rotational orientation of an object
JP2006513504A (ja) プロジェクタによる位置および向きの読み取り
WO2002023122A1 (fr) Systeme de detection de la position d'un corps mobile
US20200171667A1 (en) Vision-based robot control system
KR101753361B1 (ko) 청소 로봇을 이용한 스마트 청소 시스템
US6867699B2 (en) Method of and apparatus for actuating an operation
Kruse et al. Camera-based monitoring system for mobile robot guidance
CN111405735B (zh) 一种自动焦点定位式舞台追灯追踪系统及其使用方法
KR20090109830A (ko) 로봇 지도 생성 방법 및 로봇 지도 이용 방법 및 로봇지도를 가지는 로봇
JP2002178283A (ja) 自律ロボット
KR102023699B1 (ko) 코드 인식을 통한 위치 인식 및 이동경로 설정 방법과 무인 모빌리티와 운영시스템
JP6332128B2 (ja) 物体認識装置、及び物体認識方法
JP2004230539A (ja) ロボットによる物体の位置と姿勢検出方法及び検出装置
US20060202826A1 (en) Method of and apparatus for actuating an operation
JP2005241323A (ja) 撮像システム及び校正方法
JP2014139745A (ja) 機器管理システム、機器管理装置、機器管理方法及びプログラム
CN113597362B (zh) 用于确定机器人坐标系与可移动装置坐标系之间的关系的方法和控制装置
Jung et al. Tracking multiple moving targets using a camera and laser rangefinder
CN105916278A (zh) 采用视频定位的控制设备及其操作方法
TW200924926A (en) Autonomous robot and position system and method thereof
JPH10149435A (ja) 環境認識システム並びに該システムに用いるマーク
JP2008026731A (ja) マーカー装置
JP2015058488A (ja) ロボット制御システム、ロボット、ロボット制御方法及びプログラム
Fiala A robot control and augmented reality interface for multiple robots

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase