WO2010136922A1 - Re-calibration of pre-recorded images during interventions using a needle device - Google Patents
- Publication number
- WO2010136922A1 (PCT/IB2010/052030)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- needle
- image
- sensor
- spectroscopy
- information
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0062—Arrangements for scanning
- A61B5/0066—Optical coherence imaging
- A61B5/0068—Confocal scanning
- A61B5/0071—Measuring for diagnostic purposes using light, by measuring fluorescence emission
- A61B5/0073—Measuring for diagnostic purposes using light, by tomography, i.e. reconstruction of 3D images from 2D projections
- A61B5/0075—Measuring for diagnostic purposes using light, by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
- A61B5/0082—Measuring for diagnostic purposes using light, adapted for particular medical purposes
- A61B5/0084—Measuring for diagnostic purposes using light, adapted for introduction into the body, e.g. by catheters
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6846—Arrangements of detecting, measuring or recording means specially adapted to be brought in contact with an internal body part, i.e. invasive
- A61B5/6847—Arrangements of detecting, measuring or recording means mounted on an invasive device
- A61B5/6848—Needles
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/12—Arrangements for detecting or locating foreign bodies
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices involving processing of medical diagnostic data
- A61B6/5229—Devices combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5247—Devices combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures
- A61B8/0841—Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures for locating instruments
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices involving processing of medical diagnostic data
- A61B8/5238—Devices for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
- A61B17/3417—Details of tips or shafts, e.g. grooves, expandable, bendable; Multiple coaxial sliding cannulas, e.g. for dilating
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61B2090/3614—Image-producing devices, e.g. surgical cameras using optical fibre
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/374—NMR or MRI
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
- A61B2090/3782—Surgical systems with images on a monitor during operation using ultrasound transmitter or receiver in catheter or minimal invasive instrument
- A61B2090/3784—Surgical systems with images on a monitor during operation using ultrasound transmitter or receiver in catheter or minimal invasive instrument, both receiver and transmitter being in the instrument or receiver being also transmitter
Definitions
- the present invention relates to an interventional system including an imaging device and a needle device. Further, the invention relates to a method of combining pre-recorded images with live images of an object of interest. Particularly, the invention relates to a system and a method for providing a re-calibration of an overlay of pre-recorded images and live images.
- an interventionalist may use pre-recorded images and live imaging to navigate a medical device, such as a needle, to an intended location.
- a detailed image of the body is taken. This image is often three-dimensional.
- the coordinate system of this three-dimensional image is coupled to the location of the table and/or the medical imaging equipment that is being used during the intervention.
- an overlay of the live images that are being taken during the intervention and the pre-recorded image can be made.
- the accuracy of the overlay obviously depends on the accuracy of the coordinate system and, for instance, the accuracy of the position of the table and the live imaging equipment. More importantly, the overlay accuracy also depends on the movement of the patient, for instance due to breathing. For some interventions, such as biopsies of small, deep-seated lesions, the accuracy of the overlay is not sufficient.
- tissue information may be provided by a so-called photonic needle, i.e. by a needle device including an optical fiber.
- the optical fiber may represent a sensor by means of which the needle device may detect features of tissue. Such features may also be detectable in the images to be combined.
- the essential feature of the invention is that the information from the sensor of the needle device is combined with the information provided by the live and pre-recorded images to enhance the overlay accuracy between pre-recorded and live images. This is achieved by re-calibrating the coordinate system of the pre-recorded image with the coordinate system of a live image, utilizing features that the sensor of the needle device detects, wherein these features are present in the pre-recorded images and more or less present in the live images.
- a coordinate system or landmark is identified in each of the images of which an overlay should be made. In some instances a live image will not show all the details of a pre-recorded image.
- the sensor of the needle device, the position of which is locatable in the live image, provides additional information about the tissue at the location of the sensor.
- This additional information will be used to provide a better identification of the coordinate system or landmark in the live image, wherein this coordinate system or landmark may also be identifiable in the pre-recorded image so that the accuracy of the overlay may be improved by the system according to the invention.
- the tissue volume under analysis can be exactly located in the coordinate system of the pre-recorded, i.e. preoperative image, because tissue information is also available from the pre-recorded image.
- the position of this tissue volume can be determined with respect to the coordinate system of the live images as well. In this way, since the tissue location is known in the coordinate system of the pre-operative image and in the coordinate system of the live images, a re-calibration of the overlay registration can be done.
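the re-calibration described above can be sketched in code. The following is an illustrative, translation-only model; the function name, the least-squares averaging, and the example coordinates are assumptions for illustration and not part of the disclosure. Matched tissue features, located both in the pre-operative frame and in the live frame, yield the offset that re-aligns the overlay.

```python
import numpy as np

def recalibrate_overlay(preop_points, live_points):
    """Estimate a corrective translation between the pre-operative and
    live coordinate systems from matched tissue features.

    Both arguments are (N, 3) arrays holding the positions of the same
    tissue features, located in the pre-recorded image and in the live
    image respectively.  Returns the offset to add to pre-operative
    coordinates to align them with the live frame (translation only;
    a full registration could additionally solve for rotation).
    """
    preop = np.asarray(preop_points, dtype=float)
    live = np.asarray(live_points, dtype=float)
    # Least-squares translation: the mean displacement of the features.
    return (live - preop).mean(axis=0)

# Hypothetical example: a tissue transition detected by the needle sensor
# appears 2 mm deeper in the live image than in the pre-recorded image.
offset = recalibrate_overlay([[10.0, 20.0, 30.0]], [[10.0, 20.0, 32.0]])
```

With several matched features, averaging the displacements suppresses noise in any single detection.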
- an interventional system comprises an imaging device providing images of an object, a needle device, and a processing device.
- the needle device comprises a sensor for providing data corresponding to tissue properties.
- the processing device is adapted to perform an overlay registration of pre-recorded images and live images provided by the imaging device, utilizing the data from the sensor.
- the interventional system may further comprise an analyzing device, wherein the analyzing device may be coupled to the sensor and may be adapted to process the data from the sensor, thereby generating information about tissue properties.
- the sensor of the needle device may comprise an optical fiber capable of emitting and receiving of light.
- the analyzing device may comprise a console for spectroscopy, wherein the console and the optical fiber may be connected to each other.
- the console for spectroscopy may be adapted to provide information from one of the group consisting of reflectance spectroscopy, fluorescence spectroscopy, autofluorescence spectroscopy, differential path length spectroscopy, Raman spectroscopy, optical coherence tomography, light scattering spectroscopy, and multi-photon fluorescence spectroscopy.
- the sensor of the needle device may comprise elements of a microscopic imaging capability.
- Such elements may include an optical fiber, a bundle of optical fibers, a lens and an actuation means.
- the actuation means may displace the optical fiber(s) together with the lens, or may displace only the fiber(s) or the lens.
- the imaging capability may also be realized just with a bundle of fibers and a lens, without an actuation means. With such an imaging capability, it may be possible to form microscopic images of the tissue in front of the needle device.
- the imaging device may be a non-invasive imaging modality being one of the group consisting of an X-ray device, a computed tomography device, a magnetic resonance tomography device, and an ultrasound device.
- the needle device may comprise a structure and material capable of being visualized by the imaging device.
- an integrated system comprises a non-invasive imaging modality that can image the inside of the body, a needle device including a sensor including at least one fiber, the fiber being connected to a console capable of probing the tissue in front of or near the tip of the needle device.
- the non-invasive imaging modality can image the needle device inside the body, allowing coarse guidance of the needle device based on the non-invasive imaging modality.
- the optical modality is used to fine-position the tip portion of the needle device in the targeted tissue.
- the optical information is registered into the image of the non-invasive imaging modality.
- the optical information is registered in the 3-dimensional coordinate frame of the image.
- the needle device might be, on the one hand, a biopsy needle, a cannula, or a trocar or, on the other hand, might also be a catheter adapted to receive a needle by which, for example, a biopsy is actually performed.
- the "tissue" investigated by the system may comprise all kinds of living or dead tissue, e.g. human tissue, particularly epithelium tissue (e.g. surface of the skin and inner lining of the digestive tract), connective tissue (e.g. blood, bone tissue), muscle tissue and nervous tissue (e.g. brain, spinal cord and peripheral nervous system).
- tissue may further comprise food products, biomaterials, synthetic materials, fluid or viscous substances, etc.
- a method of combining pre-recorded images with live images of an object of interest comprises the steps of making an overlay of pre-recorded images and live images, acquiring local tissue information, and re-calibrating the overlay of the images utilizing the acquired local tissue information.
- the method may further comprise the steps of receiving pre-recorded images from a database, and receiving live images from an imaging device.
- the local tissue information may be acquired by means of a needle device.
- the method step of making an overlay may include defining a coordinate system in a pre-recorded image and identifying a corresponding coordinate system in a live image or vice versa.
- the method step of re-calibrating the overlay may include identifying structures in the pre-recorded image that correspond to the acquired information.
- the pre-recorded images, the live images and the local tissue information may be real-time processed to calculate an error in the overlay.
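the overlay error mentioned above can be quantified, for example, as the root-mean-square distance between corresponding landmark positions in the two coordinate frames. The sketch below is a minimal illustration under that assumption; the function name and the RMS choice are not specified by the disclosure.

```python
import numpy as np

def overlay_error(preop_points, live_points):
    """Root-mean-square distance (e.g. in mm) between corresponding
    landmark positions in the pre-recorded and live coordinate frames.
    A large value indicates the overlay registration needs re-calibration."""
    diff = np.asarray(live_points, dtype=float) - np.asarray(preop_points, dtype=float)
    # Per-landmark Euclidean distances, then RMS over all landmarks.
    return float(np.sqrt((np.linalg.norm(diff, axis=1) ** 2).mean()))
```

Such a scalar error could be monitored in real time and used to trigger a re-calibration when it exceeds a clinically acceptable threshold.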
- the invention also relates to a computer program for a processing device, such that the method according to the invention might be executed on an appropriate system.
- the computer program is preferably loaded into a working memory of a data processor.
- the data processor is thus equipped to carry out the method of the invention.
- the invention relates to a computer readable medium, such as a CD-ROM, on which the computer program may be stored.
- the computer program may also be presented over a network like the world wide web and can be downloaded into the working memory of a data processor from such a network.
- Figure 1 shows a needle device according to the invention, including sensor modalities.
- Figure 2 shows a detail view of the tip portion of a needle device including a lens system of a sensor, according to an exemplary embodiment of the needle device.
- Figure 3 shows an interventional system according to the invention.
- Figure 4 illustrates examples of images showing a needle device in an object, wherein the tip of the needle device has different distances to a target structure.
- Figure 5 is a first illustration of wavelengths for several spectra.
- Figure 6 is a second illustration of wavelengths for several spectra.
- Figure 7 is an illustration of wavelengths for three exemplary spectra.
- Figure 8 is a flow chart of the method according to the invention.
- a needle device 200 as part of a system according to an embodiment of the invention, comprises a shaft 210, a bevel at the tip portion of the shaft, at least one fiber 230, and a holder part 290.
- the shaft may have a length of 150 mm and a diameter of 1.3 mm.
- the bevel may enclose an angle of 20° with the shaft axis.
- the fiber 230 which runs from the distal end, i.e. the surface of the bevel, through the shaft 210 to the holder part 290, passes through an opening of the holder part 290 out of the needle.
- in FIG. 1, the elements of a system according to the invention are schematically illustrated.
- the system includes the needle device 200, a light source 110, a light detector 120, a processing unit 620 and a monitor 610.
- the processing unit 620 is capable of controlling the light source 110 to emit light into the fiber 230 such that light will be emitted through the distal end surface of the fiber 230 at the top of the bevel into surrounding tissue.
- the processing unit 620 will process the data corresponding to the electrical signals, so that the processed data might be visualized on a monitor 610. Based on said visualized data, it might be possible to diagnose whether a special type of tissue is in front of the tip portion of the needle 200. It should be noted that one subset consisting of a plurality of fibers may be used to direct light into the tissue, while another subset of fibers is used to collect the light emanating from the tissue in which the needle is located.
- the ratio of outgoing (reflected) light to incident light is defined as the reflectance.
- the reflectance spectra of different types of tissue are in general different due to the different molecular constitutions of the tissues. By measuring these spectra, it may therefore be possible to distinguish different tissues from each other.
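one way to distinguish tissues from their reflectance spectra is a simple similarity measure against reference spectra. The sketch below uses the correlation coefficient; the reference spectra, tissue names, and the correlation-based measure are hypothetical illustrations, not values from the disclosure.

```python
import numpy as np

# Hypothetical, normalized reflectance spectra sampled at a few wavelengths.
REFERENCE_SPECTRA = {
    "muscle": np.array([0.2, 0.4, 0.6, 0.5, 0.3]),
    "fat":    np.array([0.6, 0.7, 0.8, 0.8, 0.7]),
}

def classify_tissue(measured):
    """Return the reference tissue type whose reflectance spectrum
    correlates best with the measured spectrum."""
    measured = np.asarray(measured, dtype=float)
    def correlation(reference):
        # Pearson correlation between measured and reference spectrum.
        return float(np.corrcoef(measured, reference)[0, 1])
    return max(REFERENCE_SPECTRA, key=lambda name: correlation(REFERENCE_SPECTRA[name]))
```

In practice the reference library would be built from in-vivo measurements, and more robust spectral similarity measures could replace the plain correlation.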
- because the optical method has only a limited penetration depth (the imaging depth is only a few millimeters up to a few centimeters), guiding the needle or cannula without the guidance of the non-invasive modality is difficult, as there is no overview of where the needle or cannula is in space.
- FIG. 2 is a schematic cross-sectional drawing of an exemplary embodiment of the needle device, according to which a sensor 220 is realized by a lens system having a lens 250 and an actuation system 260, 270.
- in order to have a compact lens system, an aspherical surface of the lens 250 is applied.
- a compact lens system can be designed suitable for mass production.
- the polymer should be a low density polymer to provide easy displacement of the lens system.
- the lens system is positioned a distance L away from the optical exit of the optical fiber 230 as defined by the mount 240.
- the distance (L) is significantly larger than a core diameter of the optical fiber 230.
- the lens system may be mounted in the shaft 210 of the needle device together with an actuation system including an electromechanical motor system with coils 270 that cooperate with magnets 260, the magnets being mechanically attached to the optical fiber 230 so as to perform scanning with the optical fiber 230 and the lens 250 by action of the motor system, wherein both an actuation of the optical fiber alone and an actuation of the optical fiber together with the lens is possible.
- the lens 250 is a single plano-aspheric lens in front of a thin, flat glass exit window 280, as evident in Figure 2.
- the aspheric lens is made of PMMA and has an entrance pupil diameter of 0.82 mm.
- the numerical aperture (NA) is 0.67 and the focal length (measured in air) is 0.678 mm.
- the lens system is optimized for a wavelength of 780 nm.
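the stated lens parameters allow a rough estimate of the achievable focal spot. The sketch below applies the standard diffraction-limit estimate d ≈ λ / (2·NA); the formula and the resulting value are a general optics illustration, not figures from the disclosure.

```python
# Lens parameters from the embodiment: λ = 780 nm, NA = 0.67.
wavelength_um = 0.780        # design wavelength, in micrometres
numerical_aperture = 0.67    # NA of the objective lens

# Diffraction-limited spot size estimate: d ≈ λ / (2 * NA).
spot_size_um = wavelength_um / (2 * numerical_aperture)
# Roughly 0.58 µm, i.e. sub-micrometre resolution is plausible in principle.
```

This is why a small, high-NA plano-aspheric lens can still support microscopic imaging of the tissue in front of the needle.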
- the exit window 280 is flat and has no optical power.
- the free working distance of the objective lens 250 must be larger than the thickness of the exit window 280.
- the objective lens 250 will be scanned in front of the exit window.
- the exit window must have a certain thickness to be robust. Typically, the thickness is larger than 0.1 mm.
- This embodiment is particularly, but not exclusively, advantageous for obtaining an improved optical sensor, particularly suited for miniature applications, e.g. for in-vivo medical applications.
- the field of view of the optical sensor may be determined directly by the transverse stroke of the optical fiber. Hence only a relatively small stroke is required. The field of view is thus effectively no longer limited by the stroke.
- since the lens system itself is only used for imaging close to the optical axis (i.e. a small field of view), it may allow for simpler (i.e. less complex and thus fewer lens elements) optical designs that ease manufacturing while still having high image resolution.
- the optical sensor is particularly suited for relatively simple and large-scale manufacturing because the lens system is displaceably mounted on the end portion of the optical fiber. From a practical point of view, this may reduce the needed precision during manufacturing which, in turn, may lower the unit price per probe. This is especially important because an endoscope, catheter or needle with the optical sensor embedded will usually be disposed of after a single use due to sanitary requirements.
- Figure 3 shows an interventional system according to an exemplary embodiment of the invention.
- the system comprises an elongated needle device 200, a sensor 220 which is located at the tip portion of the needle device, an imaging device 500 for assisting the coarse guidance, an analyzing device 100 for assisting the fine guidance, and a computing device 600.
- the analyzing device includes a light source 110 and a spectrograph as a light detector 120.
- the imaging device 500 includes a radiation source 510 and a detector array 520.
- the computing device includes a processor unit 620 for processing the signals coming from the imaging device 500 and from the analyzing device 100, and a monitor 610 for monitoring information for assisting the guidance of the biopsy device in a body.
- the interventional system comprises an image guided X-ray based needle guidance system 500 and a needle device 200 comprising a sensor, i.e. an optical fiber, which is connected with an analyzing device 100.
- the image guided needle navigation system provides integrated 2D/3D lesion imaging and interactive image guided needle advancement monitoring, all of which is coupled to the optical information obtained by the needle. The X-ray system 500 provides the coarse guidance, while the optical information received from the analyzing device 100 provides the final precise guidance to the device location.
- the system is able to interactively follow the needle device from the incision to the target point by superimposing 2D fluoroscopic images on 3D tissue reconstruction and provide molecular tissue information at every point along the needle trajectory that is registered to the position inside the body of the patient.
- the region along the needle trajectory can be scanned (scan forward and scan aside) in order to provide indications on lesion existence at the molecular level.
- the X-ray data and the position information of the needle are actively used in the optical reconstruction of the tissue in front of the needle.
- X-ray and photonic needle information is further coupled to MRI images of the same area (MR data sets can be registered with the data sets produced by the X-ray machine).
- the needle device equipped with an optical fiber may also be used, for example, to position a localization wire.
- the localization wire containing fixation means may also be equipped with a fiber.
- Another aspect of making the information from the sensor at the needle device usable for the invention is that translating the measured optical data into a tissue type can be difficult when no information about the surrounding morphology is known.
- the decision making of the tissue characterization improves when the morphology information coming from the non-invasive imaging system is available as input.
- the optical data is registered to the non-invasive imaging data; the optical information, together with the morphology information around the needle coming from the non-invasive imaging modality, is then used in translating the measured optical data into a tissue type in front of or near the needle. For instance, when the needle is in soft tissue, the optical information can be affected by whether a bone structure is close by or not. Taking this into account, a more reliable tissue characterization is possible.
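The role of the morphology prior can be sketched as follows. This is a deliberately simplified illustration: the mean-reflectance feature, the threshold values, and the bone correction are invented for this example and are not taken from the patent.

```python
import numpy as np

def classify_tissue(spectrum, bone_nearby):
    """Toy tissue classifier combining a measured reflectance spectrum
    with morphology context (is bone close by?) taken from the
    registered non-invasive image.

    spectrum: 1-D array of reflectance values over wavelength bins.
    bone_nearby: bool flag derived from the registered pre-recorded image.
    """
    mean_reflectance = float(np.mean(spectrum))
    # Illustrative assumption: bone close to the needle raises the
    # apparent reflectance of soft tissue, so the decision threshold
    # is shifted accordingly.
    threshold = 0.6 if bone_nearby else 0.5
    return "lesion" if mean_reflectance > threshold else "normal"
```

The same spectrum can thus yield a different, more reliable characterization once the surrounding morphology is taken into account.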
- a phantom, i.e. the object from which a biopsy should be taken, is placed on, for example, a C-arm bed, and the needle is mounted on a stepper motor that moves the needle in the axial direction (minimal steps of 0.25 micron).
- the needle is connected with optical fibers to a spectrometer. At least one of the fibers detects light reflected from the tissue and hence acts as an optical sensing element.
- the needle intervention consists of acquiring X-ray and fluoroscopic X-ray images while, in addition, optical reflectance spectra are measured by the fiber-containing needle coupled to a console that is connected to the X-ray system.
- FIG. 4 shows three illustrations which might be shown on a monitor to assist in guiding a needle device. Each illustration is mainly an image from an X-ray device, with an illustration of the spectrum obtained by the analyzing device, on the basis of the tissue information from the needle, added in the upper left corner.
- the fluoroscopy image of the X-ray device allows determining the relative position of the needle (elongated black line from the middle of each illustration to the upper right) with respect to the phantom (dark shadow), while the spectral information clearly shows when the small tube (black contrast line from upper left to lower right) is approached. It allows locating the needle to within 100 microns.
- while the information of the X-ray image and the optical information are exemplarily shown here in a combined image, there are various other ways to present the combined information, for instance by using colors.
- Using the information from the sensor at the needle device may also provide for the possibility to start the needle progression right away without live guidance, just on the basis of a pre-recorded image.
- a physician may judge where the needle is located approximately in the pre-recorded image.
- Figures 5 to 7 show examples of acquired spectra during a needle intervention for different positions of the needle in tissue. The higher the spectrum number, the further the needle is in the tissue.
- transitions may be clearly observed when going from one tissue type to another tissue type.
- in FIG. 7 the spectra for three different positions are illustrated. In this example the transitions are clear, and the spectra are sufficiently different to discriminate the transitions. Such soft tissue transitions may not be visible in the X-ray image; therefore, when linking the X-ray image to a pre-recorded image showing these soft tissue transitions (for instance an MRI image), these landmarks could not previously be used. With the optical information this now becomes possible.
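The transition detection described above can be sketched as a comparison of consecutive spectra along the needle path. The normalization and the threshold used here are illustrative assumptions, not the patent's prescribed method:

```python
import numpy as np

def detect_transitions(spectra, threshold=0.1):
    """Flag tissue-type transitions along the needle trajectory.

    spectra: 2-D array, one row per needle position (in acquisition
    order), columns are wavelength bins. A transition is declared at
    every position where the difference between consecutive normalized
    spectra exceeds `threshold`.
    """
    spectra = np.asarray(spectra, dtype=float)
    # Normalize each spectrum so overall intensity drift does not
    # masquerade as a tissue transition.
    norms = np.linalg.norm(spectra, axis=1, keepdims=True)
    unit = spectra / np.where(norms == 0, 1, norms)
    diffs = np.linalg.norm(np.diff(unit, axis=0), axis=1)
    return [i + 1 for i, d in enumerate(diffs) if d > threshold]
```

For instance, four acquisitions where the spectrum shape changes between the second and third position would yield a single detected transition at index 2.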
- Figure 8 is a flow chart showing the steps of a method of combining pre-recorded images with live images of an object of interest according to the invention. It will be understood that the steps described with respect to the method are major steps, wherein these major steps might be differentiated or divided into several sub-steps. Furthermore, there might also be sub-steps between these major steps. Therefore, a sub-step is only mentioned if it is important for the understanding of the principles of the method according to the invention.
- in step S1 of the method according to the invention, a pre-recorded image of the region of interest of the patient is measured and track of the coordinate system is kept.
- in step S2 an intervention using live imaging is performed.
- in step S3 an overlay of the pre-recorded and the live image is made.
- Making an overlay may include the following sub-steps:
- a feature detection step in which salient and distinctive objects (closed- boundary regions, edges, contours, line intersections, corners, etc.) are manually or, preferably, automatically detected.
- these features may be represented by their point representatives (centers of gravity, line endings, distinctive points), which are called control points.
- a feature matching step in which the correspondence between the features detected in the live image and those detected in the pre-recorded image is established. Various feature descriptors and similarity measures along with spatial relationships among the features are used for that purpose.
- a transform model estimation step in which the type and parameters of the so-called mapping functions, aligning the live image with the pre-recorded image, are estimated. The parameters of the mapping functions are computed by means of the established feature correspondence.
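The transform model estimation sub-step can be illustrated with a least-squares fit of an affine mapping to the matched control points. This is a generic sketch; the patent does not prescribe a particular transform model:

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Estimate the 2-D affine mapping that aligns live-image control
    points (src_pts) with pre-recorded-image control points (dst_pts),
    given an established feature correspondence.
    Both inputs: (N, 2) arrays with N >= 3 matched points.
    Returns a 2x3 matrix M with dst ~= M[:, :2] @ [x, y] + M[:, 2].
    """
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    # Design matrix [x y 1]; solve for the six affine parameters.
    A = np.hstack([src, np.ones((len(src), 1))])
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params.T

def apply_affine(M, pts):
    """Map points through the estimated affine transform."""
    pts = np.asarray(pts, float)
    return pts @ M[:, :2].T + M[:, 2]
```

With at least three non-collinear control point pairs, the least-squares solution recovers translation, rotation, scaling and shear in one step.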
- in step S4 of the method according to the invention, local tissue information is acquired as a distinctive feature from a photonic needle.
- in step S5, structures corresponding to the information provided by the photonic needle, for instance a boundary between tissue types, blood vessels, or other structures, are identified in the pre-recorded image within the sphere defined by the overlay accuracy.
- in step S6 the coordinate system of the pre-recorded image relative to the live image is re-calibrated in such a way that the structure detected by the photonic needle is exactly at the tip of the needle in the pre-recorded image (the needle tip being visible in the live image).
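Step S6 can be sketched as a simple coordinate correction. In this illustrative version the re-calibration is a pure translation, which is an assumption for the sake of the example; the patent leaves the correction model open:

```python
import numpy as np

def recalibrate(pre_coords, structure_pos_pre, needle_tip_live):
    """Shift the pre-recorded image's coordinate system so that the
    structure detected by the photonic needle (structure_pos_pre, in
    pre-recorded coordinates) lands exactly on the needle tip visible
    in the live image (needle_tip_live, in live coordinates).

    pre_coords: (N, D) array of pre-recorded coordinates to correct.
    Returns the corrected coordinates.
    """
    offset = (np.asarray(needle_tip_live, float)
              - np.asarray(structure_pos_pre, float))
    # Apply the same correction to every pre-recorded coordinate.
    return np.asarray(pre_coords, float) + offset
```

After this correction, the locally sensed structure and the needle tip coincide, so the overlay is accurate exactly where the intervention is taking place.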
- a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
RU2011153794/14A RU2535605C2 (en) | 2009-05-28 | 2010-05-07 | Recalibration of pre-recorded images during interventions using needle device |
US13/321,189 US9980698B2 (en) | 2009-05-28 | 2010-05-07 | Re-calibration of pre-recorded images during interventions using a needle device |
BRPI1008269A BRPI1008269A2 (en) | 2009-05-28 | 2010-05-07 | international system and computer program |
EP10726240.4A EP2434943B1 (en) | 2009-05-28 | 2010-05-07 | Re-calibration of pre-recorded images during interventions using a needle device |
JP2012512481A JP5658747B2 (en) | 2009-05-28 | 2010-05-07 | Recalibration of recorded images during intervention using a needle device |
CN201080022939.8A CN102448366B (en) | 2009-05-28 | 2010-05-07 | Re-calibration of pre-recorded images during interventions using a needle device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09161321.6 | 2009-05-28 | ||
EP09161321 | 2009-05-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010136922A1 (en) | 2010-12-02 |
Family
ID=42697286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2010/052030 WO2010136922A1 (en) | 2009-05-28 | 2010-05-07 | Re-calibration of pre-recorded images during interventions using a needle device |
Country Status (7)
Country | Link |
---|---|
US (1) | US9980698B2 (en) |
EP (1) | EP2434943B1 (en) |
JP (1) | JP5658747B2 (en) |
CN (1) | CN102448366B (en) |
BR (1) | BRPI1008269A2 (en) |
RU (1) | RU2535605C2 (en) |
WO (1) | WO2010136922A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012187334A (en) * | 2011-03-13 | 2012-10-04 | River Seiko:Kk | Tip structure of endoscope or the like |
JP2012235983A (en) * | 2011-05-13 | 2012-12-06 | Olympus Medical Systems Corp | Medical image display system |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2794226C (en) * | 2012-10-31 | 2020-10-20 | Queen's University At Kingston | Automated intraoperative ultrasound calibration |
US9216010B2 (en) * | 2013-06-26 | 2015-12-22 | General Electric Company | System and method for aligning a biopsy collecting device |
CN105578970A (en) * | 2013-07-26 | 2016-05-11 | 学术发展皇家机构/麦吉尔大学 | Biopsy device and method for obtaining a tomogram of a tissue volume using same |
US20160324584A1 (en) * | 2014-01-02 | 2016-11-10 | Koninklijke Philips N.V. | Ultrasound navigation/tissue characterization combination |
JP6688557B2 (en) | 2014-01-07 | 2020-04-28 | キヤノンメディカルシステムズ株式会社 | X-ray CT system |
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
GB2555012B (en) * | 2015-03-17 | 2020-09-09 | Synaptive Medical Barbados Inc | Method and device for registering surgical images |
US11033254B2 (en) * | 2015-06-26 | 2021-06-15 | Koninklijke Philips N.V. | Image guidance system |
US9934570B2 (en) * | 2015-10-09 | 2018-04-03 | Insightec, Ltd. | Systems and methods for registering images obtained using various imaging modalities and verifying image registration |
EP3393355A1 (en) | 2015-12-21 | 2018-10-31 | Erasmus University Medical Center Rotterdam | Optical probe for measuring a tissue sample |
KR20180066781A (en) * | 2016-12-09 | 2018-06-19 | 삼성전자주식회사 | Method and apparatus for displaying medical image |
EP3579781B1 (en) * | 2017-02-09 | 2020-07-15 | Koninklijke Philips N.V. | Position detection based on tissue discrimination |
DE102017221924B3 (en) * | 2017-12-05 | 2019-05-02 | Siemens Healthcare Gmbh | Method for merging an analysis data set with an image data record, positioning device and computer program |
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
US10594112B1 (en) * | 2019-04-29 | 2020-03-17 | Hua Shang | Intervention photon control method and device |
US11992373B2 (en) | 2019-12-10 | 2024-05-28 | Globus Medical, Inc | Augmented reality headset with varied opacity for navigated robotic surgery |
US12133772B2 (en) | 2019-12-10 | 2024-11-05 | Globus Medical, Inc. | Augmented reality headset for navigated robotic surgery |
US11464581B2 (en) | 2020-01-28 | 2022-10-11 | Globus Medical, Inc. | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
US11607277B2 (en) | 2020-04-29 | 2023-03-21 | Globus Medical, Inc. | Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080095421A1 (en) * | 2006-10-20 | 2008-04-24 | Siemens Corporation Research, Inc. | Registering 2d and 3d data using 3d ultrasound data |
WO2008111070A2 (en) * | 2007-03-12 | 2008-09-18 | David Tolkowsky | Devices and methods for performing medical procedures in tree-like luminal structures |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5207673A (en) * | 1989-06-09 | 1993-05-04 | Premier Laser Systems, Inc. | Fiber optic apparatus for use with medical lasers |
RU2145247C1 (en) * | 1998-04-10 | 2000-02-10 | Жаров Владимир Павлович | Photomatrix therapeutic device for treatment of extended pathologies |
JP2002209870A (en) * | 2001-01-18 | 2002-07-30 | Hitachi Medical Corp | Magnetic resonance imaging instrument |
US20020115931A1 (en) * | 2001-02-21 | 2002-08-22 | Strauss H. William | Localizing intravascular lesions on anatomic images |
WO2004019799A2 (en) | 2002-08-29 | 2004-03-11 | Computerized Medical Systems, Inc. | Methods and systems for localizing of a medical imaging probe and of a biopsy needle |
US20060241450A1 (en) * | 2003-03-17 | 2006-10-26 | Biotelligent Inc. | Ultrasound guided tissue measurement system |
JP4317412B2 (en) * | 2003-09-29 | 2009-08-19 | 株式会社日立製作所 | Image processing method |
JP5345782B2 (en) * | 2005-01-11 | 2013-11-20 | ヴォルケイノウ・コーポレーション | Blood vessel information acquisition device |
JP2008543511A (en) * | 2005-06-24 | 2008-12-04 | ヴォルケイノウ・コーポレーション | Vascular image preparation method |
DE102005045362B4 (en) | 2005-09-22 | 2012-03-22 | Siemens Ag | Device for determining the position of a medical instrument, associated imaging examination device and associated method |
US7874987B2 (en) * | 2005-10-28 | 2011-01-25 | Biosense Webster, Inc. | Targets and methods for ultrasound catheter calibration |
US20070118100A1 (en) | 2005-11-22 | 2007-05-24 | General Electric Company | System and method for improved ablation of tumors |
US20070238997A1 (en) * | 2006-03-29 | 2007-10-11 | Estelle Camus | Ultrasound and fluorescence imaging |
WO2007135609A2 (en) | 2006-05-24 | 2007-11-29 | Koninklijke Philips Electronics, N.V. | Coordinate system registration |
JP5269376B2 (en) | 2007-09-28 | 2013-08-21 | 株式会社東芝 | Image display apparatus and X-ray diagnostic treatment apparatus |
JP5587798B2 (en) * | 2008-03-03 | 2014-09-10 | コーニンクレッカ フィリップス エヌ ヴェ | Image-based X-ray guidance system and biopsy guidance with a light needle |
JP4604101B2 (en) * | 2008-03-26 | 2010-12-22 | 株式会社日立製作所 | Image information creation method, tomographic image information creation method of tomography apparatus, and tomography apparatus |
2010
- 2010-05-07 CN CN201080022939.8A patent/CN102448366B/en active Active
- 2010-05-07 JP JP2012512481A patent/JP5658747B2/en active Active
- 2010-05-07 WO PCT/IB2010/052030 patent/WO2010136922A1/en active Application Filing
- 2010-05-07 BR BRPI1008269A patent/BRPI1008269A2/en not_active IP Right Cessation
- 2010-05-07 RU RU2011153794/14A patent/RU2535605C2/en not_active IP Right Cessation
- 2010-05-07 US US13/321,189 patent/US9980698B2/en active Active
- 2010-05-07 EP EP10726240.4A patent/EP2434943B1/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP2012527936A (en) | 2012-11-12 |
EP2434943B1 (en) | 2013-05-01 |
BRPI1008269A2 (en) | 2019-09-24 |
RU2011153794A (en) | 2013-07-10 |
RU2535605C2 (en) | 2014-12-20 |
US20120059251A1 (en) | 2012-03-08 |
JP5658747B2 (en) | 2015-01-28 |
CN102448366A (en) | 2012-05-09 |
US9980698B2 (en) | 2018-05-29 |
CN102448366B (en) | 2014-06-25 |
EP2434943A1 (en) | 2012-04-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2434943B1 (en) | Re-calibration of pre-recorded images during interventions using a needle device | |
US20220358743A1 (en) | System and method for positional registration of medical image data | |
JP5587798B2 (en) | Image-based X-ray guidance system and biopsy guidance with a light needle | |
EP2440118B1 (en) | Algorithm for photonic needle console | |
JP5701615B2 (en) | Biopsy guidance with electromagnetic tracking and light needle | |
JP6711880B2 (en) | Biopsy probe, biopsy support device | |
US10357317B2 (en) | Handheld scanner for rapid registration in a medical navigation system | |
US20120101372A1 (en) | Diagnosis support apparatus, diagnosis support method, lesioned part detection apparatus, and lesioned part detection method | |
EP1727471A1 (en) | System for guiding a medical instrument in a patient body | |
CN104067313B (en) | Imaging device | |
CN108135563A (en) | Light and shadow guided needle positioning system and method | |
CN107427202B (en) | Device, system and method for illuminating a structure of interest inside a human or animal body | |
WO2015148630A1 (en) | Quantitative tissue property mapping for real time tumor detection and interventional guidance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080022939.8 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10726240 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010726240 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012512481 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13321189 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 9558/CHENP/2011 Country of ref document: IN |
|
ENP | Entry into the national phase |
Ref document number: 2011153794 Country of ref document: RU Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: PI1008269 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: PI1008269 Country of ref document: BR Kind code of ref document: A2 Effective date: 20111123 |