CN113876356A - Projection method for medical imaging, ultrasonic equipment system and storage medium - Google Patents
Projection method for medical imaging, ultrasonic equipment system and storage medium
- Publication number
- CN113876356A (application number CN202111358184.5A / CN202111358184A)
- Authority
- CN
- China
- Prior art keywords
- virtual
- model
- subject
- projection
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—… involving detecting or locating foreign bodies or organic structures
- A61B8/085—… for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—… related to the probe
- A61B8/48—Diagnostic techniques
- A61B8/483—… involving the acquisition of a 3D volume of data
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—… involving processing of medical diagnostic data
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—… of still image data
- G06F16/53—Querying
Abstract
The invention relates to the technical field of ultrasonic scanning, and in particular to a projection method for medical imaging. The method comprises: acquiring a contour map of a subject, and searching a database for a virtual 3D model that matches the subject according to input information of the subject and the contour map, the virtual 3D model containing internal structure information of a virtual subject; matching the virtual 3D model with the subject to establish a corresponding coordinate relationship; and projecting at least part of the internal structure of the virtual 3D model according to the internal structure information and the coordinate relationship between the virtual 3D model and the subject, so as to form a projection result that guides the detection position of a real ultrasonic probe. Compared with traditional ultrasonic scanning, the method is more intuitive and places lower professional demands on the operator, who can determine the scanning range of the ultrasonic probe from the projected image during scanning.
Description
Technical Field
The invention relates to the technical field of ultrasonic scanning, and in particular to a projection method and an ultrasound device system for medical imaging.
Background
Ultrasound examination is a non-surgical diagnostic examination that is painless, non-invasive, and free of ionizing radiation for the examinee. In addition, ultrasound can clearly display various cross-sectional images of internal organs and their surroundings. Because these images are rich in detail and close to the real anatomical structure, ultrasound examination makes it possible to diagnose the cause of disease clearly at an early stage, and its range of application keeps widening, from professional disease diagnosis to everyday health assessment.
In traditional ultrasonic examination, an operator typically holds the ultrasonic probe and scans the part to be scanned, continuously adjusting the probe's scanning track according to the ultrasound image shown on the display screen, with the ultimate goal of ensuring that the part to be scanned is scanned completely. This approach, however, requires the operator to be familiar with the ultrasound images of various parts of the human body and to be able to determine the scanning track from them, which places high demands on the operator's expertise. Moreover, because the scanning track depends only on the operator's subjective selection based on the ultrasound image, some positions often remain unscanned, which is detrimental to an accurate subsequent medical diagnosis of the part to be scanned.
Disclosure of Invention
In view of this, embodiments of the present invention provide a projection method and an ultrasound device system for medical imaging, so as to solve the prior-art problems that ultrasonic examination places high demands on operators and tends to leave the scan incomplete.
According to a first aspect, an embodiment of the present invention provides a projection method for medical imaging, including: acquiring a contour map of a subject, and searching a database for a virtual 3D model that matches the subject according to input information of the subject and the contour map, wherein the virtual 3D model contains internal structure information of the virtual subject; matching the virtual 3D model with the subject to establish a corresponding coordinate relationship; and projecting at least part of the internal structure of the virtual 3D model according to the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject, to form a projection result for guiding the detection position of a real ultrasonic probe.
According to a preferred embodiment of the present invention, the internal structure information includes: position information and/or shape information, wherein the projecting at least a part of the internal structure of the virtual 3D model according to the internal structure information of the virtual 3D model and the coordinate relationship of the virtual 3D model to the subject, comprises: position information and/or shape information of at least a part of an internal structure of the virtual 3D model is projected and displayed according to the internal structure information of the virtual 3D model and the coordinate relationship of the virtual 3D model and the subject.
According to a preferred embodiment of the present invention, the internal structure information includes: position information, which is a spot formed by laser projection.
According to a preferred embodiment of the present invention, the projecting at least a part of the internal structure of the virtual 3D model based on the internal structure information of the virtual 3D model and the coordinate relationship of the virtual 3D model to the subject includes: position information of at least a part of an internal structure of the virtual 3D model is projected to the subject based on the internal structure information of the virtual 3D model and a coordinate relationship of the virtual 3D model and the subject.
According to a preferred embodiment of the invention, the spot comprises a central portion and an edge portion surrounding the central portion, the central portion and the edge portion being of different colors.
According to a preferred embodiment of the present invention, the contour map of the subject is acquired by the camera; if the distance between the projection device and the subject equals a set distance, the spot is in a non-blinking state; if the distance between the projection device and the subject is not the set distance, the spot is in a blinking state.
According to a preferred embodiment of the present invention, the projection result is a projection plane with internal organ information, and the projecting at least a part of the internal structure of the virtual 3D model based on the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject includes: at least a part of the internal structure of the virtual 3D model is projected to form a projection plane with internal organ information according to the internal structure information of the virtual 3D model and the coordinate relation between the virtual 3D model and the detected object.
According to a preferred embodiment of the present invention, the shape information is a surface formed by projection by a projector.
According to a preferred embodiment of the present invention, the virtual 3D model is matched to the subject by means of a projective transformation.
According to a preferred embodiment of the present invention, an organ to be scanned of the subject is input, and it is determined whether the virtual 3D model contains shape information of the organ to be scanned. If so, the shape information of the organ to be scanned in the virtual 3D model is projected onto the subject; if not, the anatomical map of internal organs contained in the virtual 3D model is queried for the position information of the organ to be scanned relative to that anatomical map, and the position information is projected onto the subject.
According to a preferred embodiment of the present invention, the medical imaging is ultrasonic imaging. When a real ultrasonic probe is placed on the subject, the position of the real ultrasonic probe on the subject is acquired, the real ultrasonic probe is virtualized and displayed on the virtual 3D model, and the real ultrasonic probe is guided in real time to move to a target position through the positional relationship between the virtualized ultrasonic probe and the scanned organ on the virtual 3D model.
According to a preferred embodiment of the present invention, the virtualized ultrasound probe is acquired by an ultrasound image obtained by a real ultrasound probe, a sensor and/or a camera located on the real ultrasound probe.
According to a preferred embodiment of the invention, the display of the real ultrasound probe on the virtual 3D model comprises position information and pose information.
According to a preferred embodiment of the present invention, guiding the real ultrasonic probe to move to the target position through its real-time display on the virtual 3D model is performed as follows: a virtualized ultrasonic probe matching the real ultrasonic probe is selected and displayed at the target position on the virtual 3D model; when the real ultrasonic probe is moved, arrival at the target position is judged by the image of the real ultrasonic probe on the projection map coinciding with the virtualized ultrasonic probe at the target position.
According to a preferred embodiment of the present invention, the target positions are plural, a virtualized ultrasound probe corresponding to the target position is provided at each target position, and the virtualized ultrasound probes at the target positions are sequentially displayed according to a scanning sequence.
According to a preferred embodiment of the present invention, the virtualized ultrasonic probes at the target positions are initially in a non-display state, a contour map of the subject is acquired by the camera, and the virtualized ultrasonic probe at the first target position is displayed when the real ultrasonic probe is placed within the field of view of the camera; when the real ultrasonic probe coincides with the virtualized ultrasonic probe at the first target position, the pose of the real ultrasonic probe is adjusted to obtain the required image, whereupon the virtualized ultrasonic probe at the second target position is displayed and the virtualized ultrasonic probe at the first target position is hidden.
According to a preferred embodiment of the present invention, the projection plane is placed on a subject, and the internal organ information on the projection plane matches the internal organ information of the subject.
According to a preferred embodiment of the present invention, matching a virtual 3D model with a subject to establish a corresponding coordinate relationship comprises: and matching the virtual 3D model with the detected object according to the characteristic points to establish a corresponding coordinate relation.
According to a second aspect, an embodiment of the present invention provides an ultrasound device system, including a projection device and an ultrasound device, the projection device including: a memory and a processor communicatively connected to each other, the memory storing computer instructions, and the processor executing the computer instructions so as to perform the projection method described above.
According to a third aspect, an embodiment of the present invention provides a computer-readable storage medium, which stores computer instructions for causing the computer to execute the projection method described above.
According to the ultrasound device system provided by the embodiment of the invention, the ultrasound device computes and processes a projection image of the part to be scanned, and the projection image is projected onto the part to be scanned of the target object through the projection device. The operator can control the ultrasonic probe to scan the part to be scanned according to the projection image; compared with traditional ultrasonic scanning, this is more intuitive and places lower professional demands on the operator. During scanning, the operator can determine the scanning range of the ultrasonic probe from the projection image, which ensures comprehensive scanning of the part to be scanned and lays a more accurate foundation for subsequent medical diagnosis.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic diagram of the components of an ultrasound scanning system according to an embodiment of the invention;
FIG. 2 is a schematic diagram of an RGB-D camera;
FIG. 3 is a flow chart of a projection method according to an embodiment of the invention;
FIG. 4 is a schematic structural diagram of the ultrasound hardware;
FIG. 5 is a schematic view of the image processing apparatus on one side of the human body;
FIG. 6 is a schematic view of projecting an image onto a human body.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
According to a first aspect, there is provided an ultrasound scanning method embodiment, it being noted that the steps illustrated in the flowchart of the figure may be performed in a computer system, such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
Referring to fig. 1, the present invention discloses an ultrasound scanning system, which includes: an ultrasound device 61 and an image processing apparatus 62. The image processing device 62 is connected to the ultrasound apparatus 61, and the image processing device 62 includes an image acquisition device 621 and a projection device 622.
It should be noted that the image capturing device 621 and the projecting device 622 may be integrated in the image processing device 62, or they may be two independent devices, collectively referred to as the image processing apparatus. The specific arrangement of the image capturing device 621 and the projecting device 622 is not limited and may be set according to the actual situation. In the following description of this embodiment, the image capturing device 621 and the projecting device 622 are taken to be two independent devices by way of example.
Specifically, the image acquisition device 621 is configured to acquire a real-time image of the target object and send it to the ultrasound device 61, and the ultrasound device 61 forms a projection image corresponding to the part to be scanned of the target object based on that real-time image and sends it to the projection device 622. The image capture device 621 may be an RGB-D (RGB-Depth) camera or another type of camera. As shown in fig. 2, the RGB-D camera consists of a color camera 01 and a depth camera 02, and is configured to collect RGB (red, green, blue) information and depth information of the target object to form a real-time image, which it sends to the ultrasound device 61. The image capturing device 621 may also be the camera of a mobile phone or tablet computer: in use, the camera of the phone or tablet is aimed at the target object to obtain a contour map of the target object, and the contour map data is transmitted to the ultrasound device 61.
After receiving the projection image sent by the ultrasound device 61, the projection device 622 projects it onto the part to be scanned of the target object, so that the ultrasound device 61 can perform an ultrasound scan of that part guided by the projected image. The projection device 622 may be a projector communicatively connected to the ultrasound device 61; the projector projects the projection image onto the part to be scanned under the control of the ultrasound device 61. The projection device 622 may also be a laser emitter that projects laser light onto the target object to indicate the region to be scanned. In other embodiments, the projection may be a line, 3D, or 4D (i.e., a dynamic volume). The projected image provides coarse positioning of the scanned part: once the ultrasonic probe is placed at the projected position, the exact part to be examined is found by moving the probe.
When a camera and a projector are used, they can track the part to be scanned in real time through angle rotation, focusing, and similar means as the required part changes, acquiring in real time the center and the size (the height and width of the circumscribed rectangular frame) of the required part. The ultrasound device 61 calculates the optimal field of view, converts the deviation (horizontal and vertical) of the target center point from the center of the optimal field of view into a rotation angle for the RGB-D camera, and controls the camera's focus according to the ratio of the target size to the current size, thereby ensuring the sharpness and accuracy of the acquired image.
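By way of illustration only, this control loop might be sketched as follows in Python, assuming a pinhole-camera model; the function name, the focal-length parameter, and the 60% fill target are our assumptions and do not come from the patent.

```python
import math

def compute_camera_adjustment(target_cx, target_cy, target_w, target_h,
                              view_cx, view_cy, view_w, view_h, focal_px):
    """Convert the pixel offset of the tracked scan region from the centre of
    the optimal field of view into pan/tilt angles, and its size ratio into a
    zoom factor (pinhole-camera approximation; all parameters illustrative)."""
    pan_deg = math.degrees(math.atan2(target_cx - view_cx, focal_px))
    tilt_deg = math.degrees(math.atan2(target_cy - view_cy, focal_px))
    # Zoom so the circumscribed rectangle fills about 60% of the view.
    zoom = 0.6 * min(view_w / target_w, view_h / target_h)
    return pan_deg, tilt_deg, zoom
```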
After the projection device 622 projects the projection image onto the part to be scanned, the operator may control the movement of the ultrasonic probe based on that projected image so as to perform the ultrasound scan. The ultrasound device 61 may also acquire the current position of the ultrasonic probe, take that position as a starting point and the projected image on the part to be scanned as an end point, and determine a movement trajectory for the probe; after the trajectory is determined, the operator may move the probe manually, or the probe may be moved by a moving mechanism, among other options. The manner of moving the ultrasonic probe is not limited here and may be set according to the actual situation.
In some optional embodiments of this embodiment, the acquisition of the portion to be scanned of the target object may be input by an operator through a human-computer interface provided by the ultrasound device 61; or, the ultrasound scanning system further includes an external input device 63 connected to the ultrasound device 61, and the operator inputs the portion to be scanned of the target object through the external input device 63 and sends the acquired portion to be scanned to the ultrasound device 61. The external input device 63 may be an electronic device having an input function, such as a touch screen, a computer, or the like. Of course, the external input device 63 may also be another type of device, which is not limited herein, and only needs to ensure that the ultrasound device 61 can acquire the portion to be scanned of the target object.
In other optional embodiments of this embodiment, the external input device 63 may further obtain personal information of the target object, for example, name, age, gender, and the like. The external input device 63 transmits personal information to the ultrasound device 61 after acquiring the personal information of the target object, so that the ultrasound device 61 can know the personal information of the target object. For example, the ultrasound device 61 may also combine the personal information to form an accurate projection image in forming a projection image based on a real-time image of the target object. The details will be described later.
The embodiment of the invention also provides another ultrasonic scanning system which comprises an ultrasonic device, an image processing device and a projection image generating device. Wherein, the image processing device is connected with the projection image generating device.
The image processing device includes an image acquisition device and a projection device, whose details are described above and are not repeated here. During operation of this ultrasound scanning system, the image acquisition device acquires a real-time image of the target object and sends it to the projection image generating device; the projection image generating device processes the real-time image to form a projection image and sends it to the projection device, and the projection device projects the projection image onto the part to be scanned of the target object. Once the projection image is projected on the part to be scanned, the ultrasound equipment can perform an ultrasound scan of that part using the projection image as a guide.
In the ultrasonic scanning system provided by the embodiment of the invention, the image acquisition device acquires a real-time image of the target object, the ultrasonic equipment performs calculation and processing to obtain a projection image of the part to be scanned, and the projection image is projected to the part to be scanned of the target object through the projection device. Compared with the traditional ultrasonic scanning, the scanning method has the advantages that an operator can control the ultrasonic probe to scan the part to be scanned according to the projection image, the visibility is higher and the requirement on the professional performance of the operator is low compared with the traditional ultrasonic scanning, in the scanning process, the operator can determine the scanning range of the ultrasonic probe according to the projection image, the comprehensive scanning of the part to be scanned is guaranteed, and a more accurate foundation is laid for the subsequent medical diagnosis.
Referring to fig. 3, in the present embodiment, there is provided a projection method for ultrasound guidance, including:
Step S101: acquire a contour map of the subject, and search a database for a virtual 3D model that matches the subject according to the input information of the subject and the contour map, wherein the virtual 3D model contains internal structure information of the virtual subject, the internal structure information including position information and/or shape information. The internal structure information may be structural information of internal organs and the tissues around them. A real-time image of the subject (the target object) is acquired by the camera, and the contour map of the subject is obtained from the real-time image. In another embodiment, since the positions of the subject, the camera, and the projection device are fixed, a single image of the subject may be acquired by the camera and the contour map obtained from that image. In a specific application, a mobile phone, tablet computer, or other device with a camera is placed on one side of the subject, and an image of the subject is acquired through the camera.
Here, the real-time image may be a color RGB image shot in real time by an RGB-D camera, a grayscale image shot in real time by a camera, or the like; an RGB image may be converted into a grayscale image by image-processing techniques. The real-time image may cover the whole target object or only part of it. When only part is covered, that part should, for accuracy, correspond to the part to be scanned: for example, when the part to be scanned is the chest, the real-time image includes at least the upper body; when the part to be scanned is the carotid artery, the real-time image includes at least the shoulders and the region above them. The part to be scanned may include one or more regions of the target object, such as the heart, lungs, or liver.
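To make the contour-map step concrete, the following is a minimal sketch of extracting the subject's outline from a single frame with OpenCV. The Otsu-threshold segmentation and the function name are illustrative assumptions; an actual system might segment using the depth channel instead.

```python
import cv2

def extract_subject_contour(image_bgr):
    """Return the largest external outline in the frame as the subject's
    contour map. Otsu thresholding is a stand-in for a real segmentation,
    which might instead use the RGB-D camera's depth channel."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blurred, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None
```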
The information of the subject includes, but is not limited to, sex, age, height, weight, and body fat. The ultrasound device 61 can read the target object's personal information through the external input device 63, for example from a two-dimensional code, a medical record card, a social security card, or visit records from the attending physician. A virtual 3D model matching the subject can then be selected from the database based on the personal information and the contour map. The virtual 3D model is a virtual human 3D model and includes a number of organ models, such as the heart, the bladder, or other organs. Ideally the 3D model stores both the position information and the shape information of each organ, so that the desired organ can be found quickly, conveniently, and intuitively; the drawback is that this occupies a large amount of storage space. To save storage space, the 3D model may store only the position information of the organs.
Step S102: match the virtual 3D model with the subject according to feature points to establish a corresponding coordinate relationship. Specifically, feature points may be extracted from the contour map of the subject and then matched with feature points on the pre-established virtual 3D model, so that corresponding point locations on the subject are brought into correspondence with point locations on the virtual 3D model, that is, a coordinate relationship is established between them.
A feature point may be any point on the contour map of the subject; for any such point there exists a point on the virtual 3D model occupying the same relative position, and both of these may be called feature points. At least three feature points are selected on the contour map of the subject and, correspondingly, at least three feature points with the same relative positions are selected on the virtual 3D model; the virtual 3D model is then matched with the contour map according to these feature points to establish the coordinate relationship.
Specifically, any point selected on the contour map of the subject has a definite coordinate position in the contour image. If the camera capturing the subject's image is an ordinary two-dimensional camera, that coordinate position is a two-dimensional coordinate; if the camera is a three-dimensional camera, it is a three-dimensional coordinate. Likewise, the corresponding point on the virtual 3D model has a definite two- or three-dimensional coordinate position on the model. By selecting at least three feature points with the same relative positions on the contour map and on the virtual 3D model, the coordinate positions are calibrated, and the coordinates are computed and converted according to the calibration result, yielding the coordinate relationship between the virtual 3D model and the contour map of the subject.
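For three-dimensional coordinates, one standard way to realize this calibration is a rigid fit of the corresponding feature points (the Kabsch algorithm). The sketch below is illustrative, assuming at least three non-collinear point pairs; the patent does not prescribe a particular fitting method.

```python
import numpy as np

def estimate_model_to_subject_transform(model_pts, subject_pts):
    """Rigid transform (R, t) mapping feature points of the virtual 3D model
    onto the matching feature points measured on the subject (Kabsch
    algorithm; requires at least three non-collinear correspondences)."""
    P = np.asarray(model_pts, dtype=float)    # N x 3, model coordinates
    Q = np.asarray(subject_pts, dtype=float)  # N x 3, camera coordinates
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t  # subject_point ≈ R @ model_point + t
```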
Further, since the contour map of the subject is acquired by the camera in real time, the coordinate relationship between the virtual 3D model and the contour map is also the coordinate relationship between the virtual 3D model and the subject. This coordinate relationship can be used to assist the ultrasonic probe in determining position: for example, when the ultrasonic probe is detecting on the subject, its position also corresponds to a position on the virtual 3D model, which helps locate the probe.
Step S103: project at least part of the internal structure of the virtual 3D model according to the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject, forming a projection result for guiding the detection position of the real ultrasonic probe. The projection result may be the whole virtual 3D model as covered by the camera view, or the virtual 3D model of a particular organ or local region within that view; it may further include position information and shape information corresponding to the virtual 3D model, and all of these projection results can be used to guide the user to the detection position of the ultrasonic probe.
According to this embodiment of the invention, a coordinate relationship between the virtual 3D model and the subject is established by selecting points on the contour map of the subject, identifying the points on the virtual 3D model whose relative positions match them as feature points, and performing coordinate conversion based on the coordinate positions of these feature points. The virtual 3D model is then used to assist the user in determining the detection position of the ultrasonic probe: the internal structure information of the 3D model matched to the subject is projected as feedback to the user. This makes observation convenient for the user, lowers the professional requirements that existing ultrasound equipment places on operators, and improves the efficiency and accuracy of locating the position to be examined with ultrasound equipment.
In an embodiment of the present invention, the step S103 may include projecting and displaying position information and/or shape information of at least a part of an internal structure of the virtual 3D model according to the internal structure information of the virtual 3D model and a coordinate relationship of the virtual 3D model to the subject. The projection may be projected onto the subject, or may be projected onto other locations, such as a wall or other screen, without limitation.
In another alternative embodiment, step S103 may include projecting onto the subject, according to the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject, position information of at least part of the internal structure of the virtual 3D model, the position information being a spot formed by laser projection.
Specifically, the position information may be a spot formed by laser projection. In practical applications, the positional relationships between the projection device (here, a laser projector), the camera, and the subject can be calculated, and the virtual 3D model and the subject already have a corresponding coordinate relationship; the projection device can therefore project the position of the organ to be scanned onto the subject, in this case as a single laser spot. Placing the real ultrasonic probe at the laser spot yields a cross-sectional view of the organ to be scanned. If this is not the required standard scanning section, the real ultrasonic probe is moved, rotated, or tilted, either empirically or under artificial-intelligence guidance, until the standard scanning section is obtained.
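A minimal sketch of the geometry involved follows, assuming the laser is calibrated to the camera frame and reusing a rigid transform (R, t) such as the one estimated above; the intrinsic matrix K and all names are illustrative assumptions.

```python
import numpy as np

def organ_center_to_laser_target(organ_center_model, R, t, K):
    """Map an organ centre stored in the virtual 3D model into camera space
    with the calibrated (R, t), then into the pixel at which to aim the
    laser spot via camera intrinsics K (assumes the laser is calibrated to
    the camera frame)."""
    p_cam = R @ np.asarray(organ_center_model, dtype=float) + t
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]  # pixel coordinates (u, v) of the spot
```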
In yet another alternative embodiment, the step S103 may include projecting at least a part of the internal structure of the virtual 3D model to form a projection plane with the internal organ information according to the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject.
In other embodiments, the shape information may be a surface formed by projection from a projector. A surface carries more organ information than a point: when guiding the real ultrasonic probe to a standard scanning section, the required section can be found quickly through the projected surface. To improve scanning efficiency, information for several standard scanning sections can be set on the surface projected by the projector, with different standard scanning sections corresponding to different scanning purposes. To prevent the standard scanning sections from interfering with each other, only the first standard scanning section is displayed at first. The real ultrasonic probe is moved to the first standard scanning section to obtain a cross-sectional view; if it is not the required one, the probe is moved, rotated, or tilted, empirically or under artificial-intelligence guidance, until the standard scanning section is obtained. Once the required cross-sectional view is obtained, a message is sent to the ultrasound equipment so that the first standard scanning section is hidden and the second is displayed. The real ultrasonic probe is then moved to the second standard scanning section for scanning, and so on until scanning is finished.
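The show-one, hide-previous behaviour described above amounts to a simple sequencer. The following sketch captures it, with `projector` standing in for whatever display interface the system exposes; both names are assumptions for illustration.

```python
class ScanPlaneSequencer:
    """Show one standard scanning section at a time, advancing when the
    ultrasound device reports that the required cross-section was obtained.
    `projector` stands for whatever display interface the system exposes."""

    def __init__(self, projector, planes):
        self.projector = projector
        self.planes = list(planes)
        self.index = 0
        if self.planes:
            self.projector.show(self.planes[0])

    def on_section_acquired(self):
        """Hide the current section, show the next; True when finished."""
        self.projector.hide(self.planes[self.index])
        self.index += 1
        if self.index < len(self.planes):
            self.projector.show(self.planes[self.index])
        return self.index >= len(self.planes)
```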
The virtual 3D model is matched to the subject by means of a projective transformation. Because the virtual 3D model and the real subject may not coincide exactly in structure, the 3D model closest to the subject is first selected based on the contour map and the subject's information, and a projective transformation then brings the virtual 3D model closer to the subject's real structure.
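In the planar case, such a projective transformation can be fitted as a homography between corresponding outline points. The sketch below, using OpenCV's RANSAC homography estimator, is one assumed way the fit might be done.

```python
import cv2
import numpy as np

def fit_model_outline_to_subject(model_pts_2d, subject_pts_2d):
    """Fit a projective transformation (homography) that warps outline points
    of the selected virtual 3D model onto the corresponding points of the
    real subject; needs at least four correspondences."""
    src = np.asarray(model_pts_2d, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(subject_pts_2d, dtype=np.float32).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H  # apply with cv2.perspectiveTransform or cv2.warpPerspective
```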
When the real ultrasonic probe is placed on the subject, its position on the subject is acquired, the probe is virtualized and displayed on the virtual 3D model, and the real probe is guided to move to the target position through this real-time display on the virtual 3D model.
Preferably, the display of the real ultrasonic probe on the virtual 3D model includes position information and pose information. Guiding the real probe to the target position through this real-time display works as follows: a virtualized ultrasonic probe, selected from a database to match the real ultrasonic probe, is displayed at the target position on the virtual 3D model. When the real probe is moved, arrival at the target position is judged by the image of the real probe on the projection map coinciding with the virtualized probe at the target position. This approach lets even an inexperienced operator move the real ultrasonic probe to the target position quickly and accurately. The virtual 3D model and the virtualized probe can be shown on a mobile phone screen, a tablet screen, or another display. To make the target position even more intuitive and quicker to find, the projection of the ultrasonic probe at the target position may also be shown on the projection map.
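One concrete way to test the coincidence described above is an intersection-over-union check on the two probe silhouettes, as in the following sketch; the 0.8 threshold is an assumed value.

```python
import numpy as np

def probes_coincide(real_mask, virtual_mask, iou_threshold=0.8):
    """Judge that the real probe has reached the target when its silhouette
    in the projection view overlaps the virtualized probe sufficiently
    (intersection-over-union on boolean masks; threshold illustrative)."""
    intersection = np.logical_and(real_mask, virtual_mask).sum()
    union = np.logical_or(real_mask, virtual_mask).sum()
    return union > 0 and intersection / union >= iou_threshold
```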
There may be a plurality of target positions, each provided with its own virtualized ultrasonic probe, and the virtualized probes at the target positions are displayed in sequence according to the scanning order.
Initially, the virtualized probes at the target positions are in a non-display state, and a contour map of the subject is acquired by the camera. When the real ultrasonic probe is placed within the field of view of the camera, the virtualized probe at the first target position is displayed. When the real probe coincides with the virtualized probe at the first target position, its pose is adjusted to obtain the required image; a message is then sent to the ultrasound equipment so that the virtualized probe at the second target position is displayed and the one at the first target position is hidden. This continues in sequence until scanning is finished.
To facilitate distinguishing the order of the virtualized ultrasound probes at the respective target locations, different color designations are used for the virtualized ultrasound probes at different target locations. For example, a virtualized ultrasound probe at a first target location is shown in red, a virtualized ultrasound probe at a second target location is shown in orange, a virtualized ultrasound probe at a third target location is shown in yellow, an ultrasound probe at a fourth target location is shown in green, and so on. The operator can easily recognize that this is the virtualized ultrasound probe at the fourth target location by the color of the virtualized ultrasound probe.
The above uses color as the prompt; in other embodiments, sound may be used instead. For example, "at first target location" is played while the virtualized ultrasound probe at the first target position is displayed, and "at second target location" is played while the one at the second target position is displayed. Prompts other than sound may also be used: for example, the text "first" (or a similar expression) is added next to the virtualized probe at the first target position, the text "second" next to the one at the second target position, and so on.
The above-described projection method for guidance is used for ultrasound scanning, but can also be used in other medical fields.
As shown in fig. 4, the apparatus may include: at least one processor 51, such as a CPU (Central Processing Unit), at least one communication interface 53, a memory 54, and at least one communication bus 52, where the communication bus 52 is used to implement connection and communication among these components. The communication interface 53 may include a display and a keyboard, and optionally may also include a standard wired interface and a standard wireless interface. The memory 54 may be a high-speed volatile random-access memory (RAM) or a non-volatile memory, such as at least one disk memory; alternatively, it may be at least one storage device located remotely from the processor 51. The memory 54 stores an application program, and the processor 51 calls the program code stored in the memory 54 to perform any of the above-mentioned method steps.
The communication bus 52 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The communication bus 52 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 4, but this does not indicate only one bus or one type of bus.
The memory 54 may include volatile memory, such as random-access memory (RAM); it may also include non-volatile memory, such as flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 54 may also comprise a combination of the above types of memory.
The processor 51 may be a Central Processing Unit (CPU), a Network Processor (NP), or a combination of a CPU and an NP.
The processor 51 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
Optionally, the memory 54 is also used to store program instructions. The processor 51 may invoke program instructions to implement the ultrasound scanning method as shown in the embodiments of the first aspect of the present application.
According to another aspect, embodiments of the present invention provide a non-transitory computer storage medium storing computer-executable instructions that can perform the ultrasound scanning method of any of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random-access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like; the storage medium may also comprise a combination of the above types of memory.
Fig. 5 and 6 show an embodiment of the present invention in which a camera acquires a contour map of the subject and a phantom matching the contour map is selected from the database based on the contour map and the personal information. With suitable stretching and/or scaling, the phantom is projected onto the region of the human body to be scanned. For example, if the part to be scanned is a kidney, the phantom is projected onto the human body; in other embodiments it may be projected onto a wall or elsewhere. According to the projected kidney position, the real ultrasonic probe is placed at the kidney for scanning. To obtain more accurate positioning, the standard scanning sections to be scanned are generated automatically from the database according to the items to be examined. To avoid impairing observability by displaying all scanning sections at once, the sections may be displayed in sequence.
According to a first aspect, embodiments of the present invention provide a projection method for medical imaging, comprising:
acquiring a contour map of a subject, and searching a database for a virtual 3D model that matches the subject according to the input information of the subject and the contour map, wherein the virtual 3D model contains internal structure information of the virtual subject, the internal structure information including position information and/or shape information. To display the internal structure of the subject clearly, the internal structure information may also include visual information reflecting the organs, such as color and texture.
The virtual 3D model is matched with the subject according to the feature points to establish a corresponding coordinate relationship, and position information and/or shape information of at least part of the internal structure of the virtual 3D model is projected and displayed according to the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject.
According to a preferred embodiment of the invention, the position information is a spot formed by laser projection.
According to a preferred embodiment of the invention, the spot comprises a central portion and an edge portion surrounding it, the central portion and the edge portion having different colors. In practical applications, for example, the central portion may be set to red and the edge portion to blue; the central portion indicates the center of the organ and the edge portion its periphery. The ultrasonic probe can accordingly be placed at the central portion or at the edge portion, depending on which part of the organ is to be examined.
According to a preferred embodiment of the present invention, the contour map of the subject is acquired by the camera; if the distance between the projection device and the subject equals a set distance, the spot is in a non-blinking state, and if the distance is not the set distance, the spot is in a blinking state. Different set distances can be preset for different subjects to obtain better positioning accuracy. When the projection is a surface, the same rule applies: the projection surface is in a non-blinking state when the projection device is at the set distance from the subject, and in a blinking state otherwise. Because the projected image is in its optimal state at the set distance, this helps the user find the best placement of the projection device quickly.
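The distance-dependent blinking reduces to a simple comparison, sketched below with an assumed tolerance band around the set distance.

```python
def spot_state(measured_distance_mm, set_distance_mm, tolerance_mm=20):
    """Blink the spot until the projection device sits at the preset distance
    from the subject; the tolerance band is an assumed parameter."""
    at_set_distance = abs(measured_distance_mm - set_distance_mm) <= tolerance_mm
    return "steady" if at_set_distance else "blinking"
```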
According to a preferred embodiment of the present invention, the shape information is a surface formed by projection by a projector.
According to a preferred embodiment of the present invention, the virtual 3D model is matched to the subject by means of a projective transformation.
According to a preferred embodiment of the present invention, an organ to be scanned of the subject is input, and it is determined whether the virtual 3D model contains shape information of the organ to be scanned. If so, the shape information of the organ to be scanned in the virtual 3D model is projected onto the subject; if not, the anatomical map of internal organs contained in the virtual 3D model is queried for the position information of the organ to be scanned relative to that anatomical map, and the position information is projected onto the subject.
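The shape-or-position fallback described above might look like the following sketch, in which `model.shape_info`, `model.anatomical_map`, and the projector methods are hypothetical interfaces introduced only for illustration.

```python
def project_scan_target(model, organ_name, projector):
    """Project the organ's shape if the virtual 3D model stores it; otherwise
    fall back to the organ's position relative to the model's internal-organ
    anatomical map. `model` and `projector` are assumed interfaces."""
    if organ_name in model.shape_info:
        projector.project_shape(model.shape_info[organ_name])
    else:
        # Query the anatomical map for the organ's position and project it.
        position = model.anatomical_map.locate(organ_name)
        projector.project_point(position)
```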
According to a preferred embodiment of the present invention, the medical imaging is ultrasonic imaging. When a real ultrasonic probe is placed on the subject, its position on the subject is acquired, the probe is virtualized and displayed on the virtual 3D model, and the real probe is guided in real time to move to a target position through the positional relationship between the virtualized probe and the scanned organ on the virtual 3D model. The display may appear on an interface such as a mobile phone or tablet screen.
According to a preferred embodiment of the present invention, the virtualized ultrasonic probe is derived from the ultrasound image obtained by the real ultrasonic probe, and/or from a sensor and/or a camera located on the real ultrasonic probe. For example, the position of the real probe on the subject can be deduced from its ultrasound image, and a virtualized probe generated on the virtual 3D model accordingly. In other embodiments, the position of the real probe on the subject may be acquired by a sensor or a camera on the probe. Preferably, the position of the real ultrasonic probe on the subject is obtained by fusing the ultrasound image, the sensor, and the camera.
According to a preferred embodiment of the invention, the display of the real ultrasound probe on the virtual 3D model comprises position information and pose information.
According to a preferred embodiment of the present invention, guiding the real ultrasonic probe to the target position through its real-time display on the virtual 3D model is performed as follows: a virtualized ultrasonic probe matching the real ultrasonic probe is selected and displayed at the target position on the virtual 3D model; when the real probe is moved, arrival at the target position is judged by the image of the real probe on the projection map coinciding with the virtualized probe at the target position. Whether the probe has accurately reached the target organ can be judged in two ways: one is whether the virtual probe corresponding to the real probe has moved onto the specific organ in the 3D model; the other is whether the ultrasound image obtained in real time corresponds to an ultrasound image of that organ.
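The two-way judgment might be combined as in the following sketch, where normalized cross-correlation stands in for whatever image-matching method the system actually uses; thresholds and names are assumptions.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized grayscale images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def reached_target(virtual_probe_pos, organ_pos, live_image, organ_template,
                   pos_tol_mm=10.0, score_tol=0.7):
    """Combine both checks: the virtual probe sits on the target organ in the
    3D model, and the live ultrasound image resembles a reference image of
    that organ (NCC is a stand-in for a trained image classifier)."""
    close = np.linalg.norm(np.asarray(virtual_probe_pos, dtype=float)
                           - np.asarray(organ_pos, dtype=float)) <= pos_tol_mm
    return close and ncc(live_image, organ_template) >= score_tol
```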
According to a preferred embodiment of the present invention, the target positions are plural, a virtualized ultrasound probe corresponding to the target position is provided at each target position, and the virtualized ultrasound probes at the target positions are sequentially displayed according to a scanning sequence.
According to a preferred embodiment of the present invention, the virtualized ultrasonic probes at the target positions are initially in a non-display state, a contour map of the subject is acquired by the camera, and the virtualized ultrasonic probe at the first target position is displayed when the real ultrasonic probe is placed within the field of view of the camera; when the real ultrasonic probe coincides with the virtualized ultrasonic probe at the first target position, the pose of the real ultrasonic probe is adjusted to obtain the required image, whereupon the virtualized ultrasonic probe at the second target position is displayed and the one at the first target position is hidden.
According to another aspect, an embodiment of the present invention provides a computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions for causing the computer to execute the projection method described above.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.
Claims (20)
1. A projection method for medical imaging, comprising:
acquiring a contour map of a subject, and searching a virtual 3D model matched with the subject in a database according to input information of the subject and the contour map, wherein the virtual 3D model contains internal structure information of the virtual subject;
matching the virtual 3D model with the subject to establish a corresponding coordinate relationship;
and projecting at least part of the internal structure of the virtual 3D model according to the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject, to form a projection result for guiding the detection position of a real ultrasonic probe.
2. The projection method of claim 1, wherein the internal structure information comprises: position information and/or shape information, wherein the projecting at least a part of the internal structure of the virtual 3D model according to the internal structure information of the virtual 3D model and the coordinate relationship of the virtual 3D model to the subject, comprises:
position information and/or shape information of at least a part of an internal structure of the virtual 3D model is projected and displayed according to the internal structure information of the virtual 3D model and the coordinate relationship of the virtual 3D model and the subject.
3. The projection method according to claim 1 or 2, wherein the internal structure information includes: position information, which is a spot formed by laser projection.
4. The projection method according to claim 3, wherein projecting at least a part of the internal structure of the virtual 3D model according to the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject comprises:
projecting position information of at least a part of the internal structure of the virtual 3D model onto the subject based on the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject.
5. The projection method according to claim 3 or 4, wherein the spot comprises a central portion and an edge portion surrounding the central portion, the central portion and the edge portion being of different colors.
6. The projection method according to claim 3, 4 or 5, wherein the contour map of the subject is acquired by a camera; if the distance between the projection device and the subject is a set distance, the spot is in a non-flickering state, and if the distance between the projection device and the subject is not the set distance, the spot is in a flickering state.
7. The projection method according to claim 1, wherein the projection result is a projection plane carrying internal organ information, and projecting at least a part of the internal structure of the virtual 3D model based on the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject comprises:
projecting at least a part of the internal structure of the virtual 3D model to form a projection plane carrying the internal organ information, according to the internal structure information of the virtual 3D model and the coordinate relationship between the virtual 3D model and the subject.
8. The projection method according to claim 2, wherein the shape information is a surface formed by projection from a projector.
9. The projection method according to claim 1, 2, 4 or 7, wherein the virtual 3D model is matched with the subject by means of a projective transformation.
10. The projection method according to claim 2, 4 or 7, wherein an organ to be scanned of the subject is input, and it is determined whether the virtual 3D model contains shape information of the organ to be scanned; if so, the shape information of the organ to be scanned in the virtual 3D model is projected onto the subject; if not, an anatomical map of the internal organs contained in the virtual 3D model and the position information of the organ to be scanned relative to that anatomical map are queried, and the position information is projected onto the subject.
11. The projection method according to claim 1, 2, 4 or 7, wherein the medical imaging is ultrasonic imaging; when the real ultrasonic probe is placed on the subject, the position of the real ultrasonic probe on the subject is acquired, the real ultrasonic probe is virtualized and displayed on the virtual 3D model, and the real ultrasonic probe is guided in real time to move to a target position through the positional relationship, displayed on the virtual 3D model, between the virtualized ultrasonic probe and the scanned organ.
12. The projection method according to claim 11, wherein the virtualized ultrasonic probe is obtained by means of an ultrasound image acquired by the real ultrasonic probe, a sensor located on the real ultrasonic probe, and/or a camera.
13. The projection method according to claim 11, wherein the display of the real ultrasonic probe on the virtual 3D model comprises position information and pose information.
14. The projection method according to claim 11, wherein guiding the real ultrasonic probe to move to the target position through its real-time display on the virtual 3D model is performed as follows: a virtualized ultrasonic probe matched with the real ultrasonic probe is selected and displayed at the target position on the virtual 3D model; as the real ultrasonic probe is moved, arrival at the target position is judged by whether the image of the real ultrasonic probe on the projection coincides with the virtualized ultrasonic probe displayed at the target position.
15. The projection method according to claim 14, wherein there are a plurality of target positions, a virtualized ultrasonic probe corresponding to each target position is provided, and the virtualized ultrasonic probes at the target positions are displayed sequentially according to the scanning order.
16. The projection method according to claim 15, wherein the virtualized ultrasonic probes at the target positions are initially in a non-display state; a contour of the subject is acquired by a camera, and the virtualized ultrasonic probe at the first target position is displayed once the real ultrasonic probe is placed within the field of view of the camera; when the real ultrasonic probe coincides with the virtualized ultrasonic probe at the first target position, the pose of the real ultrasonic probe is adjusted to obtain the required image, whereupon the virtualized ultrasonic probe at the second target position is displayed and the one at the first target position is hidden.
17. The projection method according to claim 7, wherein the projection plane is placed on the subject, and the internal organ information on the projection plane matches the internal organ information of the subject.
18. The projection method according to claim 1, 2, 4 or 7, wherein matching the virtual 3D model with the subject to establish a corresponding coordinate relationship comprises:
matching the virtual 3D model with the subject according to feature points to establish the corresponding coordinate relationship (an illustrative sketch follows the claims).
19. An ultrasound device system, comprising a projection device and an ultrasound device, the projection device comprising:
a memory and a processor communicatively coupled to each other, wherein the memory stores computer instructions, and the processor executes the computer instructions to perform the projection method of any one of claims 1-18.
20. A computer-readable storage medium storing computer instructions for causing a computer to perform the projection method of any one of claims 1-18.
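As a hedged illustration of the projective-transformation matching recited in claims 9 and 18: given at least four feature-point correspondences between the virtual 3D model's rendered view and the camera image of the subject, a projective transformation (homography) can be estimated with the standard direct linear transform. The function names below are hypothetical, and the sketch is one possible realization under those assumptions, not the claimed method.

```python
import numpy as np

def estimate_homography(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Estimate the 3x3 projective transformation H with dst ~ H @ src
    from n >= 4 feature-point correspondences (each array is n x 2)."""
    assert src.shape == dst.shape and src.shape[0] >= 4
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        rows.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        rows.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    A = np.asarray(rows)
    # H is the null vector of A: the right singular vector associated
    # with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize (assumes a non-degenerate configuration)

def map_point(H: np.ndarray, x: float, y: float):
    """Map a model-view point into subject (camera) coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

In practice the coordinate relationship of claim 1 could then be obtained by composing such transformations for the camera and the projector; the sketch covers only the 2D estimation step.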
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111203632 | 2021-10-15 | ||
CN202111202491 | 2021-10-15 | ||
CN2021112024914 | 2021-10-15 | ||
CN202111202477 | 2021-10-15 | ||
CN2021112036324 | 2021-10-15 | ||
CN2021112024774 | 2021-10-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113876356A (en) | 2022-01-04
Family
ID=79017706
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111358184.5A (Pending, published as CN113876356A) | 2021-10-15 | 2021-11-16 | Projection method for medical imaging, ultrasonic equipment system and storage medium
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113876356A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150173707A1 (en) * | 2013-12-20 | 2015-06-25 | Kabushiki Kaisha Toshiba | Image processing apparatus, ultrasound diagnosis apparatus, and image processing method |
CN105934215A (en) * | 2014-01-24 | 2016-09-07 | 皇家飞利浦有限公司 | Robotic control of imaging devices with optical shape sensing |
CN106108951A (en) * | 2016-07-26 | 2016-11-16 | 上海市第一人民医院 | A kind of medical real-time three-dimensional location tracking system and method
CN107854142A (en) * | 2017-11-28 | 2018-03-30 | 无锡祥生医疗科技股份有限公司 | Medical supersonic augmented reality imaging system |
TWI634870B (en) * | 2017-03-20 | 2018-09-11 | 承鋆生醫股份有限公司 | Image registration and augmented reality system and method augmented reality thereof |
CN109770943A (en) * | 2019-01-28 | 2019-05-21 | 电子科技大学 | A kind of ultrasonic automatic optimization method positioned using computer vision |
CN111657997A (en) * | 2020-06-23 | 2020-09-15 | 无锡祥生医疗科技股份有限公司 | Ultrasonic auxiliary guiding method, device and storage medium |
CN111938700A (en) * | 2020-08-21 | 2020-11-17 | 电子科技大学 | Ultrasonic probe guiding system and method based on real-time matching of human anatomy structure |
CN112057107A (en) * | 2020-09-14 | 2020-12-11 | 无锡祥生医疗科技股份有限公司 | Ultrasonic scanning method, ultrasonic equipment and system |
Similar Documents
Publication | Title |
---|---|
US11576645B2 (en) | Systems and methods for scanning a patient in an imaging system | |
US11576578B2 (en) | Systems and methods for scanning a patient in an imaging system | |
CN112057107A (en) | Ultrasonic scanning method, ultrasonic equipment and system | |
CN112022201A (en) | Machine guided imaging techniques | |
CN111629670B (en) | Echo window artifact classification and visual indicator for ultrasound systems | |
JP7362354B2 (en) | Information processing device, inspection system and information processing method | |
AU2012231802A1 (en) | Apparatus and method for determining a skin inflammation value | |
JP2016154858A (en) | Ultrasonic diagnostic device, medical image processing device and medical image processing program | |
CN112767309B (en) | Ultrasonic scanning method, ultrasonic device, system and storage medium | |
CN112807025A (en) | Ultrasonic scanning guiding method, device, system, computer equipment and storage medium | |
CN110428417A (en) | Property method of discrimination, storage medium and the Ultrasonic device of carotid plaques | |
JP7321836B2 (en) | Information processing device, inspection system and information processing method | |
KR20150068162A (en) | Apparatus for integration of three dimentional ultrasound images and method thereof | |
JP2023022123A (en) | System for providing determination of guidance signal and guidance for hand held ultrasonic transducer | |
CN113456018B (en) | Eye image data processing | |
CN109934798A (en) | Internal object information labeling method and device, electronic equipment, storage medium | |
CN113876356A (en) | Projection method for medical imaging, ultrasonic equipment system and storage medium | |
EP4005492A1 (en) | Guided acquisition of a 3d representation of an anatomical structure | |
CN114631841A (en) | Ultrasonic scanning feedback device | |
US10049480B2 (en) | Image alignment device, method, and program | |
CN111403007A (en) | Ultrasonic imaging optimization method, ultrasonic imaging system and computer-readable storage medium | |
CN115998324A (en) | Positioning method for medical imaging and medical imaging system | |
CN115990032B (en) | Priori knowledge-based ultrasonic scanning visual navigation method, apparatus and device | |
KR102598211B1 (en) | Ultrasound scanner for measuring urine volume in a bladder | |
CN112991166B (en) | Intelligent auxiliary guiding method, ultrasonic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20220104 |