CN114693864A - Ultrasonic auxiliary imaging method and device based on matching model network and storage medium - Google Patents


Info

Publication number
CN114693864A
Authority
CN
China
Prior art keywords
dimensional, model, organ, ultrasound, ultrasonic
Prior art date
Legal status
Pending
Application number
CN202011607595.9A
Other languages
Chinese (zh)
Inventor
Gan Conggui (甘从贵)
Jia Tingxiu (贾廷秀)
Chen Jianjun (陈建军)
Current Assignee
Wuxi Chudian Technology Co., Ltd.
Original Assignee
Wuxi Chudian Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Wuxi Chudian Technology Co., Ltd.
Priority to CN202011607595.9A
Publication of CN114693864A
Legal status: Pending (current)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention provides an ultrasound-assisted imaging method based on a matching model network, comprising the following steps: acquiring a three-dimensional ultrasound image of a detected object; acquiring a three-dimensional CT image or a three-dimensional MRI image of the detected object; processing the input three-dimensional ultrasound image through a trained ultrasound three-dimensional convolutional neural network to obtain an organ three-dimensional ultrasound model; processing the input three-dimensional CT or MRI image through a trained CT or MRI three-dimensional convolutional neural network to obtain an organ three-dimensional CT or MRI model; inputting the organ three-dimensional ultrasound model together with the organ three-dimensional CT model, or together with the organ three-dimensional MRI model, into a trained three-dimensional graph convolutional neural network to obtain a three-dimensional transformation matrix; and processing the organ three-dimensional ultrasound model with the three-dimensional transformation matrix to obtain a matched organ three-dimensional ultrasound model.

Description

Ultrasonic auxiliary imaging method and device based on matching model network and storage medium
Technical Field
The invention relates to the technical field of ultrasonic medical detection, in particular to an ultrasonic auxiliary imaging method and device based on a matching model network and a storage medium.
Background
With the continuous development of medical diagnostic equipment, the ultrasound imaging instrument has become one of the most widely used diagnostic tools in clinical practice, owing to its non-invasiveness, real-time operation, ease of use, and low cost.
However, the traditional ultrasound scanning process is affected by factors such as the operator's technique and state, so the acquired ultrasound images are often non-standard, which hinders doctors' auxiliary diagnosis.
Disclosure of Invention
To address these deficiencies in the related art, the invention provides an ultrasound-assisted imaging method, device, and storage medium based on a matching model network, which can perform ultrasound-assisted imaging effectively and in a standardized manner, making it convenient for doctors to view and compare images of different modalities side by side.
As a first aspect of the invention, an ultrasound-assisted imaging method based on a matching model network is provided, comprising the following steps: acquiring a three-dimensional ultrasound image of the part to be examined of a detected object;
acquiring a three-dimensional CT image or a three-dimensional MRI image of the same part to be examined of the detected object;
processing the input three-dimensional ultrasound image through a trained ultrasound three-dimensional convolutional neural network to obtain an organ three-dimensional ultrasound model;
processing the input three-dimensional CT or MRI image through a trained CT or MRI three-dimensional convolutional neural network to obtain an organ three-dimensional CT or MRI model;
inputting the obtained organ three-dimensional ultrasound model and organ three-dimensional CT model into a trained three-dimensional graph convolutional neural network to obtain a three-dimensional transformation matrix, or inputting the obtained organ three-dimensional ultrasound model and organ three-dimensional MRI model into the trained three-dimensional graph convolutional neural network to obtain a three-dimensional transformation matrix;
and processing the organ three-dimensional ultrasound model with the three-dimensional transformation matrix to obtain a matched organ three-dimensional ultrasound model.
In some embodiments, the matched organ three-dimensional ultrasound image is displayed using the matched organ three-dimensional ultrasound model.
In some embodiments, the trained ultrasound three-dimensional convolutional neural network comprises: an input layer, a plurality of convolutional layers, a plurality of pooling layers, a plurality of linear interpolation layers, and an output layer.
In some embodiments, the trained ultrasound three-dimensional convolutional neural network further comprises: cross-layer connection layers.
In some embodiments, the convolution kernel size of the plurality of convolutional layers is 3 × 3.
In some embodiments, the trained CT or MRI three-dimensional convolutional neural network comprises: an input layer, a plurality of convolutional layers, a plurality of pooling layers, a plurality of linear interpolation layers, and an output layer.
In some embodiments, the trained CT or MRI three-dimensional convolutional neural network further comprises: cross-layer connection layers.
In some embodiments, the trained three-dimensional graph convolutional neural network comprises: an input layer, a plurality of graph convolution layers, a plurality of fully connected layers, and an output layer.
As a second aspect of the invention, an ultrasound-assisted imaging device based on a matching model network is provided, comprising a memory and a processor connected by a bus that provides a corresponding communication interface. The memory stores computer instructions, and the processor executes the computer instructions to perform the matching model network based ultrasound-assisted imaging method according to the first aspect of the invention.
As a third aspect of the invention, a computer-readable storage medium is provided, which stores computer instructions for executing the matching model network based ultrasound-assisted imaging method according to the first aspect of the invention.
From the above, compared with the prior art, the ultrasound-assisted imaging method, device, and storage medium based on the matching model network provided by the invention have the following advantages: compared with a traditional three-dimensional ultrasound image, the matched organ three-dimensional ultrasound image is more standard and reduces the probability of missed scanning; it also allows a doctor to view images of different modalities simultaneously, as the matched organ three-dimensional ultrasound image is easy to compare against the three-dimensional CT or MRI image.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic structural diagram of an ultrasound three-dimensional convolutional neural network according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a CT or MRI three-dimensional convolutional neural network according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a three-dimensional graph convolutional neural network according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a matching model network structure according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of an ultrasound-assisted imaging method and apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; the connection can be mechanical connection or electrical connection; the two elements may be directly connected or indirectly connected through an intermediate medium, or may be communicated with each other inside the two elements, or may be wirelessly connected or wired connected. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
For convenience of understanding and description, in describing the first aspect of the present invention, other subjects, such as a user, a doctor, and a detected object, are added to assist in describing the process of performing the ultrasound-assisted imaging method based on the matching model network.
In one embodiment of the invention, an ultrasound-assisted imaging method and apparatus based on a matching model network are provided. The apparatus includes a three-dimensional ultrasound acquisition module 110, a processor 120, a display 130, and a three-dimensional CT or MRI image acquisition module 140. The three-dimensional ultrasound acquisition module 110 acquires three-dimensional ultrasound images of different organs, including the breast, liver, kidney, and the like; the three-dimensional CT or MRI image acquisition module 140 acquires three-dimensional CT or MRI images of the same range of organs; the processor 120 processes the acquired three-dimensional ultrasound images together with the three-dimensional CT images, or the three-dimensional ultrasound images together with the three-dimensional MRI images, to obtain a matched organ three-dimensional ultrasound image; and the display 130 displays the matched organ three-dimensional ultrasound image to assist the doctor's diagnosis.
As shown in fig. 5, the three-dimensional ultrasound acquisition module 110 of this embodiment is an ultrasound imaging device; that is, ultrasound images or videos are acquired by the ultrasound imaging device. As shown in fig. 5, the ultrasound imaging device includes at least a transducer 101, an ultrasound host 102, an input unit 103, a control unit 104, and a memory 105. The ultrasound imaging device may include a display screen (not shown), which may serve as the display 130. The transducer 101 transmits and receives ultrasonic waves: excited by a transmission pulse, it transmits ultrasonic waves to a target tissue (for example, an organ, tissue, or blood vessel in a human or animal body), receives, after a certain delay, an ultrasonic echo carrying information of the target tissue reflected from the target area, and converts the echo back into an electric signal to obtain an ultrasound image or video. The transducer 101 may be connected to the ultrasound host 102 by wire or wirelessly.
The input unit 103 is used to input an operator's control instructions. It may be at least one of a keyboard, trackball, mouse, touch panel, handle, dial, joystick, or foot switch. The input unit may also accept non-contact signals, such as sound, gestures, line of sight, or brain-wave signals.
The control unit 104 can control at least the focus information, drive frequency information, drive voltage information, and scanning information such as the imaging mode. The control unit 104 processes the signals differently according to the imaging mode required by the user to obtain ultrasound image data of different modes, then performs processing such as logarithmic compression, dynamic range adjustment, and digital scan conversion to form ultrasound images of different modes, such as B-mode, C-mode, and D-mode images, Doppler blood flow images, elasticity images containing the elastic properties of tissue, and the like, or other types of two-dimensional or three-dimensional ultrasound images.
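To make this processing path concrete, the following is a minimal sketch of the logarithmic compression and dynamic-range adjustment mentioned above, applied to a beamformed echo envelope. This is an illustration only; the 60 dB dynamic range is an assumed, typical value and is not specified in the patent.
```python
import numpy as np

def log_compress(envelope, dynamic_range_db=60.0):
    """Map an echo envelope to [0, 1] grayscale via log compression."""
    env = envelope / (envelope.max() + 1e-12)          # normalize to [0, 1]
    db = 20.0 * np.log10(np.maximum(env, 1e-12))       # convert to decibels
    db = np.clip(db, -dynamic_range_db, 0.0)           # dynamic-range window
    return (db + dynamic_range_db) / dynamic_range_db  # rescale to [0, 1]
```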
The display 130 displays information such as the matched organ three-dimensional ultrasound image, parameters or the type of peripheral nerves in the video, and dynamic information. Display 130 may be a touch-screen display. The ultrasound diagnostic apparatus may also be connected to another display through a port of the input unit 103 to implement a dual-screen display system. The number of displays in this embodiment is not limited: the displayed image data (ultrasound, MRI, or CT images) may be shown on one display, simultaneously on several displays, or with portions of the ultrasound image synchronously shown on separate displays; this embodiment does not limit these arrangements either. In addition, while displaying images, the display 130 provides a graphical interface for human-computer interaction, on which one or more controlled objects are arranged; the user operates a human-computer interaction device to input operation instructions that control these objects and thereby perform the corresponding operations. The display may, for example, be a projector or VR glasses, and may also include an input device, such as a touch display screen or motion-sensing VR glasses. Icons displayed on the display 130 can be manipulated using the human-computer interaction device to perform particular functions.
In one embodiment, the three-dimensional ultrasound acquisition module 110 may be a palm (handheld) ultrasound device, i.e., a transducer, a display, and the like integrated in a housing that can be held in the operator's hand.
A neural network model or unit herein may include (or contain, or have) elements other than those listed. The term "module" as used herein means, but is not limited to, a software or hardware component, such as a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or a processor (e.g., a CPU or GPU), that performs certain tasks. A module may advantageously be configured to reside in an addressable storage medium and to execute on one or more processors. Thus, a module may include, by way of example, components (such as software components, object-oriented software components, class components, and task components), processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided by the modules may be combined into fewer components and modules or further separated into additional components and modules.
As a first aspect of the present invention, as shown in fig. 4, an ultrasound-assisted imaging method 200 based on a matching model network is provided, which can be applied in an ultrasound device. The method 200 can include the following steps:
step 210: and acquiring a three-dimensional ultrasonic image of the part to be detected of the detected object, wherein the corresponding content of the ultrasonic image comprises parts such as mammary gland, liver, kidney and the like.
In some embodiments, the ultrasound image of the object to be detected may be acquired from an ultrasound device (e.g., a color ultrasound device, a black-and-white ultrasound device, or a palm ultrasound device) or from a database (e.g., a PACS system).
Step 220: acquiring a three-dimensional CT or MRI image of the same part to be examined of the detected object, where the content of the three-dimensional CT or MRI image covers parts such as the breast, liver, and kidney.
In some embodiments, the three-dimensional ultrasound image may be processed using a trained recognition neural network model to identify a target region, which may include lesion regions of the breast, liver, kidney, etc. In some embodiments, this recognition model is obtained by training on a set of ultrasound images in which lesion regions of the breast, liver, kidney, etc. are marked. The details of this neural network model are not described further herein.
Step 230: processing the input three-dimensional ultrasound image through the trained ultrasound three-dimensional convolutional neural network to obtain the organ three-dimensional ultrasound model.
In some embodiments, as shown in fig. 1, the input three-dimensional ultrasound image is processed by the trained ultrasound three-dimensional convolutional neural network of fig. 1. This network comprises an input layer, a plurality of convolutional layers, a plurality of pooling layers, a plurality of cross-layer connection layers, a plurality of linear interpolation layers, and an output layer. The input layer takes a three-dimensional ultrasound image of size N × 64 × 64, composed of N two-dimensional ultrasound images of size 64 × 64. The convolutional layers extract image features from the input three-dimensional ultrasound image, with a convolution kernel size of 3 × 3; the pooling layers down-sample the extracted ultrasound image features to obtain low-resolution features; the linear interpolation layers interpolate the low-resolution features to restore high-resolution ultrasound image features; the cross-layer connection layers compensate for ultrasound image features lost during convolution, pooling, and interpolation; and the output layer outputs the organ three-dimensional ultrasound model.
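As a rough illustration of the architecture just described, here is a hedged PyTorch sketch with a convolution-plus-pooling encoder, a trilinear-interpolation decoder, and a cross-layer (skip) connection. The layer counts, channel widths, and isotropic 3 × 3 × 3 kernels are assumptions; the patent specifies only the layer types and the N × 64 × 64 input.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Ultrasound3DCNN(nn.Module):
    """Sketch of the ultrasound 3D CNN: conv + pool encoder,
    linear-interpolation decoder, cross-layer connection."""
    def __init__(self, in_ch=1, out_ch=1, base=16):
        super().__init__()
        self.enc1 = nn.Conv3d(in_ch, base, kernel_size=3, padding=1)
        self.enc2 = nn.Conv3d(base, base * 2, kernel_size=3, padding=1)
        self.pool = nn.MaxPool3d(2)
        self.dec1 = nn.Conv3d(base * 3, base, kernel_size=3, padding=1)
        self.out = nn.Conv3d(base, out_ch, kernel_size=1)

    def forward(self, x):                      # x: (B, 1, N, 64, 64)
        f1 = F.relu(self.enc1(x))              # high-resolution features
        f2 = F.relu(self.enc2(self.pool(f1)))  # pooled, low-resolution features
        # Linear interpolation layer: restore the high resolution
        up = F.interpolate(f2, size=f1.shape[2:], mode='trilinear',
                           align_corners=False)
        # Cross-layer connection: compensate features lost downstream
        d1 = F.relu(self.dec1(torch.cat([up, f1], dim=1)))
        return torch.sigmoid(self.out(d1))     # organ occupancy volume

# us_net = Ultrasound3DCNN()
# organ_us_model = us_net(torch.randn(1, 1, 32, 64, 64))  # N = 32 slices
```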
Step 240: processing the input three-dimensional CT or MRI image through the trained CT or MRI three-dimensional convolutional neural network to obtain the organ three-dimensional CT or MRI model.
In some embodiments, as shown in fig. 2, the input CT or MRI image is processed by the trained CT or MRI three-dimensional convolutional neural network of fig. 2. This network comprises an input layer, a plurality of convolutional layers, a plurality of pooling layers, a plurality of cross-layer connection layers, a plurality of linear interpolation layers, and an output layer. The input layer takes a three-dimensional CT or MRI image of size M × 64 × 64, composed of M two-dimensional CT or MRI images of size 64 × 64. The convolutional layers extract image features from the input three-dimensional CT or MRI image, with a convolution kernel size of 3 × 3; the pooling layers down-sample the extracted CT or MRI image features to obtain low-resolution features; the linear interpolation layers interpolate the low-resolution features to restore high-resolution CT or MRI image features; the cross-layer connection layers compensate for CT or MRI image features lost during convolution, pooling, and interpolation; and the output layer outputs the organ three-dimensional CT or MRI model.
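Since the CT/MRI branch is described with an identical topology, under the assumptions of the sketch above it would simply be a second, separately trained instance of the same class, taking M rather than N input slices:
```python
# Separately trained twin of the ultrasound network (assumption: same widths).
ct_mri_net = Ultrasound3DCNN(in_ch=1, out_ch=1)
# organ_ct_model = ct_mri_net(torch.randn(1, 1, 48, 64, 64))  # M = 48 slices
```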
Step 250: inputting the obtained organ three-dimensional ultrasound model and organ three-dimensional CT model into the trained three-dimensional graph convolutional neural network to obtain a three-dimensional transformation matrix; or inputting the obtained organ three-dimensional ultrasound model and organ three-dimensional MRI model into the trained three-dimensional graph convolutional neural network to obtain a three-dimensional transformation matrix. The three-dimensional transformation matrix may be a three-dimensional affine transformation matrix.
In some embodiments, as shown in fig. 3, the input organ three-dimensional ultrasound model and organ three-dimensional CT model, or organ three-dimensional ultrasound model and organ three-dimensional MRI model, are processed by the trained three-dimensional graph convolutional neural network of fig. 3. This network comprises an input layer, a plurality of graph convolution layers, a plurality of fully connected layers, and an output layer. The input layer takes the organ ultrasound model together with the organ three-dimensional CT model, or the organ ultrasound model together with the organ three-dimensional MRI model; the graph convolution layers extract the three-dimensional spatial distribution features of the organ; the fully connected layers regress the transformation relation between the three-dimensional ultrasound image features and the three-dimensional CT image features, or between the three-dimensional ultrasound image features and the three-dimensional MRI image features; and the output layer outputs the transformation matrix.
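A hedged sketch of this step: graph convolution layers pool features over each organ model (represented here as a point graph with a normalized adjacency matrix), and fully connected layers regress the transformation. The graph construction, layer sizes, and the 12-parameter (3 × 4 affine) output are all assumptions; the patent states only the layer types and that the output is a transformation matrix.
```python
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    """One graph convolution: aggregate neighbor features, then project."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):   # x: (V, in_dim); adj: (V, V), normalized
        return torch.relu(self.lin(adj @ x))

class MatchingGCN(nn.Module):
    def __init__(self, feat=3, hidden=64):
        super().__init__()
        self.g1 = GraphConv(feat, hidden)
        self.g2 = GraphConv(hidden, hidden)
        self.fc = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 12))          # 3 x 4 affine matrix, flattened

    def embed(self, x, adj):
        # Graph-level feature: mean over the V node embeddings
        return self.g2(self.g1(x, adj), adj).mean(dim=0)

    def forward(self, us_x, us_adj, ct_x, ct_adj):
        h = torch.cat([self.embed(us_x, us_adj), self.embed(ct_x, ct_adj)])
        return self.fc(h).view(3, 4)        # three-dimensional transformation matrix
```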
Step 260: the three-dimensional model transformation module multiplies the obtained organ three-dimensional ultrasound model by the three-dimensional transformation matrix; through this matrix the organ three-dimensional ultrasound model is rotated, scaled, translated, sheared, reflected, and so on, yielding the matched organ three-dimensional ultrasound model. The matched organ three-dimensional ultrasound model may be a three-dimensional voxel model, a depth-map model, or the like.
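In homogeneous coordinates, a single 3 × 4 affine matrix realizes rotation, scaling, translation, shearing, and reflection in one multiplication. A minimal sketch, assuming the organ model is represented as vertex coordinates (the patent also allows voxel and depth-map models):
```python
import torch

def apply_affine(vertices, T):
    """vertices: (V, 3) organ-model coordinates; T: (3, 4) affine matrix."""
    ones = torch.ones(vertices.shape[0], 1)
    homogeneous = torch.cat([vertices, ones], dim=1)  # (V, 4), append w = 1
    return homogeneous @ T.t()                        # (V, 3) matched model

# matched = apply_affine(organ_us_vertices, T)  # T from the graph network above
```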
Step 270: displaying the matched organ three-dimensional ultrasound image through the matched organ three-dimensional ultrasound model.
The doctor acquires a three-dimensional ultrasound image and a three-dimensional CT image, or a three-dimensional ultrasound image and a three-dimensional MRI image, of the same part to be examined of the detected object, and obtains a matched three-dimensional ultrasound image through the matching model network. The matched organ three-dimensional ultrasound image is then aligned with the higher-resolution, more standard CT or MRI image of the corresponding organ, so the matched image is more standard and can be used to check whether any region was missed during scanning.
In one embodiment, the display can simultaneously show the matched organ three-dimensional ultrasound image and the three-dimensional CT image, or the matched organ three-dimensional ultrasound image and the three-dimensional MRI image, so that a doctor can conveniently review images of multiple modalities at once and obtain more comprehensive and accurate diagnostic information.
As a second aspect of the present invention:
an ultrasound-assisted imaging device based on a matching model network is provided, comprising a memory and a processor connected by a bus that provides a corresponding communication interface. The memory stores computer instructions, and the processor executes the computer instructions to perform the matching model network based ultrasound-assisted imaging method according to the first aspect of the invention.
As a third aspect of the present invention:
a computer-readable storage medium is provided, having stored thereon computer instructions for performing the matching model network based ultrasound-assisted imaging method according to the first aspect of the invention.
It should be understood that the above examples are given only for clarity of illustration and are not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaustively list all embodiments here. Obvious variations or modifications derived therefrom remain within the scope of the invention.

Claims (10)

1. An ultrasound-assisted imaging method based on a matching model network, characterized by comprising the following steps:
acquiring a three-dimensional ultrasound image of the part to be examined of a detected object;
acquiring a three-dimensional CT image or a three-dimensional MRI image of the same part to be examined of the detected object;
processing the input three-dimensional ultrasound image through a trained ultrasound three-dimensional convolutional neural network to obtain an organ three-dimensional ultrasound model;
processing the input three-dimensional CT or MRI image through a trained CT or MRI three-dimensional convolutional neural network to obtain an organ three-dimensional CT or MRI model;
inputting the obtained organ three-dimensional ultrasound model and organ three-dimensional CT model into a trained three-dimensional graph convolutional neural network to obtain a three-dimensional transformation matrix, or inputting the obtained organ three-dimensional ultrasound model and organ three-dimensional MRI model into the trained three-dimensional graph convolutional neural network to obtain a three-dimensional transformation matrix;
and processing the organ three-dimensional ultrasound model with the three-dimensional transformation matrix to obtain a matched organ three-dimensional ultrasound model.
2. The matching model network based ultrasound-assisted imaging method of claim 1, further comprising the step of: displaying the matched organ three-dimensional ultrasound image through the matched organ three-dimensional ultrasound model.
3. The matching model network based ultrasound-assisted imaging method of claim 1 or 2, wherein the trained ultrasound three-dimensional convolutional neural network comprises: an input layer, a plurality of convolutional layers, a plurality of pooling layers, a plurality of linear interpolation layers, and an output layer.
4. The matching model network based ultrasound-assisted imaging method of claim 3, wherein the trained ultrasound three-dimensional convolutional neural network further comprises: cross-layer connection layers.
5. The matching model network based ultrasound-assisted imaging method of claim 3, wherein the convolution kernel size of the plurality of convolutional layers is 3 × 3.
6. The matching model network based ultrasound-assisted imaging method of claim 1 or 2, wherein the trained CT or MRI three-dimensional convolutional neural network comprises: an input layer, a plurality of convolutional layers, a plurality of pooling layers, a plurality of linear interpolation layers, and an output layer.
7. The matching model network based ultrasound-assisted imaging method of claim 6, wherein the trained CT or MRI three-dimensional convolutional neural network further comprises: cross-layer connection layers.
8. The matching model network based ultrasound-assisted imaging method of claim 1 or 2, wherein the trained three-dimensional graph convolutional neural network comprises: an input layer, a plurality of graph convolution layers, a plurality of fully connected layers, and an output layer.
9. An ultrasound-assisted imaging device based on a matching model network, characterized in that the device comprises a memory and a processor connected through a bus; the memory stores computer instructions, and the processor executes the computer instructions to perform the matching model network based ultrasound-assisted imaging method of any one of claims 1 to 8.
10. A computer-readable storage medium storing computer instructions for causing a computer to perform the matching model network based ultrasound-assisted imaging method of any one of claims 1 to 8.
CN202011607595.9A (filed 2020-12-29): Ultrasonic auxiliary imaging method and device based on matching model network and storage medium. Status: Pending. Publication: CN114693864A.

Priority Applications (1)

Application Number: CN202011607595.9A
Priority Date: 2020-12-29
Filing Date: 2020-12-29
Title: Ultrasonic auxiliary imaging method and device based on matching model network and storage medium


Publications (1)

Publication Number: CN114693864A
Publication Date: 2022-07-01

Family

ID=82132906

Family Applications (1)

Application Number: CN202011607595.9A (status: Pending)
Priority Date: 2020-12-29; Filing Date: 2020-12-29
Title: Ultrasonic auxiliary imaging method and device based on matching model network and storage medium

Country Status (1)

Country: CN
Publication: CN114693864A

Similar Documents

Publication Publication Date Title
US8469890B2 (en) System and method for compensating for motion when displaying ultrasound motion tracking information
US11653897B2 (en) Ultrasonic diagnostic apparatus, scan support method, and medical image processing apparatus
US20120108960A1 (en) Method and system for organizing stored ultrasound data
US8471866B2 (en) User interface and method for identifying related information displayed in an ultrasound system
CN111184534B (en) Ultrasonic diagnostic apparatus for determining abnormalities of fetal heart and method of operating the same
CN112469340A (en) Ultrasound system with artificial neural network for guided liver imaging
US20100249591A1 (en) System and method for displaying ultrasound motion tracking information
US9390546B2 (en) Methods and systems for removing occlusions in 3D ultrasound images
KR102545008B1 (en) Ultrasound imaging apparatus and control method for the same
JP6956483B2 (en) Ultrasonic diagnostic equipment and scanning support program
US20200170615A1 (en) Ultrasound system with extraction of image planes from volume data using touch interaction with an image
US20170209125A1 (en) Diagnostic system and method for obtaining measurements from a medical image
US20210113191A1 (en) Image data adjustment method and device
US20170209118A1 (en) Ultrasonic imaging apparatus and method for controlling the same
JP2023549093A (en) Robust segmentation with high-level image understanding
US20110055148A1 (en) System and method for reducing ultrasound information storage requirements
JP5390149B2 (en) Ultrasonic diagnostic apparatus, ultrasonic diagnostic support program, and image processing apparatus
EP3520704B1 (en) Ultrasound diagnosis apparatus and method of controlling the same
US20190209122A1 (en) Ultrasound diagnosis apparatus and method of controlling the same
CN112137643A (en) Region of interest localization for longitudinal monitoring in quantitative ultrasound
US20240057970A1 (en) Ultrasound image acquisition, tracking and review
CN114693864A (en) Ultrasonic auxiliary imaging method and device based on matching model network and storage medium
US11974883B2 (en) Ultrasound imaging apparatus, method of controlling the same, and computer program
CN112990267B (en) Breast ultrasonic imaging method and device based on style migration model and storage medium
CN111292248B (en) Ultrasonic fusion imaging method and ultrasonic fusion navigation system

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination