US12133764B2 - Systems and methods for an adaptive interface for an ultrasound imaging system - Google Patents
- Publication number
- US12133764B2 (application US17/148,376)
- Authority
- US
- United States
- Prior art keywords
- proficiency
- user
- guidance
- score
- ultrasound
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4427—Device being portable or laptop-like
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
- A61B8/4472—Wireless probes
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
- A61B8/4488—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer, the transducer being a phased array
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
- A61B8/465—Displaying means of special interest adapted to display user selection data, e.g. icons or menus
- A61B8/466—Displaying means of special interest adapted to display 3D data
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
- A61B8/485—Diagnostic techniques involving measuring strain or elastic properties
- A61B8/486—Diagnostic techniques involving arbitrary m-mode
- A61B8/488—Diagnostic techniques involving Doppler signals
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5207—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5292—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves using additional data, e.g. patient information, image labeling, acquisition parameters
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/02—Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G09B23/286—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for scanning or photography techniques, e.g. X-rays, ultrasonics
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
Definitions
- Embodiments of the subject matter disclosed herein relate to ultrasound imaging, and more particularly, to user interfaces of ultrasound imaging systems.
- Medical ultrasound is an imaging modality that employs ultrasound waves to probe the internal structures of a body of a patient and produce a corresponding image.
- An ultrasound probe comprising a plurality of transducer elements emits ultrasonic pulses, which reflect (echo), refract, or are absorbed by structures in the body. The ultrasound probe then receives the reflected echoes, which are processed into an image.
- Adjusting the placement of the probe to acquire suitable images may be difficult.
- Acquiring suitable images involves adjusting the location and orientation of the probe on the patient's body, the pressure applied to the probe, and one or more acquisition parameters (e.g., scan settings), such as transmit frequency, transmit depth, time gain compensation, etc.
- The process of adjusting the probe may be complicated by variation in patient body types, anatomical features, pathologies, or other factors. As a result, acquiring suitable images requires considerable operator experience, developed as a result of a time-consuming trial-and-error process.
- The current disclosure at least partially addresses one or more of the above-identified issues by a method comprising: determining a proficiency score of a user, the proficiency score corresponding to a proficiency of the user in imaging with a medical imaging modality; based on the proficiency score, adjusting an amount of guidance provided to the user on a graphical user interface of a display device coupled to the medical imaging modality; and displaying an indication of the proficiency on the graphical user interface.
- In this way, guidance provided to a user may be customized based on the user's proficiency in acquiring medical images. Further, as the proficiency of the user evolves with time and practice, the proficiency score may change, and the guidance may be adjusted to take the evolving proficiency into account. As a result, user guidance may be automatically adapted to the proficiency of the user, which improves user experience and efficiency.
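The score-to-guidance mapping described above can be sketched in a few lines of Python. The thresholds, level names, and function name below are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch: map a 0-100 proficiency score to an amount of guidance.
# Thresholds and level names are assumptions for illustration only.
GUIDANCE_LEVELS = [
    (0, "full"),      # novice: reference overlays, probe-adjustment arrows
    (40, "partial"),  # intermediate: contextual cues only
    (75, "minimal"),  # expert: only advanced annotations (e.g., nodule outlines)
]

def select_guidance(proficiency_score: float) -> str:
    """Return the guidance level whose threshold the score has reached."""
    level = GUIDANCE_LEVELS[0][1]
    for threshold, name in GUIDANCE_LEVELS:
        if proficiency_score >= threshold:
            level = name
    return level
```

As the stored score rises with practice, the same call automatically yields less intrusive guidance, which is the adaptive behavior the disclosure describes.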
- FIG. 1 shows a block diagram of an exemplary embodiment of an ultrasound system;
- FIGS. 2A, 2B, and 2C are exemplary embodiments of a chart for assessing the proficiency of a user of an ultrasound imaging system;
- FIGS. 3A, 3B, and 3C are flowcharts illustrating example methods for displaying guidance to a user of an ultrasound imaging system based on a proficiency score of the user;
- FIGS. 4A, 4B, and 4C are examples of contextual user guidance displays in a user interface for an ultrasound device;
- FIGS. 5A and 5B are examples of real-time user guidance displays in a user interface for an ultrasound device;
- FIGS. 6A, 6B, 6C, and 6D are exemplary embodiments of a user interface for an ultrasound device;
- FIG. 7A is a plot showing a change in user proficiency on an ultrasound device over a practicing duration, including a common threshold for more than one anatomical region;
- FIG. 7B is a plot showing a change in user proficiency on an ultrasound device over a practicing duration, including different thresholds for different anatomical regions.
- The following description relates to methods and systems for adjusting user guidance during medical imaging with an imaging modality, such as an ultrasound imaging system, based on tracking a proficiency score of a user.
- The description herein is provided with respect to an ultrasound imaging system, although it should be appreciated that the methods and systems may be adapted to any medical imaging modality, such as mammography, x-ray, computed tomography, MRI, etc., without departing from the scope of the disclosure.
- Medical ultrasound imaging typically includes placing an ultrasound probe of an ultrasound imaging system, including one or more transducer elements, onto an imaging subject, such as a patient, at the location of a target anatomical feature (e.g., abdomen, chest, etc.). Images are acquired by the ultrasound probe and displayed on a display device in real time or near real time (e.g., displayed once the images are generated, without intentional delay). The operator of the ultrasound probe may view the images and adjust various acquisition parameters and/or the position of the probe in order to obtain high-quality images of the target anatomical feature (e.g., the heart, the liver, the kidney, etc.).
- The acquisition parameters that may be adjusted include transmit frequency, transmit depth, gain (e.g., time gain compensation), beam steering angle, beamforming strategy, and/or other parameters.
- Gaining experience in ultrasound image acquisition is a time-consuming, trial-and-error process involving sustained learning and practice on the part of an ultrasound operator.
- The process of adjusting the placement of the ultrasound probe until an image appears optimal to the operator may not be well defined or repeatable/reproducible between exams, and experience gained on one anatomical feature may not transfer well to a different anatomical feature.
- For example, the experience of an obstetrician in acquiring images of fetuses in pregnant women may not transfer to acquiring images of abnormalities in the spleen of an older man.
- In other words, an operator who has gained expertise in one anatomical region may not be proficient in another anatomical region.
- Proficiency is also a factor in the duration of an ultrasound examination.
- An operator whose proficiency level is low may take longer to acquire suitable images (e.g., images above a desired quality) than an expert operator, which may result in poor patient satisfaction and/or an inability to produce suitable images within the allotted time period. Further, any loss in image quality due to the user's lack of proficiency may result in a poor diagnosis and/or poor clinical outcomes.
- One approach to addressing this issue is to provide user guidance, such as graphical user guidance, to users during ultrasound examinations.
- For example, user guidance may be displayed in conjunction with, or superimposed upon, images acquired in real time by the ultrasound probe.
- However, the inventors herein have recognized that enabling or disabling graphical user guidance without taking into account the operator's proficiency level may lead to a poor user experience, and may not improve the operator's proficiency or the quality of the images acquired.
- For example, on-screen graphical user guidance that is helpful to novice users may be a distraction to expert users, while the type of guidance useful to an advanced user may be confusing to a novice user.
- Additionally, novice operators may seek proficiency improvement through basic skill development, while expert operators may seek proficiency improvement through increased productivity (e.g., reduced time spent per examination) or other similar factors.
- This problem may be addressed by an adaptive feature that dynamically takes into account an operator's proficiency at the time of the examination, and accordingly enables, disables, and/or selects the type of user guidance most appropriate for the operator.
- In this way, the display of guidance may be customized during an examination based on the proficiency of the user, whereby a first type of guidance may be displayed to a novice user, a second type of guidance may be displayed to a more experienced user, a third type of guidance may be displayed to an expert user, and so forth.
- For example, a novice user performing an examination of a subject's liver with an ultrasound probe may be shown a reference image of a liver superimposed on an image acquired by the probe during the examination, or a visual indication (e.g., via arrows) of how to adjust the probe to visualize a cross-section of the liver.
- In contrast, a more experienced user performing an examination of a subject's liver may be shown an outline of a nodule on the liver that may be hard to see due to being partially obscured by bowel gas.
- The proficiency of the user may be indicated by a proficiency score displayed on a screen of the ultrasound imaging system.
- One or more algorithms running on a processor of the ultrasound imaging system, based on instructions stored in a non-transitory memory of the ultrasound imaging system, may determine how the proficiency score of the user is determined or adjusted, and/or what type of guidance to display to the user.
- In some embodiments, a plurality of proficiency scores may be stored in a database accessible by the ultrasound imaging system, each proficiency score reflecting the user's proficiency on a specific target anatomical feature (e.g., an organ).
- Guidance displayed to the user may be further customized according to the type of examination being performed, whereby a first type of guidance may be displayed to the user in an examination of an organ on which the user has little experience, and a second type of guidance may be displayed in an examination of a different organ on which the user has more experience.
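A per-anatomy score store of the kind described above might look like the following sketch. The class and method names are hypothetical, not from the disclosure.

```python
# Illustrative sketch of a per-(user, anatomy) proficiency database.
# Unseen anatomical features default to a score of 0 (novice), reflecting
# that expertise on one organ does not transfer to another.
from collections import defaultdict

class ProficiencyStore:
    def __init__(self):
        self._scores = defaultdict(float)  # (user_id, anatomy) -> score

    def update(self, user_id: str, anatomy: str, score: float) -> None:
        self._scores[(user_id, anatomy)] = score

    def score(self, user_id: str, anatomy: str) -> float:
        return self._scores[(user_id, anatomy)]

store = ProficiencyStore()
store.update("op1", "liver", 82.0)
# Liver expertise does not imply spleen expertise: the spleen score stays 0.
assert store.score("op1", "liver") == 82.0
assert store.score("op1", "spleen") == 0.0
```

Keying the score on the anatomy as well as the user is what lets the system show different guidance for a liver exam than for a spleen exam by the same operator.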
- One or more proficiency scores may be used to determine whether to increase or decrease the amount of user guidance, or what type of user guidance to display to the user.
- The proficiency score may be determined and/or updated based on the user's performance with respect to a plurality of proficiency parameters, such as the quality of an acquired image, the accuracy of a scan plane, the speed of image acquisition, the reproducibility and/or repeatability of image acquisition, etc.
- The proficiency parameters may also include historical data, such as the number of examinations performed or hours of practice completed.
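One simple way to combine such proficiency parameters into a single score is a weighted average. The parameter names and weights below are assumptions for illustration; the disclosure does not specify a formula.

```python
# Hypothetical weighted-average proficiency score. Each parameter is assumed
# to be pre-normalized to a 0-100 scale; weights sum to 1.0.
WEIGHTS = {
    "image_quality": 0.35,        # quality of the acquired image
    "scan_plane_accuracy": 0.25,  # accuracy of the scan plane
    "acquisition_speed": 0.20,    # speed of image acquisition
    "repeatability": 0.10,        # reproducibility/repeatability across exams
    "experience": 0.10,           # from exam count / practice hours
}

def proficiency_score(params: dict) -> float:
    """Weighted average of proficiency parameters; missing parameters count as 0."""
    return sum(WEIGHTS[k] * params.get(k, 0.0) for k in WEIGHTS)

score = proficiency_score({
    "image_quality": 80, "scan_plane_accuracy": 70,
    "acquisition_speed": 60, "repeatability": 90, "experience": 50,
})  # 0.35*80 + 0.25*70 + 0.20*60 + 0.10*90 + 0.10*50
```

A weighted sum keeps the score interpretable and lets historical parameters (experience) contribute alongside per-exam measurements.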
- An additional advantage of the systems and methods disclosed herein is that the guidance displayed to the user may also be initiated, stopped, increased, or decreased during an examination based on the user's performance during that examination. For example, if the time taken by the user to acquire an image exceeds a threshold duration, the user may be prompted as to whether to display the guidance, or the guidance may be displayed automatically, with or without notification to the user. Additionally, the initiating, stopping, increasing, or decreasing of the guidance may be carried out based on monitoring one or more of the proficiency parameters during an examination. For example, the speed of the user in acquiring an image may be monitored, and a type of guidance may be displayed responsive to a speed less than a threshold speed.
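The in-exam triggering condition described above reduces to a small predicate. The threshold values and parameter names here are purely illustrative assumptions.

```python
# Illustrative in-exam trigger: enable guidance when the operator has taken
# too long to acquire a suitable image, or when a monitored proficiency
# parameter (here, acquisition speed) falls below a threshold.
THRESHOLD_SECONDS = 120.0  # assumed threshold duration for one acquisition

def guidance_needed(elapsed_seconds: float,
                    acquisition_speed: float,
                    threshold_speed: float = 5.0) -> bool:
    """True when guidance should be initiated during the examination."""
    return (elapsed_seconds > THRESHOLD_SECONDS
            or acquisition_speed < threshold_speed)
```

In practice this predicate would be evaluated periodically during the exam, and its result used to prompt the user or to display guidance automatically.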
- The initiating, stopping, increasing, or decreasing of different types of guidance may be coordinated and/or displayed in an order or sequence in accordance with a task or educational goal, such as being incorporated into a training program whereby novice ultrasound users are guided through a procedure, or whereby performance of a task may be instructed in a standardized manner.
- An example ultrasound system, including an ultrasound probe, a display device, and a processor, is shown in FIG. 1.
- Ultrasound images may be acquired and displayed on the display device.
- The images may be acquired by adjusting the position of the ultrasound probe on a patient's body.
- The ultrasound images may be compared with reference models to determine whether unsatisfactory (e.g., low) image quality is due to aspects of probe placement, so that guidance may be displayed to aid the user in adjusting the position of the probe to improve image quality.
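One simple way such a comparison against a reference model could be scored is normalized cross-correlation between an acquired intensity profile and a reference profile; this metric and threshold are an assumption for illustration, not the method stated in the disclosure.

```python
# Illustrative placement check: low correlation between the acquired signal
# and a reference suggests the probe position needs adjustment.
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length sequences (-1 to 1)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def placement_ok(acquired, reference, threshold=0.8):
    """True when the acquired profile matches the reference well enough."""
    return ncc(acquired, reference) >= threshold
```

A real system would compare 2D images (or learned features) rather than 1D profiles, but the thresholding idea is the same: trigger guidance when the match falls below a set level.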
- One or more rubrics may be used to assess and/or score user proficiency on one or more anatomical regions on which the adaptive guidance system has been trained, as shown in FIGS. 2A-2C.
- A processor may programmatically determine when and how to display user guidance on a display device during operation of an ultrasound imaging system, and may track operator proficiency, via the example methods illustrated in FIGS. 3A, 3B, and 3C.
- The user guidance may include contextual guidance, as shown in FIGS. 4A, 4B, and 4C, and/or real-time guidance cues, as shown in FIGS. 5A and 5B.
- The user guidance may include superimposed images, text, video, and/or other visual display features, as shown in FIGS. 6A, 6B, 6C, and 6D.
- Use of the adaptive guidance system may result in increased proficiency scores and faster acquisition of proficiency over time, as shown in FIGS. 7A and 7B, which also depict threshold proficiencies.
- Referring to FIG. 1, the ultrasound imaging system 100 includes a transmit beamformer 101 and a transmitter 102 that drives elements (e.g., transducer elements) 104 within a transducer array, herein referred to as probe 106, to emit pulsed ultrasonic signals (referred to herein as transmit pulses) into the body of a patient (not shown).
- The probe 106 may be a one-dimensional transducer array probe or a two-dimensional matrix transducer array probe.
- The transducer elements 104 may be composed of a piezoelectric material. When a voltage is applied to a piezoelectric crystal, the crystal physically expands and contracts, emitting an ultrasonic spherical wave. In this way, the transducer elements 104 may convert electronic transmit signals into acoustic transmit beams.
- The pulsed ultrasonic signals are back-scattered from structures within the interior of the body, such as blood cells or muscular tissue, to produce echoes that return to the elements 104.
- The echoes are converted into electrical signals, or ultrasound data, by the elements 104, and the electrical signals are received by a receiver 108.
- The electrical signals representing the received echoes are passed through a receive beamformer 110 that outputs ultrasound data.
- According to some embodiments, the transducer elements 104 may produce one or more ultrasonic pulses to form one or more transmit beams in accordance with the received echoes.
- According to some embodiments, the probe 106 may contain electronic circuitry to perform all or part of the transmit beamforming and/or the receive beamforming.
- For example, all or part of the transmit beamformer 101, the transmitter 102, the receiver 108, and the receive beamformer 110 may be situated within the probe 106.
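As background for the receive beamforming mentioned above, a toy delay-and-sum beamformer is sketched below: each element's signal is shifted by a per-element focusing delay and the shifted signals are summed. The integer sample delays and the function name are illustrative simplifications.

```python
# Toy delay-and-sum receive beamformer. Real beamformers use fractional
# delays, apodization weights, and dynamic focusing; this sketch only
# shows the align-and-sum principle.
def delay_and_sum(element_signals, delays):
    """Shift each element's sampled signal by an integer sample delay and sum.

    element_signals: list of equal-length sample lists, one per transducer element.
    delays: per-element integer sample delays (focusing delays).
    """
    n = len(element_signals[0])
    out = [0.0] * n
    for sig, d in zip(element_signals, delays):
        for i in range(n):
            j = i - d  # sample of this element that aligns with output index i
            if 0 <= j < n:
                out[i] += sig[j]
    return out
```

When the delays match the path-length differences from a point in the body to each element, echoes from that point add coherently while off-axis echoes tend to cancel, which is what focuses the receive beam.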
- The terms “scan” or “scanning” may also be used in this disclosure to refer to acquiring data through the process of transmitting and receiving ultrasonic signals.
- A user interface 115 may be used to control operation of the ultrasound imaging system 100.
- The user interface 115 may include one or more of the following: a rotary element, a mouse, a keyboard, a trackball, hard keys linked to specific actions, soft keys that may be configured to control different functions, and/or a graphical user interface displayed on a display device 118.
- The ultrasound imaging system 100 includes a processor 116 to control the transmit beamformer 101, the transmitter 102, the receiver 108, and the receive beamformer 110.
- The processor 116 is in electronic communication (e.g., communicatively connected) with the probe 106.
- For purposes of this disclosure, the term “electronic communication” may be defined to include both wired and wireless communications.
- The processor 116 may control the probe 106 to acquire data according to instructions stored on a memory 120.
- memory includes any non-transient computer readable medium in which programming instructions are stored.
- the term tangible computer readable medium is expressly defined to include any type of computer readable storage.
- the example methods and systems may be implemented using coded instruction (e.g., computer readable instructions) stored on a non-transient computer readable medium such as a flash memory, a read-only memory (ROM), a random-access memory (RAM), a cache, or any other storage media in which information is stored for any duration (e.g. for extended period time periods, permanently, brief instances, for temporarily buffering, and/or for caching of the information).
- a non-transient computer readable medium such as a flash memory, a read-only memory (ROM), a random-access memory (RAM), a cache, or any other storage media in which information is stored for any duration (e.g. for extended period time periods, permanently, brief instances, for temporarily buffering, and/or for caching of the information).
- Computer memory of computer readable storage mediums as referenced herein may include volatile and non-volatile or removable and non-removable media for storage of electronic-formatted information such as computer readable program instructions or modules of computer readable program instructions, data, and the like.
- the processor 116 controls which of the elements 104 are active and the shape of a beam emitted from the probe 106 .
- the processor 116 is also in electronic communication with the display device 118 , and the processor 116 may process the data (e.g., ultrasound data) into images for display on the display device 118 .
- the processor 116 may include a central processor (CPU), according to an embodiment.
- the processor 116 may include other electronic components capable of carrying out processing functions, such as a digital signal processor, a field-programmable gate array (FPGA), or a graphic board.
- the processor 116 may include multiple electronic components capable of carrying out processing functions.
- the processor 116 may include two or more electronic components selected from a list of electronic components including: a central processor, a digital signal processor, a field-programmable gate array, and a graphic board.
- the processor 116 may also include a complex demodulator (not shown) that demodulates the RF data and generates raw data. In another embodiment, the demodulation can be carried out earlier in the processing chain.
- the processor 116 may be adapted to perform one or more processing operations according to a plurality of selectable ultrasound modalities on the data.
- the data may be processed in real-time during a scanning session as the echo signals are received by receiver 108 and transmitted to processor 116 .
- the term “real-time” is defined to include a procedure that is performed without any intentional delay.
- an embodiment may acquire images at a real-time rate of 7-20 frames/sec.
- Some embodiments of the invention may include multiple processors (not shown) to handle the processing tasks that are handled by processor 116 according to the exemplary embodiment described hereinabove.
- a first processor may be utilized to demodulate and decimate the RF signal while a second processor may be used to further process the data, for example by augmenting the data as described further herein, prior to displaying an image. It should be appreciated that other embodiments may use a different arrangement of processors.
- the ultrasound imaging system 100 may continuously acquire data at a frame-rate of, for example, 10 Hz to 30 Hz (e.g., 10 to 30 frames per second). Images generated from the data may be refreshed at a similar frame-rate on display device 118 . Other embodiments may acquire and display data at different rates. For example, some embodiments may acquire data at a frame-rate of less than 10 Hz or greater than 30 Hz depending on the size of the frame and the intended application.
- a memory 120 is included for storing processed frames of acquired data. In an exemplary embodiment, the memory 120 is of sufficient capacity to store at least several seconds' worth of frames of ultrasound data. The frames of data are stored in a manner to facilitate retrieval thereof according to their order or time of acquisition.
- the memory 120 may comprise any known data storage medium.
- data may be processed in different mode-related modules by the processor 116 (e.g., B-mode, Color Doppler, M-mode, Color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and the like) to form 2D or 3D data.
- one or more modules may generate B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and combinations thereof, and the like.
- the one or more modules may process color Doppler data, which may include traditional color flow Doppler, power Doppler, HD flow, and the like.
- the image lines and/or frames are stored in memory and may include timing information indicating a time at which the image lines and/or frames were stored in memory.
- the modules may include, for example, a scan conversion module to perform scan conversion operations to convert the acquired images from beam space coordinates to display space coordinates.
- a video processor module may be provided that reads the acquired images from a memory and displays an image in real time while a procedure (e.g., ultrasound imaging) is being performed on a patient.
- the video processor module may include a separate image memory, and the ultrasound images may be written to the image memory in order to be read and displayed by display device 118 .
- one or more components of ultrasound imaging system 100 may be included in a portable, handheld ultrasound imaging device.
- display device 118 and user interface 115 may be integrated into an exterior surface of the handheld ultrasound imaging device, which may further contain processor 116 and memory 120 .
- Probe 106 may comprise a handheld probe in electronic communication with the handheld ultrasound imaging device to collect raw ultrasound data.
- Transmit beamformer 101 , transmitter 102 , receiver 108 , and receive beamformer 110 may be included in the same or different portions of the ultrasound imaging system 100 .
- transmit beamformer 101 , transmitter 102 , receiver 108 , and receive beamformer 110 may be included in the handheld ultrasound imaging device, the probe, and combinations thereof.
- a block of data comprising scan lines and their samples is generated.
- a process known as scan conversion is performed to transform the two-dimensional data block into a displayable bitmap image with additional scan information such as depths, angles of each scan line, and so on.
- an interpolation technique is applied to fill holes (i.e., missing pixels) in the resulting image. These missing pixels occur because each element of the two-dimensional block typically covers many pixels in the resulting image.
- a bicubic interpolation is applied which leverages neighboring elements of the two-dimensional block. As a result, if the two-dimensional block is relatively small in comparison to the size of the bitmap image, the scan-converted image will include areas of poor or low resolution, especially for areas of greater depth.
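As an illustration of the scan conversion and hole-filling described above, the following sketch maps a (depth, angle) sample block onto a Cartesian bitmap. It is a simplified, assumption-laden example (all names are hypothetical), and it uses bilinear rather than the bicubic interpolation mentioned in the text, purely for brevity.

```python
import math

def scan_convert(block, angles, depths, out_h, out_w):
    """Map a (depth, angle) sample block onto a Cartesian bitmap, filling the
    holes between scan lines by interpolation. Simplified sketch: bilinear
    blending of four neighbors stands in for the bicubic case."""
    n_depth, n_angle = len(block), len(block[0])
    a_min, a_max = min(angles), max(angles)
    d_min, d_max = min(depths), max(depths)
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        y = d_max * i / (out_h - 1)                   # axial position (depth)
        for j in range(out_w):
            x = -d_max + 2 * d_max * j / (out_w - 1)  # lateral position
            r = math.hypot(x, y)
            theta = math.atan2(x, y)                  # angle from the probe axis
            if r > d_max or r < d_min or theta < a_min or theta > a_max:
                continue                              # pixel lies outside the sector
            # Fractional indices into the sample block for this pixel.
            fi = (r - d_min) / (d_max - d_min) * (n_depth - 1)
            fj = (theta - a_min) / (a_max - a_min) * (n_angle - 1)
            i0, j0 = int(fi), int(fj)
            i1, j1 = min(i0 + 1, n_depth - 1), min(j0 + 1, n_angle - 1)
            di, dj = fi - i0, fj - j0
            # Blend of the four nearest samples fills the "hole".
            out[i][j] = ((1 - di) * (1 - dj) * block[i0][j0]
                         + (1 - di) * dj * block[i0][j1]
                         + di * (1 - dj) * block[i1][j0]
                         + di * dj * block[i1][j1])
    return out
```

When the sample block is small relative to the bitmap, many output pixels fall between the same few samples, which is why deeper regions of the scan-converted image appear low-resolution, as noted above.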
- the ultrasound imaging system 100 may include a data processing unit 130 configured to progressively analyze an operator's proficiency in acquiring ultrasound images.
- the operator's proficiency may be assessed with respect to a plurality of proficiency parameters, anatomical regions and/or target anatomical features.
- the operator may be assigned proficiency scores based on the operator's proficiency in acquiring ultrasound images of a liver, or a spleen, or a uterus, or another part of the human anatomy, where a proficiency score assigned to the operator for acquiring images of a spleen may be different from a proficiency score assigned to the operator for acquiring images of a liver.
- the proficiency score may be a function (e.g., a summation, an average, a weighted average, etc.) of individual proficiency scores assigned to the operator in regard to individual proficiency parameters (e.g., image quality, acquisition speed, scan plane accuracy, etc.).
- a proficiency score for an anatomical region, or across a plurality of anatomical regions may be assigned to the operator as a function of the operator's proficiency scores assigned in regard to target anatomical features.
- the operator may be assigned a proficiency score for the anatomical region of the abdomen, where the score is an average of the operator's proficiency scores on a plurality of target anatomical features of an abdomen (e.g., a liver, a spleen, a uterus, etc.).
- the operator may also be assigned an overall proficiency score, where the overall proficiency score is a function of the operator's proficiency scores for a plurality of anatomical regions (e.g., abdomen, chest, etc.).
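The hierarchy of scores described above — per-feature scores rolling up into per-region scores, which roll up into an overall score — can be sketched as follows. The function names, example values, and the use of plain averages (rather than the weighted averages or summations the text also allows) are assumptions.

```python
def region_score(feature_scores):
    """Average the per-feature scores (e.g., liver, spleen, uterus) for one region."""
    return sum(feature_scores.values()) / len(feature_scores)

def overall_score(region_scores):
    """Combine per-region scores (e.g., abdomen, chest) into one overall score."""
    return sum(region_scores.values()) / len(region_scores)

# Illustrative (made-up) per-feature proficiency scores on a 0-1 scale.
abdomen = {"liver": 0.8, "spleen": 0.6, "uterus": 0.7}
chest = {"heart": 0.5, "lungs": 0.9}
regions = {"abdomen": region_score(abdomen), "chest": region_score(chest)}
overall = overall_score(regions)
```

Note that the same operator can hold different scores for liver and spleen, as the text requires; the roll-up simply summarizes them at coarser scales.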
- a robust framework is provided for assessing the operator's proficiency at acquiring ultrasound images on various parts of the human anatomy and at various scales.
- Analyzing an operator's proficiency may include scoring the operator's performance based on a set of one or more qualitative and quantitative proficiency parameters in accordance with a predefined rubric for measuring proficiency, where the set of one or more qualitative and quantitative proficiency parameters (hereinafter “proficiency parameters”) correspond to various evaluation attributes for a selected target anatomical feature.
- the evaluation attributes may include one or more of image quality, speed of acquisition, accuracy of scan plane, visualization of intended anatomical feature including anomalies, cumulative hours of practice, number of exams, repeatability, and reproducibility.
- the predefined rubric comprising the set of proficiency parameters corresponding to the various evaluation attributes is referred to herein as a “pre-set”, where a pre-set corresponds to a specific target anatomical feature.
- the operator's proficiency in acquiring images of a uterus may be scored in accordance with a uterus pre-set, whereby the operator's proficiency score for the uterus pre-set is a function of the operator's individual proficiency scores on each one of the set of qualitative and quantitative parameters comprised by the uterus pre-set.
- the operator's proficiency in acquiring images of a liver may be scored in accordance with a liver pre-set, whereby the operator's proficiency score for the liver pre-set is a function of the operator's individual proficiency scores on each one of the set of qualitative and quantitative parameters comprised by the liver pre-set.
- Pre-sets are described in further detail below, in relation to an example pre-set illustrated in FIG. 2 .
- the data processing unit 130 may receive as input images generated by the ultrasound probe 106 , scan settings information received from the user interface 115 , and/or historical information relating to the operator's use of the ultrasound imaging system 100 .
- the historical information may be stored in a proficiency database 132 that is electronically coupled to the processor 116 .
- the proficiency database 132 may store the operator's current and/or previous proficiency scores on one or more pre-sets, the number of ultrasound examinations the operator has carried out for each pre-set, the amount of time spent by the operator performing examinations within and across pre-sets, etc.
- the output of the data processing unit 130 may be a new proficiency score and/or an updated proficiency score for the operator.
- the proficiency score may be used to determine whether to display user guidance and/or specify what type of user guidance may be displayed.
- the evaluation attributes corresponding to the proficiency parameters represent criteria used for proficiency assessments.
- the evaluation attributes may include an accuracy of a scan plane.
- scan plane accuracy may be assessed based on a comparison of the images being acquired with one or more reference images by an AI algorithm.
- the metrics for scan plane accuracy may take into account whether a “standard cross-sectional view” of an organ was achieved, such as a long axis view, a short axis view, or a four-chamber view of the whole heart, or a cross section of the whole kidney through a longitudinal plane or a transverse plane. Such standard views are widely used in practice for visualization, measurements, and reporting, and achieving them may determine the accuracy of measurements and the quality of visualization.
- the AI algorithm in conjunction with probe marker orientation information may be used to detect any erroneous Left/Right inversion of the image displayed.
- the AI algorithm may provide an assessment of scan plane accuracy by determining whether a number of anatomical features are present in the image, and/or whether the anatomical features in an image exceed a threshold visibility (e.g., resolution).
- the AI algorithm may also evaluate one or more anatomical feature parameters in the image, including a shape of the anatomical features, in determining the scan plane accuracy.
- the AI algorithm may provide an assessment of scan plane accuracy by identifying anatomical features in an image and comparing distance ratios between the anatomical features with distance ratios between the same anatomical features of a reference image, or by comparing the relative scale of anatomical features in an image with the scale of same anatomical features of a reference image. It should be appreciated that the examples provided herein are for illustrative purposes, and other criteria and/or algorithms may be used by the data processing unit 130 for proficiency assessment without departing from the scope of this disclosure.
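One way to sketch the distance-ratio comparison mentioned above: pairwise distances between identified anatomical landmarks are reduced to ratios, which are scale-invariant, and compared against the same ratios in a reference image. The landmark names, the tolerance, and the scoring convention are all hypothetical illustrations, not the system's actual algorithm.

```python
import math

def _distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def scan_plane_accuracy(features, reference, tol=0.15):
    """Fraction of pairwise distance ratios between landmarks that agree with
    the reference image within a relative tolerance. Ratios are used so the
    comparison is independent of image scale."""
    names = sorted(set(features) & set(reference))
    pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]]
    if len(pairs) < 2:
        return 0.0
    base_img = _distance(features[pairs[0][0]], features[pairs[0][1]])
    base_ref = _distance(reference[pairs[0][0]], reference[pairs[0][1]])
    matches = 0
    for a, b in pairs[1:]:
        ratio_img = _distance(features[a], features[b]) / base_img
        ratio_ref = _distance(reference[a], reference[b]) / base_ref
        if abs(ratio_img - ratio_ref) <= tol * ratio_ref:
            matches += 1
    return matches / (len(pairs) - 1)

# A detected plane that is a scaled copy of the reference scores perfectly.
ref = {"portal_vein": (0.0, 0.0), "gallbladder": (4.0, 0.0), "ivc": (0.0, 3.0)}
img = {"portal_vein": (0.0, 0.0), "gallbladder": (8.0, 0.0), "ivc": (0.0, 6.0)}
accuracy = scan_plane_accuracy(img, ref)
```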
- the evaluation attributes may include image quality.
- the quality of an image may include evaluating criteria such as image contrast, contrast-to-noise ratio, spatial resolution, grayscale mapping, etc. Additionally or alternatively, image quality may be evaluated based on comparison with one or more reference images in a corresponding scan plane.
- other image quality attributes include the adequacy of acoustic coupling (with gel) and contact pressure at the probe tip, as well as any image artifacts due to air pockets, foreign objects, shadows of bony structures, motion artifacts in moving organs (as in the case of the heart, fetus, lungs, etc.), or artifacts due to bowel gases, all of which define the imaging outcomes and in turn the user's proficiency. In this way, opportunities for false-positive or false-negative reporting as a result of poor image acquisition may be reduced.
- the evaluation attributes may include speed of acquisition.
- the data processing unit may compare a duration of time taken by the operator in acquiring an image of a desired quality with a pre-defined reference duration indicative of a proficiency level, and assign the operator a proficiency score as a function of the difference between the operator's acquisition speed and the reference acquisition speed.
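The speed-of-acquisition scoring described above — a score as a function of the difference between the operator's time and a reference duration — might be sketched as follows. The linear falloff and its scale (zero once the operator takes twice the reference time) are assumptions; the text only requires that the score be some function of the difference.

```python
def speed_score(acquired_s, reference_s):
    """Score acquisition speed against a reference duration indicative of a
    proficiency level: full score at or under the reference, decaying linearly
    to zero once the operator takes twice the reference time (the linear decay
    and its scale are assumptions, not the patented method)."""
    excess = max(0.0, acquired_s - reference_s) / reference_s
    return max(0.0, 1.0 - excess)

fast = speed_score(100, 120)   # under the reference duration
slow = speed_score(180, 120)   # 50% over the reference duration
```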
- the evaluation attributes may include accumulated information, such as cumulative hours of practice, number of exams performed on a given pre-set, an assessment of repeatability and/or reproducibility, etc.
- repeatability and reproducibility may be assessed by determining a relative positioning of anatomical features in an image, and comparing a first standard deviation of the relative positioning of the anatomical features in an acquired image with a second standard deviation of the relative positioning of the anatomical features in images previously acquired by the operator under the same conditions (e.g., repeatable during the same examination, with the same subject, etc.) and under different conditions (e.g., reproducible from a different examination, with a different subject, etc.), respectively.
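A minimal sketch of the repeatability/reproducibility comparison described above, assuming the relative-position measurements have already been extracted from the images; the helper name, tolerance factor, and boolean outcome are assumptions.

```python
import statistics

def spread_within_tolerance(current_positions, historical_positions, tol=1.25):
    """Compare the spread (sample standard deviation) of a relative-position
    measure in the current exam with the operator's historical spread for the
    same measure. The operator is judged repeatable (or reproducible, if the
    historical exams used different subjects/conditions) when the current
    spread is no worse than tol times the historical one."""
    current = statistics.stdev(current_positions)
    historical = statistics.stdev(historical_positions)
    return current <= tol * historical
```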
- the pre-defined rubric may progressively yield updated proficiency scores of an operator, which may be used to dynamically determine whether user guidance for a specified pre-set may or may not be displayed to the operator on the display device 118 during workflow. For example, the system may determine whether a current proficiency score of the operator for various pre-sets is lower than a threshold proficiency value in order to selectively enable displaying user guidance.
- User guidance for a specified pre-set may also be displayed if the operator is taking an amount of time to obtain a desired quality image that exceeds a threshold duration. For example, if an operator is attempting to acquire ultrasound images of a heart and does not achieve a target scan plane (e.g., a scan plane in which a specific set of anatomical features are visible) within a pre-defined threshold duration for a heart pre-set, the ultrasound system 100 may display one or more forms of user guidance on the display device 118 to aid the operator in achieving the target scan plane.
- a duration for waiting prior to displaying the graphical user guidance may be predefined (e.g., programmed into instructions in a memory such as memory 120 ) or adjusted dynamically based on one or more criteria. For example, a waiting period prior to displaying user guidance may be predefined at 60 seconds by default, and/or the duration may be reduced to 30 seconds if the operator is assigned a proficiency score below a given threshold proficiency score. There may also be an additional manual override to control the ON/OFF state of the user guidance such that an operator may opt to hide the user guidance, for example, if the operator determines that the user guidance is not helpful.
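The guidance-display logic described above — a proficiency threshold, a default waiting period reduced for low-proficiency operators, and a manual ON/OFF override — can be sketched as follows. All parameter names, the default values (60 s / 30 s / threshold 0.5), and the exact combination of conditions are assumptions drawn from the examples in the text.

```python
def guidance_wait_s(proficiency, threshold=0.5, default_s=60, reduced_s=30):
    """Waiting period before guidance appears: 60 s by default, reduced to
    30 s for operators scoring below the threshold (per the example above)."""
    return reduced_s if proficiency < threshold else default_s

def should_show_guidance(proficiency, elapsed_s, target_acquired,
                         manual_off=False, threshold=0.5):
    """Decide whether to display user guidance for the current pre-set."""
    if manual_off or target_acquired:
        return False            # operator opted out, or no help is needed
    if proficiency < threshold:
        return True             # low proficiency enables guidance directly
    # Proficient operators still get guidance if acquisition is taking too long.
    return elapsed_s > guidance_wait_s(proficiency, threshold)
```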
- the user guidance may be contextual guidance in the form of reference images and/or real-time guidance, in the form of cues, textual instructions or display elements that indicate how a position, orientation, or pressure of the ultrasound probe 106 may be adjusted to acquire images of a desired quality.
- the contextual guidance may include a reference ultrasound image, which may be a high-quality ultrasound image pre-acquired by an experienced sonologist or radiologist from an earlier patient.
- the reference ultrasound image may be acquired from a person being currently examined based on an AI algorithm running in real-time.
- an AI algorithm may determine that one of the images acquired is in the target scan plane and display that image on the display device 118 as a reference image, or alternatively display a “target acquired” indication.
- contextual guidance may include an anatomical illustration, pre-recorded training video, or a video of a remote assistant connected via a video conference, and real-time guidance may include display elements generated by the remote assistant connected through a video call.
- the guidance cues may include prompts, check-lists of protocol steps, and/or any other kind of graphical display elements.
- the user guidance may be displayed in conjunction with one or more peripheral devices such as, for example, an external camera to support eye-tracking capability.
- customized guidance may be generated that includes the display of graphical, visual, or textual elements, concurrent with the visual images being acquired, for operators with varied capabilities and/or different levels of experience, such that guidance may be provided in an adaptive manner in accordance with the operator's progressively evolving proficiency without relying on a consciously managed configuration selected by the operator.
- This addresses the problem of matching guidance to a proficiency level, that is, the problem of directing guidance to operators of different proficiency levels such that expert operators are not distracted by user guidance that is not helpful, while novice users are not deprived of helpful guidance.
- a further advantage of the ultrasound system disclosed herein is that it does not rely on additional hardware, and the costs associated with enabling the disclosed features are minimal, thereby enhancing techno-commercial feasibility and increasing the adoption of ultrasound imaging among non-traditional users including urologists, gynaecologists, and/or other clinicians.
- Pre-set chart view 200 of FIG. 2 A shows a general rubric for determining a proficiency score of a user for any anatomical region or structure.
- Pre-set chart view 230 of FIG. 2 B shows an example rubric for determining a proficiency score of a user for a liver pre-set.
- Pre-set chart view 260 of FIG. 2 C shows example weightings, level scores, user scores, and proficiency score of a user for a liver pre-set.
- the ultrasound imaging system may be the same as, or similar to, the ultrasound imaging system 100 of FIG. 1 .
- pre-set chart views 200 , 230 , and 260 illustrate the features of a pre-set and an underlying rubric for assessing proficiency.
- the data shown in each of the pre-set chart views 200 , 230 , and 260 may not be stored in a single table format, and may be stored in a relational database in accordance with a different data structure and/or across various database tables in relation to an operator of the ultrasound imaging system.
- the relational database may be the same as or similar to the proficiency database 132 of the ultrasound imaging system 100 of FIG. 1 .
- pre-set chart view 200 is shown, which illustrates a rubric or framework for determining a proficiency score of a user of the ultrasound imaging system.
- the pre-set chart 200 may be categorized according to an anatomical region, as indicated by the anatomical region label 202 .
- the pre-set chart 200 may relate to ultrasound examinations performed in the anatomical region of the abdomen, chest, etc.
- the pre-set chart 200 may further relate to a specific anatomical target, as indicated by the anatomical target label 204 .
- the pre-set chart 200 may relate to ultrasound examinations performed on a liver, spleen, heart, etc.
- the pre-set chart 200 may include one or more proficiency parameters 206 , shown in the left-hand column of the pre-set chart 200 in FIG. 2 A , which may represent different components into which a proficiency of the operator of the ultrasound system may be individually assessed.
- the proficiency parameters 206 may include image suitability criteria, such as image quality and/or scan plane accuracy.
- the proficiency parameters 206 may include a speed of the operator in acquiring ultrasound images of a desirable quality.
- the proficiency parameters 206 may also include historical information, such as a number of practice hours completed, a number of ultrasound exams completed, and/or an assessment of the reproducibility and/or repeatability of an ultrasound procedure by the operator. For example, a high reproducibility score may be assigned to an operator who performs a plurality of ultrasound exams, where a standard deviation between level scores associated with parameters such as image quality, scan plane accuracy, and acquisition speed for each ultrasound exam are below a threshold value.
- a user score 210 may be generated that reflects the proficiency of the user on the proficiency parameter.
- the user score 210 may be a summation, for a corresponding proficiency parameter 206 , of one or more level scores 208 , where each of the one or more level scores 208 assesses the proficiency of the operator within the relevant level.
- an operator may be assigned a level score 208 for each of level 1, level 2, and level 3 in relation to a given proficiency parameter 206 .
- the level scores may be summed to generate a user score 210 for the proficiency parameter 206 .
- the user score 210 may be assigned as a different function of the operator's level scores 208 .
- the user score 210 may be an average of the operator's level scores 208 . Further, the determination of the user score 210 may be based on a weighting of the level scores 208 , where a summation, average, etc. of the level scores is multiplied by a weight 212 associated with the relevant proficiency parameter.
- the proficiency parameter image quality may have a weight 212 of 0.9 associated with it, where an average of the level scores is multiplied by 0.9 to generate a weighted user score 210 for the proficiency parameter image quality.
- the weight 212 associated with each proficiency parameter 206 may be different, whereby the weight 212 may reflect a relative importance of the proficiency parameter 206 in relation to other proficiency parameters 206 .
- a level score for level 1 may be weighted more than a level score for level 2, or a level score for level 3 may be weighted more than a level score for level 2 for the proficiency parameter of image quality, while a level score for level 3 may be weighted less than a level score for level 2 for the proficiency parameter of reproducibility, etc.
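The user-score computation described above — level scores 208 combined, optionally with per-level weights, then scaled by the parameter weight 212 — can be sketched as follows. Averaging is only one of the variants the text allows (summation is another), and the example values are illustrative.

```python
def user_score(level_scores, parameter_weight, level_weights=None):
    """Combine per-level scores into a user score 210, scaled by the parameter
    weight 212. Unequal level weights model some levels being bigger or
    smaller milestones than others, as described above."""
    if level_weights is None:
        level_weights = [1.0] * len(level_scores)
    weighted = sum(s * w for s, w in zip(level_scores, level_weights))
    return parameter_weight * weighted / sum(level_weights)

# Image quality: pass levels 1 and 2, fail level 3, parameter weight 0.9.
iq_score = user_score([1.0, 1.0, 0.0], 0.9)
```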
- the user scores 210 may be further combined to generate a proficiency score 216 that reflects a proficiency of the operator on a preset.
- an operator may be assigned a first proficiency score for a liver preset that reflects the proficiency of the operator in acquiring images of a liver, and the operator may be assigned a second proficiency score for a kidney preset that reflects the proficiency of the operator in acquiring images of a kidney.
- a plurality of proficiency scores may be generated for the operator that reflect an overall proficiency of the operator across a plurality of anatomical regions and a plurality of anatomical targets.
- an overall proficiency score may be determined as a function of the plurality of proficiency scores generated for the operator.
- the proficiency score 216 may be a number between 0 and 1 (e.g., 0.9). In other examples, the proficiency score 216 may be a number between 0 and 100 (e.g., 85), or the proficiency score 216 may be a number that is unbounded.
- the level scores 208 may be based on quantitative and/or qualitative criteria, depending on the parameter.
- a level score 208 for acquisition speed may be based on quantitative criteria (e.g., a duration) and not on qualitative criteria.
- a level score 208 for image quality may be based on either qualitative or quantitative criteria, or a combination of the two.
- Qualitative criteria for image quality may include, for example, whether an entire structure was imaged, whether the user has frozen on an optimal image of a sequence of images, a suitability of a scan plane for viewing an organ, whether any features are missing from an acquired image, and so forth.
- Quantitative criteria for image quality may include, for example, how many nodules, tumor sites, etc., are captured in an acquired image.
- each of the level scores 208 may provide an assessment of the operator's proficiency in acquiring ultrasound images of a different level of difficulty. Under each level and for each proficiency parameter 206 , a qualitative and/or quantitative criteria may be specified as to what is expected of the operator to get to a particular level for a given anatomical region and anatomical structure within the anatomical region. Based on whether or not the criteria are satisfied, the user score 210 may be computed. In one example, the level scores 208 are binary, reflecting either a passing score for a level (e.g., 1.0) or a failing score for a level (e.g., a 0.0).
- an operator may receive a level score 208 of 1.0 for level 1 for the proficiency parameter image quality, indicating that the operator has met one or more image quality criteria associated with a level 1 operator.
- the operator may receive a level score 208 of 0.0 for level 2 for the proficiency parameter image quality, indicating that the operator has not met one or more image quality criteria associated with a level 2 operator.
- a user score 210 may then be generated for the operator, where the user score is the result of multiplying the level score of 1.0 for level 1 by a weight 212 associated with the proficiency parameter image quality.
- An example for an anatomical target (e.g., a liver) is described below in reference to FIGS. 2 B and 2 C .
- the level score 208 associated with the proficiency parameter 206 may be a different type of score than a type of score used for other proficiency parameters 206 (e.g., qualitative proficiency parameters).
- a level score for the proficiency parameter of number of practice hours may be the actual number of hours completed
- a level score for the proficiency parameter of image quality may be a score from 1 to 10, where 1 is the lowest score and 10 is the highest score.
- the user score 210 may not be the same type of score as a level score for the same proficiency parameter 206 .
- a user score for the proficiency parameter of practice hours may be a score generated as a function of one or more level scores indicating a number of hours completed. For example, if the user scores for each proficiency parameter 206 are values between 1 and 10, with 10 representing high proficiency and 1 representing low proficiency, an operator who has completed a total of 60 hours of practice across all levels of a given pre-set may be assigned a user score of 8 (e.g., a high score reflecting high proficiency) based on a pre-established mapping function, while an operator who has completed a total of 5 hours of practice across all levels of the pre-set may be assigned a user score of 1 (e.g., a low score reflecting low proficiency).
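The mapping function mentioned above, from cumulative practice hours to a 1-10 user score, might look like the following step function. The intermediate breakpoints are assumptions, chosen only to reproduce the two examples in the text (60 hours maps to 8, 5 hours maps to 1).

```python
def practice_hours_score(hours):
    """Map cumulative practice hours to a 1-10 user score via a step function.
    Breakpoints are hypothetical, anchored to the text's two examples."""
    breakpoints = [(5, 1), (10, 2), (20, 4), (40, 6), (60, 8), (100, 10)]
    score = 1
    for threshold, mapped in breakpoints:
        if hours >= threshold:
            score = mapped
    return score
```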
- the scores of the pre-set chart 200 may be updated each time an operator performs an ultrasound exam with the ultrasound imaging system. For example, as the number of ultrasound exams that an operator performs increases, the proficiency parameters of number of exams and/or number of practice hours may be incremented.
- the operator's level scores 208 and/or user scores 210 may be adjusted to reflect changes in the operator's proficiency over time. For example, an operator who has achieved a passing level score 208 for Level 1 corresponding to a proficiency parameter of scan plane accuracy may be assigned a new passing level score 208 for Level 2, reflecting that the operator has developed an increased proficiency in acquiring images within a target scan plane.
- a level score 208 of the operator corresponding to a proficiency parameter of image quality may not increase as the number of practice hours increases, reflecting that the operator's increased proficiency in adjusting the ultrasound probe to acquire images within a target scan plane does not translate into an increased ability to generate high quality images (e.g., because different skills may be involved).
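The per-exam bookkeeping described above — incrementing exam counts and practice hours while per-parameter level scores advance independently (scan plane accuracy may rise while image quality stays put) — can be sketched as follows; the record layout and function name are assumptions.

```python
def update_after_exam(record, exam_minutes, passed_levels):
    """Update an operator's stored proficiency record after one ultrasound
    exam: increment the exam count and cumulative practice hours, and raise
    any newly passed per-parameter levels. Levels never drop, and each
    proficiency parameter advances on its own."""
    record["num_exams"] += 1
    record["practice_hours"] += exam_minutes / 60.0
    for parameter, level in passed_levels.items():
        record["levels"][parameter] = max(record["levels"].get(parameter, 0), level)
    return record

record = {"num_exams": 4, "practice_hours": 9.5, "levels": {"scan_plane_accuracy": 1}}
record = update_after_exam(record, 30, {"scan_plane_accuracy": 2, "image_quality": 1})
```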
- the pre-set chart 200 may also be updated periodically to reflect changes in the underlying proficiency rubric. For example, additional proficiency parameters 206 may be added, removed, combined, or split into two or more new proficiency parameters 206 .
- liver pre-set chart view 230 is shown, where the liver pre-set chart view 230 is an example of the pre-set chart view 200 of FIG. 2 A specific to an anatomical target of a liver, as shown in the anatomical region label 202 and the anatomical target label 204 .
- example criteria specific to a liver preset are provided for each proficiency parameter 206 .
- an example criterion for passing level 1 is that the operator is able to visualize a structure of a target anatomical feature
- an example criterion for passing level 5 is that the operator is able to visualize a structure of a target anatomical feature, and that the operator is able to visualize an entire organ of the target anatomical feature within a duration of less than 1 minute.
- Criteria for levels 2, 3, and 4 may include one or more intermediate quality thresholds, where if the operator acquires images above the one or more intermediate quality thresholds, then the operator passes the relevant level.
- example criteria for the proficiency parameter of acquisition speed may include a threshold duration for each level.
- the operator may be assigned a passing score for level 1 if the operator is able to acquire a suitable image within 3 minutes
- the operator may be assigned a passing score for level 2 if the operator is able to acquire a suitable image within 2.5 minutes
- the operator may be assigned a passing score for level 3 if the operator is able to acquire a suitable image within 2 minutes, and so on.
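The per-level speed thresholds above can be sketched as a simple lookup that returns binary level scores in the convention described for FIG. 2 A. The thresholds for levels 4 and 5 are extrapolated assumptions; the text only specifies 3, 2.5, and 2 minutes for levels 1-3.

```python
def speed_levels_passed(duration_minutes):
    """Binary level scores (1.0 = pass) for acquisition speed, one per level.
    Levels 1-3 use the thresholds from the text; levels 4-5 are assumed."""
    thresholds = [3.0, 2.5, 2.0, 1.5, 1.0]   # minutes allowed for levels 1-5
    return [1.0 if duration_minutes <= t else 0.0 for t in thresholds]
```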
- Example criteria for the proficiency parameter of scan plane accuracy may include different qualitative and/or quantitative criteria for levels 1-5. For example, the operator may be assigned a passing score for level 1 if the operator is able to identify a target scan plane, and the operator may be assigned a passing score for level 2 if the operator is able to identify a target scan plane and adjust an ultrasound probe accordingly, while passing scores for levels 3, 4, and 5 may be assigned based on a duration taken by the operator to acquire a scan plane accurately. Criteria for levels 1-5 for the proficiency parameters of number of practice hours and number of exams may include achieving a threshold number of practice hours and achieving a threshold number of exams.
- the operator may be assigned a level score of 1.0 for level 1 for the proficiency parameter of number of practice hours if the operator achieves more than four practice hours, the operator may be assigned a level score of 1.0 for level 2 for the proficiency parameter of number of practice hours if the operator achieves more than six practice hours, and so on.
- the operator may be assigned a level score based on a duration taken to acquire an image reproducibly (e.g., images of the same target anatomical feature from different subjects) or a duration taken to acquire an image repeatedly (e.g., images of the same target anatomical feature from the same subject under the same conditions).
- a weight 212 may be associated with each proficiency parameter 206 .
- the proficiency parameter of image quality may have a weight of 1.0
- the proficiency parameter of acquisition speed may have a weight of 0.7
- the proficiency parameter of scan plane accuracy may have a weight of 0.9
- the proficiency parameter of number of practice hours may have a weight of 0.05
- the proficiency parameter of number of exams may have a weight of 0.6
- the proficiency parameter of reproducibility may have a weight of 0.3
- the proficiency parameter of repeatability may have a weight of 0.3.
- the weights of the proficiency parameters may indicate a relative importance of a proficiency parameter with respect to other proficiency parameters.
- the weight 1.0 assigned to the proficiency parameter of image quality may indicate that image quality is of greater relative importance in generating a proficiency score 216 than a number of practice hours, which is assigned a weight of 0.05 (e.g., the number of practice hours achieved by the operator may be less indicative of a proficiency of the operator than an ability of the operator to achieve an image of a desired quality).
- level 1 may be considered more important (e.g., a bigger milestone towards achieving a high proficiency) than levels 2, 3, and 4, which may represent smaller milestones towards achieving a high proficiency.
- the proficiency gap may not be “even” between any two successive levels and therefore a differential priority may be considered by using the bold lines, as indicated above.
- liver pre-set chart view 260 is shown, where the liver pre-set chart view 260 is an example of the pre-set chart view 200 of FIG. 2 A specific to an anatomical target of a liver.
- example weights 212 , level scores 208 , user scores 210 , and an example proficiency score 216 are shown, where the example user scores 210 are calculated as a weighted summation of the level scores 208 , and the proficiency score 216 is calculated based on a summation of the user scores 210 .
- the level scores 208 shown under the columns of Level 1-Level 5 may correspond to the qualitative and quantitative criteria for each proficiency parameter 206 and each level.
- a level score of 1.0 under Level 1 for the proficiency parameter image quality may indicate that an operator has fulfilled the criterion of being able to visualize a target structure shown in the example liver pre-set chart view 230 of FIG. 2 B under Level 1 for the proficiency parameter image quality.
- a level score of 1.0 under Level 1 for the proficiency parameter acquisition speed may indicate that the operator has fulfilled the criterion of acquiring a suitable image within 3 minutes shown in the example liver pre-set chart view 230 of FIG. 2 B under Level 1 for the proficiency parameter acquisition speed;
- a level score of 1.0 under Level 1 for the proficiency parameter scan plane accuracy may indicate that the operator has fulfilled the criterion of identifying a target scan plane shown in the example liver pre-set chart view 230 of FIG. 2 B under Level 1 for the proficiency parameter scan plane accuracy.
- In the liver pre-set chart view 260 , the operator has level scores 208 of 1.0 for all proficiency parameters 206 under Level 1 and Level 2, indicating that the operator has passed Level 2 based on the criteria shown in example liver pre-set chart view 230 .
- Pre-set chart view 260 further shows that the operator has a level score of 1.0 under Level 3 for the proficiency parameter of image quality, a level score of 1.0 under Level 3 for the proficiency parameter of scan plane accuracy, and a level score of 1.0 under Level 3 for the proficiency parameter of repeatability.
- the operator has achieved an image quality that meets a desired threshold quality associated with Level 3, a scan plane accuracy that meets a desired threshold accuracy associated with Level 3, and a repeatability that meets a desired threshold repeatability associated with Level 3.
- the operator has a level score of 0.0 under Level 3 for the proficiency parameters of acquisition speed, number of practice hours, number of exams, and reproducibility, indicating that the operator has not yet met one or more criteria for those proficiency parameters under Level 3. For example, it may be concluded that the operator has not practiced enough and/or been exposed to a sufficient number of exams to achieve a desired acquisition speed and reproducibility to pass Level 3, while the operator may have a high degree of natural ability in acquiring images.
- a user score 210 may be calculated for the operator as a function of the level scores 208 and the weights 212 .
- the user scores 210 are calculated as a weighted summation of the level scores 208 for each proficiency parameter.
- the operator may be assigned a user score of 3.0 for the proficiency parameter image quality, reflecting a summation of the level score of 1.0 under level 1, the level score of 1.0 under level 2, the level score of 1.0 under level 3, the level score of 0.0 under level 4, and the level score of 0.0 under level 5, which is then multiplied by the weight 1.0 associated with the proficiency parameter image quality.
- the operator may be assigned a user score of 1.8 for the proficiency parameter acquisition speed, reflecting a summation of the level score of 1.0 under level 1, the level score of 1.0 under level 2, the level score of 0.0 under level 3, the level score of 0.0 under level 4, and the level score of 0.0 under level 5, which is then multiplied by the weight 0.9 associated with the proficiency parameter acquisition speed.
- User scores for the remaining proficiency parameters shown in FIG. 2 C are calculated in the same fashion, yielding user scores for each proficiency parameter as shown in the user score column of FIG. 2 C .
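The weighted-summation calculation described above can be expressed compactly. A minimal sketch, assuming the level scores and weights from the example; the function name is illustrative.

```python
def user_score(level_scores, weight):
    """User score 210: weight times the sum of per-level scores (1.0/0.0)."""
    return weight * sum(level_scores)

# Image quality: levels 1-3 passed, weight 1.0 -> user score 3.0
iq = user_score([1.0, 1.0, 1.0, 0.0, 0.0], 1.0)
# Acquisition speed: levels 1-2 passed, weight 0.9 -> user score 1.8
acq = user_score([1.0, 1.0, 0.0, 0.0, 0.0], 0.9)
```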
- a proficiency score 216 may be calculated based on the user scores 210 , which may reflect an overall proficiency of the operator across a plurality of proficiency parameters.
- the proficiency score 216 may be calculated as a summation of the user scores for the proficiency parameters image quality, acquisition speed, scan plane accuracy, number of practice hours, number of exams, reproducibility, and repeatability.
- the proficiency score 216 may be calculated as an average of the user scores for the proficiency parameters image quality, acquisition speed, scan plane accuracy, number of practice hours, number of exams, reproducibility, and repeatability, or a different function of the user scores for the proficiency parameters image quality, acquisition speed, scan plane accuracy, number of practice hours, number of exams, reproducibility, and repeatability.
- the proficiency score 216 may be calculated as a function of the user scores for a smaller number of proficiency parameters.
- the proficiency score 216 of an expert operator may be calculated as a function of the user scores for the proficiency parameters image quality, acquisition speed, scan plane accuracy, reproducibility, and repeatability, but not for the proficiency parameters number of practice hours or number of exams. It should be appreciated that the examples provided herein are for illustrative purposes, and other functions and/or methods for calculating the user score 210 or the proficiency score 216 may be included without departing from the scope of this disclosure.
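The aggregation alternatives above (summation, average, or a function over a smaller set of parameters) could be sketched as follows. The function name and aggregation keyword are illustrative, and the user-score values are invented for the example rather than taken from the figures.

```python
def proficiency_score(user_scores, method="sum"):
    """Proficiency score 216 as a summation or average of user scores 210."""
    values = list(user_scores.values())
    if method == "sum":
        return sum(values)
    if method == "mean":
        return sum(values) / len(values)
    raise ValueError(f"unknown aggregation method: {method}")

# Illustrative user scores for the seven proficiency parameters.
example = {
    "image quality": 3.0, "acquisition speed": 1.8,
    "scan plane accuracy": 2.7, "number of practice hours": 0.1,
    "number of exams": 1.2, "reproducibility": 0.6, "repeatability": 0.9,
}
total = proficiency_score(example)            # summation variant
average = proficiency_score(example, "mean")  # average variant

# For an expert operator, practice hours and exam counts may be excluded:
expert = {k: v for k, v in example.items()
          if k not in ("number of practice hours", "number of exams")}
expert_total = proficiency_score(expert)
```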
- a plurality of pre-set charts 200 may be associated with a single user of an ultrasound imaging system, where each of the plurality of pre-set charts 200 represents a rubric for assigning a proficiency score to the operator in reference to the operator's performance of an ultrasound exam for the anatomical target associated with the pre-set chart 200 .
- an operator's total experience in performing ultrasound examinations on a given pre-set (e.g., a liver pre-set) may be measured by the proficiency score assigned on the corresponding pre-set chart 200 , while an operator's overall experience in performing ultrasound examinations across all available pre-sets may be measured by assigning an overall proficiency score to the operator, where the overall proficiency score may be a function of the operator's pre-set proficiency scores on each of the plurality of pre-set charts 200 .
- a collection of pre-sets represents a structure or framework for capturing and tracking the evolution of the ultrasound operator's proficiency as the operator gains experience over time, broken down by target anatomical feature, with respect to various criteria.
- This framework provides a basis for automated guidance to be provided to an ultrasound operator in accordance with the operator's abilities in order to facilitate an increase in the operator's proficiency in acquiring ultrasound images.
- the proficiency scores captured in the pre-sets may be leveraged such that the type, style, and/or amount of automated guidance may be adjusted based on the operator's proficiency scores.
- a processor of the ultrasound imaging system may request from a database (e.g., the proficiency database 132 of the ultrasound imaging system 100 of FIG. 1 ) a proficiency score of the operator in accordance with a liver pre-set.
- the operator's proficiency score on the liver pre-set reflects the operator's proficiency in administering ultrasound exams of livers, based on the operator's experience and skill level as described above.
- the processor may display visual guidance cues on a display device of the ultrasound system (e.g., the display device 118 of the ultrasound system 100 of FIG. 1 ) to aid the operator in achieving the right scan plane for acquiring images of relevant features of the patient's liver.
- the visual guidance cues may include, by way of illustration, one or more display elements (e.g., an arrow, a symbol, etc.) indicating a direction of and/or amount of pressure to apply to the probe and/or a direction to which to adjust a position of the probe, textual instructions to adjust one or more scan settings of the ultrasound imaging system, etc.
- the guidance cues may be displayed in conjunction with a pre-established reference ultrasound image of a liver in the target scan plane, which the operator may compare to the images being acquired, or a textbook illustration of a liver with annotated information, or a 3D model of a liver, or reference images of a different kind. It should be appreciated that the examples provided herein are for illustrative purposes and any type of guidance cues may be included without departing from the scope of this disclosure.
- the user guidance may include contextual guidance in the form of reference images, such as pre-acquired reference ultrasound images, anatomical illustrations, etc.
- Contextual user guidance may also include high-quality ultrasound images pre-acquired by an experienced sonologist/radiologist from an earlier patient, or images generated by a remote assistant connected through a video call.
- the user guidance may also include real-time guidance, in the form of cues for probe placement and/or orientation (e.g., graphical display elements such as arrows, lines, indicators, etc.), textual instructions, prompts or check-lists of protocol steps, and so forth.
- the real-time guidance and/or the contextual guidance may be displayed in conjunction with or superimposed upon images acquired in real time by the ultrasound probe.
- the real-time guidance may also be displayed in conjunction with or superimposed upon the contextual guidance (e.g., reference images), which in turn may be displayed in conjunction with or superimposed upon images acquired in real time by the ultrasound probe.
- different forms of contextual and real-time user guidance may be combined and displayed dynamically in conjunction with the images being acquired by the ultrasound probe, in accordance with one or more AI algorithms.
- the processor may also leverage an operator's proficiency scores to determine a duration for which to display user guidance.
- the processor may display user guidance to an operator with a low proficiency score in acquisition speed on a liver pre-set earlier (e.g., after a shorter waiting period) than an operator with a high proficiency score in acquisition speed on a liver pre-set.
- the processor may stop displaying guidance to an operator with a low proficiency score in acquisition speed on a liver pre-set later than an operator with a high proficiency score in acquisition speed on a liver pre-set.
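One way the waiting period could scale with proficiency is a simple linear interpolation. This is a hypothetical sketch: the function name, score range, and 5- and 60-second bounds are assumptions, not values from the disclosure.

```python
def guidance_wait_seconds(acq_speed_score, max_score=5.0,
                          min_wait=5.0, max_wait=60.0):
    """Lower acquisition-speed proficiency -> guidance appears sooner."""
    frac = max(0.0, min(acq_speed_score / max_score, 1.0))
    return min_wait + frac * (max_wait - min_wait)

# A low-proficiency operator sees guidance after the minimum wait,
# while a high-proficiency operator is given more time to work unaided.
```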
- the displaying and timing of user guidance by the processor is described in more detail below in reference to FIGS. 3 A and 3 B .
- Referring now to FIG. 3 A , a flowchart illustrates an example method 300 for displaying user guidance to a user on an ultrasound display device, based on a proficiency score of the user and/or a selection of the user.
- Method 300 is described with regard to the systems and components of FIG. 1 , though it should be appreciated that the method 300 may be implemented with other systems and components without departing from the scope of the present disclosure.
- Method 300 may be carried out by a processor, such as the processor 116 of the ultrasound imaging system 100 of FIG. 1 , in accordance with instructions stored in non-transitory memory of a computing device, such as memory 120 of ultrasound imaging system 100 of FIG. 1 .
- an ultrasound operator logs into an ultrasound system using login credentials, and method 300 includes initiating operation of the ultrasound system with the login credentials of the operator.
- method 300 includes prompting the operator to select a desired pre-set/anatomy of interest to begin scanning.
- the pre-set may be the same as, or similar to, the pre-set described by pre-set charts 200 , 230 , and 260 of FIG. 2 .
- the pre-set is loaded into a memory of the ultrasound imaging system (e.g., the memory 120 of FIG. 1 ), whereby the processor may access the operator's proficiency scores on the selected pre-set, and scanning is initiated by the operator.
- the ultrasound images may be acquired with an ultrasound probe (e.g., the ultrasound probe 106 of ultrasound imaging system 100 of FIG. 1 ) and displayed to an operator via a display device (e.g., the display device 118 of ultrasound imaging system 100 of FIG. 1 ).
- the images may be acquired and displayed in real time or near real time, and may be acquired with default or user-specified scan parameters (e.g., default depth, frequency, etc.).
- the ultrasound images may be acquired as part of an ultrasound exam where certain anatomical features are imaged in certain views/axes in order to diagnose a patient condition, measure aspects of the anatomical features, etc.
- one or more target scan planes (also referred to as views) of the heart of a patient may be imaged.
- the target scan planes may include a four-chamber view, a two-chamber view (which may also be referred to as a short axis view), and a long axis view (which may also be referred to as a PLAX view or three-chamber view).
- method 300 includes accessing the last saved proficiency score for the current pre-set from a database (e.g., the database 132 of ultrasound imaging system 100 of FIG. 1 ).
- the proficiency score may be displayed on the display device such that the proficiency score is visible to the operator. In other embodiments, the proficiency score may not be displayed, or may be displayed upon request by the operator.
- the proficiency score may be dynamically updated. For example, the operator's proficiency score on a pre-set may increase as the proficiency parameter of practice hours increases, or the operator's proficiency score on the pre-set may increase as a result of achieving a higher proficiency score than the operator's previous proficiency scores for the proficiency parameter of acquisition speed (e.g., representing faster acquisition speed), or another proficiency parameter.
- method 300 includes generating real-time user guidance based on real-time image acquisition via an AI algorithm.
- the AI algorithm may be running continuously whether the user guidance is turned ON or OFF.
- the AI algorithm may be utilized to monitor how a user is performing during the current exam to determine if and when the user guidance can be turned ON.
- adaptive user guidance based on user proficiency may be consciously disabled by the user via a set-up utility, in which case the AI algorithm that is used to generate a graphical interface of the real-time guidance may stop running until the user turns it ON manually.
- the real-time guidance feature may be enabled as a default option or a manually activated option.
- the AI algorithm that generates the graphical interface of the real-time guidance may be enabled prior to the user's login (that is, prior to step 302).
- the AI algorithm may be a rules-based expert system, in which different forms of user guidance (e.g., contextual guidance and/or real-time guidance cues) are generated as appropriate for the operator based on the application of one or more rules within a decision tree structure.
- the AI algorithm may be a machine learning algorithm.
- if a user's proficiency score (e.g., the proficiency score 216 of pre-set charts 200 , 230 , and 260 of FIG. 2 ) is below a first threshold value, the AI algorithm may generate user guidance intended for helping a novice ultrasound operator. If the user's proficiency score is above the first threshold value but below a second threshold value, the AI algorithm may generate user guidance intended for helping an intermediate ultrasound operator. If the user's proficiency score is above the second threshold value but below a third threshold value, the AI algorithm may generate user guidance intended for helping an advanced ultrasound operator, and so forth.
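The tiered thresholding described above amounts to a small decision ladder. A sketch under assumed threshold values; the values 3.0, 6.0, and 9.0 and the tier names are invented for illustration.

```python
def guidance_tier(proficiency_score, t1=3.0, t2=6.0, t3=9.0):
    """Map a proficiency score to a guidance tier via successive thresholds."""
    if proficiency_score < t1:
        return "novice"        # fullest guidance
    if proficiency_score < t2:
        return "intermediate"
    if proficiency_score < t3:
        return "advanced"      # minimal guidance
    return "expert"            # guidance off by default
```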
- user guidance may be displayed based on a parameter-specific proficiency score (e.g., the user score 210 of pre-set charts 200 , 230 , and 260 of FIG. 2 ). For example, if a user's score on the proficiency parameter of image quality is below a threshold quality (e.g., a target structure is not fully visualized, etc.), and the user's score on the proficiency parameter of number of practice hours is below a threshold number of hours, then the AI algorithm may generate user guidance intended for helping a novice ultrasound operator improve image quality.
- an AI algorithm may generate guidance cues intended to aid a more experienced ultrasound operator reduce acquisition time.
- an AI algorithm may generate, enable, and/or disable different forms of user guidance based on a combination of factors, including proficiency scores or scores associated with individual proficiency parameters (e.g., the user scores 210 of FIG. 2 ).
- method 300 includes determining whether a proficiency score of the user exceeds a minimum threshold score.
- the ultrasound system may be initialized to establish a baseline proficiency score of the user.
- the baseline proficiency score may be determined based on the user's credentials, including qualifications and prior experience with ultrasound imaging.
- the method may include determining if the proficiency score for the desired pre-set and/or the anatomy of interest is greater than the minimum threshold score for the desired pre-set and/or anatomy of interest.
- method 300 includes displaying the user guidance generated at 308 on a display device of the ultrasound system (e.g., the display device 118 of ultrasound imaging system 100 of FIG. 1 ), thereby ensuring that low proficiency users are provided guidance.
- Alternatively, if the proficiency score of the user exceeds the minimum threshold score, method 300 proceeds to 312 .
- method 300 includes prompting the user whether to receive guidance, and receiving a response. Proceeding to 314 , method 300 includes determining whether the user opts for guidance to be displayed on the device screen. If the user opts for guidance to be displayed on the display device at 314 , method 300 proceeds to 316 .
- method 300 includes displaying the user guidance generated at 308 on the display device. The user guidance may include real-time user guidance and/or contextual user guidance, for example. If the user does not opt for guidance to be displayed on the device screen at 314 , method 300 proceeds to 318 .
- method 300 includes prompting the user whether to exit the pre-set (e.g., at the completion of the examination).
- method 300 includes determining whether the user wishes to exit the current pre-set. If the user does not wish to exit the current pre-set at 320 , method 300 proceeds back to 308 , and user guidance continues to be generated dynamically based on real-time image acquisition via an AI algorithm as described above. Alternatively, if the user wishes to exit the current pre-set at 320 (e.g., once the ultrasound examination has been completed), method 300 proceeds to 322 .
- when the proficiency score for the user is greater than the threshold score, a lesser amount of user guidance may be provided; and when the proficiency score for the user is less than the threshold score, a greater amount of user guidance may be provided.
- the lesser amount of user guidance may include a lesser amount of real-time guidance or a lesser amount of contextual guidance or a lesser amount of real-time and contextual guidance.
- the greater amount of user guidance may include a greater amount of real-time guidance, or a greater amount of contextual guidance, or a greater amount of real-time and contextual guidance.
- the greater amount of user guidance may include a maximum amount of real-time and/or contextual guidance.
- more than one proficiency threshold may be applied.
- when the user's proficiency score is less than a first threshold, a greater amount of user guidance may be provided; when the user's proficiency score is greater than the first threshold but less than a second threshold, a lesser amount of user guidance may be provided; and when the user's proficiency score is greater than the second threshold, user guidance may not be automatically provided, but the user with the proficiency score greater than the second threshold may have the option to turn ON user guidance, and may be able to select the greater or lesser amount of user guidance.
- the user may have the option of increasing the amount of guidance.
- method 300 includes computing one or more updated proficiency scores of the user for the current pre-set, and saving the one or more updated proficiency scores to a database.
- the updated proficiency score may be a proficiency score calculated as a function of one or more scores corresponding to individual proficiency parameters (e.g., the proficiency score 216 , based on the user scores 210 of the example pre-sets 200 , 230 , and 260 of FIG. 2 ).
- the updated proficiency score may be a score assigned to a specific proficiency parameter (e.g., an updated user score 210 of FIG. 2 ), or one or more parameter-specific proficiency scores (e.g., user scores 210 ) and an overall proficiency score (e.g., proficiency score 216 ) may be updated.
- the updated proficiency score may be a moving average of a new proficiency score of the user on the examination with one or more historical user proficiency scores from previous ultrasound examinations performed by the user on the same pre-set.
- the updated proficiency score may be determined as a result of a different function of the user's proficiency score on the examination and one or more historical user proficiency scores from previous ultrasound examinations performed by the user on the same pre-set.
- the updated proficiency score may be saved to a database such as the proficiency database 132 of ultrasound imaging system 100 of FIG. 1 .
- the updated proficiency score may overwrite or replace a previous proficiency score, such that a single proficiency score is always stored that represents the current proficiency of the user.
- new proficiency scores that are generated may be stored in the database, whereby an updated proficiency score may be calculated dynamically, on demand, based on the user's individual proficiency scores collected over time for each examination performed.
- the new proficiency score and the updated proficiency score may both be recorded in the database. It should be appreciated that the determination and recording of the user's proficiency score(s) is described herein for illustrative purposes, and proficiency scores for the user may be determined and recorded in other ways without departing from the scope of this disclosure.
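The moving-average update mentioned above might look like the following sketch; the window size and function name are assumptions, not part of the disclosure.

```python
def updated_proficiency(history, new_score, window=5):
    """Moving average of the newest score with recent historical scores."""
    recent = (list(history) + [new_score])[-window:]
    return sum(recent) / len(recent)

# With no history the new score stands alone; otherwise examinations
# older than the window no longer influence the updated score.
```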
- method 300 proceeds to 324 .
- method 300 includes prompting the user whether they wish to end the session.
- method 300 includes determining from the user's response whether the user wishes to end the session. If the user wishes to end the session at 326 , method 300 proceeds to 328 .
- method 300 includes saving the current session data, and method 300 ends. Alternatively, if the user does not wish to end the session at 326 , method 300 proceeds back to 304 , and the user is prompted for a desired pre-set/anatomy of interest to reinitiate scanning, as described above.
- Referring now to FIG. 3 B , a flowchart illustrates an example method 330 for automatically generating user guidance, after a duration, to a user on a display device such as the display device 118 of ultrasound system 100 of FIG. 1 .
- method 330 may monitor the user's current imaging proficiency via one or more proficiency parameters (e.g., a current speed of acquisition to achieve a desired image having a desired image quality and/or achieving a desired scan plane for the current pre-set), and may automatically activate user guidance when one or more current imaging proficiency conditions (e.g., a current acquisition duration exceeding a threshold duration for acquiring a desired image, repeatedly not achieving a desired scan plane, etc.) are not satisfied.
- method 330 may further generate a first user guidance prompt indicating that the user is taking an excessive duration to acquire a desired image, and a duration, which may include a countdown timer, before the user guidance is automatically turned ON.
- Method 330 may be carried out by a processor, such as the processor 116 of the ultrasound imaging system 100 of FIG. 1 .
- a different proficiency parameter other than acquisition speed may be monitored to determine whether the user is struggling. For example, if a scan plane of an acquired image is not of a desired quality (e.g., if a difference between a scan plane and a target scan plane exceeds a threshold difference), user guidance may automatically be displayed to the user.
- Steps 332 to 340 of method 330 may be the same as or similar to steps 302 to 310 of method 300 .
- method 330 includes initiating operation of an ultrasound device with login credentials of the user.
- method 330 includes prompting the user for a desired pre-set/anatomy of interest to begin scanning and receiving a response back from the user.
- method 330 includes recalling and indicating the last saved proficiency score for the current pre-set from the database.
- method 330 includes generating user guidance based on real-time image acquisition via an AI algorithm.
- method 330 includes determining whether a proficiency score of the user exceeds a minimum threshold proficiency score.
- method 330 includes determining whether excessive time is being taken to acquire a suitable image (e.g., of a desired quality). For example, for a liver pre-set, a predetermined threshold duration (e.g., 60 seconds) may be established for acquiring an image of a liver above a threshold quality (for example, as described above in relation to FIGS. 2 A- 2 C ) and in a correct scan plane. If an amount of time taken by the user exceeds the predetermined threshold duration, it may be determined (e.g., by the processor) that excessive time is being taken to acquire an image. Alternatively, if the amount of time taken by the user does not exceed the predetermined threshold duration, it may be determined that excessive time is not being taken to acquire an image.
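The excessive-time determination at 342 could be implemented as a simple elapsed-time comparison. A hypothetical sketch: the function and constant names are illustrative, with the 60-second liver pre-set threshold taken from the example above.

```python
import time

LIVER_THRESHOLD_S = 60.0  # example threshold duration from the text

def excessive_time(start_time, now=None, threshold=LIVER_THRESHOLD_S):
    """True if image acquisition has run longer than the threshold duration."""
    if now is None:
        now = time.monotonic()  # monotonic clock avoids wall-clock jumps
    return (now - start_time) > threshold
```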
- If method 330 determines at 342 that excessive time is not being taken to acquire a suitable image, method 330 proceeds to 344 .
- method 330 includes prompting the user whether to exit the pre-set (e.g., at the completion of the examination). Alternatively, if method 330 determines at 342 that excessive time is being taken to acquire a suitable image, method 330 proceeds to 346 .
- method 330 includes notifying the user that guidance will be turned ON after the duration has elapsed.
- an ultrasound operator with an intermediate level of experience and a proficiency score of 5.0, out of a maximum proficiency score of 10.0, may be performing a current ultrasound examination via an ultrasound imaging system such as ultrasound imaging system 100 of FIG. 1 .
- the operator's proficiency score of 5.0 may be above a pre-established minimum threshold proficiency score of 4.0 at 340 , meaning that user guidance is not displayed by default on the display device.
- the operator may have more difficulty acquiring a suitable image than usual (e.g., because of a patient's body type, age, the operator's degree of fatigue, etc.).
- a processor of the ultrasound imaging system (e.g., the processor 116 of FIG. 1 ) may determine that a duration taken by the operator to acquire a suitable image exceeds a threshold reference duration.
- the threshold reference duration may be established based on an amount of time allocated for a patient examination, an expected duration based on the user's proficiency level, an amount of time taken to acquire a suitable image by an expert operator, the type of ultrasound examination, a historical evolution of the user's proficiency over time, and/or any other relevant factors or combination of factors.
- the predetermined time period may be established based on historical data and/or offline studies. In other examples, a different proficiency parameter other than speed may be monitored.
- method 330 includes displaying the user guidance on the display device after the predetermined time period has elapsed (e.g., 10 seconds).
- An example notification that user guidance will be turned on is shown in FIG. 6 C and described in greater detail below.
- the first prompt or the notification that the user guidance will be turned ON may be provided as a transparent or translucent overlay on a portion of the real-time acquired image.
- Steps 348 to 358 of method 330 may be the same as or similar to steps 318 to 328 of method 300 .
- method 330 includes prompting the user whether to exit the pre-set.
- method 330 includes determining whether the user wishes to exit the current pre-set. If the user does not wish to exit the current pre-set at 350 , method 330 proceeds back to 338 , and guidance continues to be generated based on real-time image acquisition via an AI algorithm. If the user does wish to exit the current pre-set at 350 , method 330 proceeds to 352 .
- method 330 includes computing and saving an updated proficiency score of the user for the current pre-set to a database.
- method 330 includes prompting the user whether to end the session.
- method 330 includes determining whether the user wishes to end the session based on a response from the user at 354 . If the user does not wish to end the session at 356 , method 330 includes proceeding back to 334 , and the user is prompted for a new desired pre-set to begin scanning. Alternatively, if the user wishes to end the session at 356 , method 330 proceeds to 358 . At 358 , method 330 includes saving the current session data, and method 330 ends.
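The step of computing and saving an updated proficiency score at the end of a pre-set could take many forms; a minimal sketch is given below, assuming (this is not the patent's formula) that the stored score is blended with the session's evaluation using a fixed weight, keyed by user and pre-set.

```python
# Illustrative sketch of computing and saving an updated per-pre-set
# proficiency score at session end. The weighted-blend rule and the
# dict-as-database are assumptions for demonstration only.

def update_proficiency(db, user, preset, session_score, weight=0.2):
    """Blend the session evaluation into the stored score and persist it."""
    prior = db.get((user, preset), 0.0)   # default 0.0 for a first session
    updated = (1.0 - weight) * prior + weight * session_score
    db[(user, preset)] = round(updated, 2)
    return db[(user, preset)]
```

A small blend weight makes the stored score evolve gradually, mirroring the slow per-session growth of the learning curves described with respect to FIG. 7 A.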
- a greater amount of user guidance may be provided (e.g., a lesser amount of guidance increased to maximum user guidance).
- the lesser amount of user guidance may be real-time user guidance or contextual user guidance, while the greater amount of user guidance may include both real-time and contextual user guidance.
- a lesser amount of user guidance may include a lesser amount of real-time guidance and/or a lesser amount of contextual guidance, while a greater amount of user guidance may include a greater amount of real-time guidance and/or a greater amount of contextual guidance.
- the lesser amount of real-time guidance may include one or more but not all of probe-pose adjustment graphical cues, textual instructions indicating current/subsequent steps to be performed, and automatic AI-based acquisition of images, while the greater amount of real-time guidance may be based on the lesser amount of real-time guidance and may include all of the real-time guidance types indicated above.
- the lesser amount of contextual guidance may not include all of the contextual guidance types, including a pre-acquired image showing the desired scan plane, a graphical illustration (e.g., a textbook illustration) of the related anatomy, and an educational video for the related exam, while the greater amount of contextual guidance may be based on the lesser amount of contextual guidance and may include all of the contextual guidance types indicated above.
- the first prompt may include an indication that the user guidance will be automatically increased. Further, the first prompt may additionally or alternatively include an indication of the type of guidance that may be additionally provided. That is, the first prompt may indicate whether real-time guidance is increased, contextual guidance is increased, or both are increased. Furthermore, in some examples, the first prompt may specify which type of real-time guidance (e.g., probe-pose adjustment graphical cues, textual instructions indicating current/subsequent steps to be performed, and automatic AI-based acquisition of images) and/or which type of contextual guidance (e.g., a pre-acquired image, a graphical illustration (e.g., a textbook illustration) of the related anatomy, and an educational video for the related exam) will be automatically applied.
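The lesser/greater guidance tiers described above can be thought of as sets of guidance types. The sketch below assumes the type names from the description; the specific rule that the lesser tier enables one type of each category is an illustrative choice, not the patent's.

```python
# A sketch of composing "lesser" and "greater" amounts of guidance as
# sets of guidance types. The tier rule (greater = all types, lesser = a
# proper subset of each category) is an illustrative assumption.

REAL_TIME = ("probe_pose_cues", "textual_instructions", "auto_ai_acquisition")
CONTEXTUAL = ("reference_image", "anatomy_illustration", "educational_video")

def guidance_set(amount):
    """Return the guidance types enabled for a given amount of guidance."""
    if amount == "greater":
        return set(REAL_TIME) | set(CONTEXTUAL)   # all guidance types
    # lesser amount: one or more but not all types of each category
    return {REAL_TIME[0], CONTEXTUAL[0]}
```

Modeling the tiers as sets makes "greater is based on the lesser amount" explicit: the lesser set is a proper subset of the greater one.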
- an example method 370 is shown for displaying an option for activating user guidance to a user on a display device, such as the display device 118 of ultrasound system 100 of FIG. 1 , during imaging conditions when excessive time is being taken to acquire an ultrasound image. For example, when a proficiency score for the user performing the imaging is greater than a threshold proficiency, the user guidance may not be initially displayed. However, during certain imaging conditions, such as when a duration to obtain a desired quality image is greater than a threshold duration (e.g., threshold duration based on a proficiency level, such as proficiency level 208 at FIG. 2 ), an option to activate user guidance and/or provide additional user guidance may be provided as discussed below.
- Method 370 may be carried out by a processor, such as the processor 116 of the ultrasound imaging system 100 of FIG. 1 , in accordance with instructions stored in non-transitory memory of a computing device, such as memory 120 of ultrasound imaging system 100 of FIG. 1 .
- a different proficiency parameter other than acquisition speed may be monitored to determine whether the user is struggling. For example, if it is determined that a user is unable to achieve a target scan plane repeatedly on the same patient, the user may be prompted whether user guidance should be displayed to the user.
- Steps 372 to 382 of method 370 may be the same as or similar to steps 300 to 310 of method 300 .
- method 370 includes initiating operation of an ultrasound device with login credentials of the user.
- method 370 includes prompting the user for a desired pre-set/anatomy of interest to begin scanning and receiving a response back from the user.
- method 370 includes recalling and indicating the last saved proficiency score for the current pre-set from the database.
- method 370 includes generating user guides based on real-time image acquisition via an AI algorithm.
- method 370 includes determining whether a proficiency score of the user exceeds a minimum threshold proficiency score.
- method 370 proceeds to 388 .
- method 370 includes displaying the user guidance on the display device.
- the user guidance may include real-time user guidance and/or contextual user guidance.
- method 370 proceeds to 382 .
- method 370 includes determining whether excessive time is being taken to acquire the image.
- method 370 proceeds to 390 .
- the user is prompted whether to exit the pre-set, as described above in relation to methods 300 and 330 of FIGS. 3 A and 3 B .
- method 370 proceeds to 384 . Parameters for determining whether excessive time is taken to acquire the desired image are discussed at FIG. 3 B , and will not be repeated for the sake of brevity.
- method 370 includes prompting the user whether to turn ON the user guidance, and receiving a response from the user.
- a prompt may be displayed on the display portion of the graphical user interface.
- the prompt may include one or more indications, including an indication that the user is taking excessive time to acquire the desired image and/or scan plane, and a request for confirmation to turn ON user guidance.
- the one or more indications may include a first control button to confirm turning ON user guidance and a second control button to cancel the prompt and/or user guidance.
- the one or more indications may include one or more graphical indications, such as a timer, clock, etc.
- An example prompt for turning ON user guidance is further described with respect to FIG. 6 D . Further still, as shown in FIG. 6 D , the prompt may be displayed as a translucent or transparent overlay on a portion of the acquired ultrasound image.
- an intermediate proficiency range may exist where it is not clear whether a user would benefit from the display of user guidance or not.
- the prompts may include control buttons to increase user guidance or cancel future increase in user guidance.
- method 370 determines whether the user has opted for guidance to be displayed. If it is determined at 386 that the user has not opted for guidance to be displayed, method 370 proceeds to 390 . At 390 , method 370 includes prompting the user whether to exit the pre-set, as described above in relation to FIGS. 3 A and 3 B . Alternatively, if it is determined at 386 that the user has opted for guidance to be displayed, method 370 proceeds to 388 . At 388 , method 370 includes displaying user guidance on the display device. In this way, a user who is having difficulty acquiring a suitable image may be provided the option of receiving guidance upon request.
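The decision flow just described — guidance shown by default below the proficiency threshold, and otherwise only offered (and honored on request) when excessive time is being taken — can be sketched as a single predicate. The function name and parameterization are assumptions for illustration.

```python
# Hypothetical sketch of the opt-in guidance decision of method 370.
# Below the minimum threshold, guidance is displayed by default; above
# it, guidance is displayed only if the user confirms the excessive-time
# prompt. All names here are illustrative assumptions.

def should_display_guidance(score, min_threshold, excessive_time, user_confirms):
    """Decide whether user guidance should be displayed this cycle."""
    if score <= min_threshold:
        return True                      # low proficiency: guidance by default
    if excessive_time:
        return user_confirms             # prompt the user; honor the response
    return False                         # proficient and on pace: no guidance
```

For instance, an operator with score 5.0 against a threshold of 4.0 sees no guidance unless acquisition runs long and the operator confirms the prompt.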
- an ultrasound operator with an intermediate level of experience and a proficiency score of 5.0, out of a maximum proficiency score of 10.0, may be performing a current ultrasound examination via an ultrasound imaging system such as ultrasound imaging system 100 of FIG. 1 .
- the operator's proficiency score of 5.0 may be above a pre-established minimum threshold proficiency score of 4.0 at 380 , meaning that user guidance is not displayed by default on the display device.
- the operator may have more difficulty acquiring a suitable image than usual (e.g., because of a patient's body type, age, the operator's degree of fatigue, etc.).
- in such cases, a processor of the ultrasound imaging system (e.g., the processor 116 of FIG. 1 ) may compare the duration taken to acquire the image against a threshold reference duration.
- the threshold reference duration may be established based on the user's previous performance on similar examinations, an amount of time allocated for a patient examination, an expected duration based on the user's proficiency level, an amount of time taken to acquire a suitable image by an expert operator, and/or any other relevant factors or combination of factors.
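One hedged way to combine factors like those listed above into a threshold reference duration is sketched below: an expert operator's acquisition time scaled by the user's relative proficiency, optionally capped by the time allocated for the examination. The weighting scheme is an assumption, not the patent's method.

```python
# Illustrative combination of factors into a threshold reference
# duration: scale an expert's acquisition time by relative proficiency,
# capped by the exam time budget. The formula is an assumption.

def threshold_duration(expert_time_s, proficiency, max_proficiency=10.0,
                       allocated_s=None):
    """Scale an expert's acquisition time by the user's relative proficiency."""
    # Less proficient users get proportionally more time before prompting.
    scale = 1.0 + (1.0 - proficiency / max_proficiency)
    duration = expert_time_s * scale
    if allocated_s is not None:
        duration = min(duration, allocated_s)   # cap by exam time budget
    return duration
```

Under this rule, a mid-proficiency user (5.0 of 10.0) is prompted only after 1.5 times the expert reference time has elapsed.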
- Steps 390 to 399 of method 370 may be the same as or similar to steps 318 to 328 of method 300 .
- method 370 includes prompting the user whether to exit the pre-set.
- method 370 includes determining whether the user wishes to exit the current pre-set. If the user does not wish to exit the current pre-set at 392 , method 370 proceeds back to 378 , and guidance continues to be generated based on real-time image acquisition via an AI algorithm. If the user does wish to exit the current pre-set at 392 , method 370 proceeds to 394 .
- method 370 includes computing and saving an updated proficiency score of the user for the current pre-set to a database.
- method 370 includes prompting the user whether to end the session.
- method 370 includes determining whether the user wishes to end the session based on a response from the user at 396 . If the user does not wish to end the session at 398 , method 370 includes proceeding back to 374 , and the user is prompted for a new desired pre-set to begin scanning. Alternatively, if the user wishes to end the session at 398 , method 370 proceeds to 399 . At 399 , method 370 includes saving the current session data, and method 370 ends.
- methods 300 , 330 , and 370 illustrate how contextual information and real-time guidance cues may be selectively enabled or disabled, with or without notification to the user, and timed in order to provide customized guidance to an ultrasound operator based on their proficiency and/or a difficulty with which they are attempting to acquire ultrasound images.
- a process for progressively assessing the operator's proficiency and updating the operator's proficiency score is provided, such that the user guidance may be provided in a way that is helpful to operators at lower experience levels, yet not a distraction to operators with higher experience levels.
- the methods 300 , 330 , and/or 370 may be adjusted, and/or steps of the methods 300 , 330 , and/or 370 may be combined to generate methods for training purposes.
- automated training may be provided to a user whereby the user is instructed to follow a series of procedural steps.
- the proficiency score of the user may be used to determine what procedural steps to include, and/or how the procedural steps might be displayed, and/or how long the procedural steps may be displayed on a screen of the display device.
- one or more types of user guidance may be displayed as part of a predetermined sequence to be followed by the user, where the timing and/or other characteristics of the sequence may be adjusted based on the proficiency score of the user.
- a novice user with a low proficiency score may be presented with a set of instructions to follow and provided a first duration to complete the instructions, while an intermediate user with a higher proficiency score may be presented with a set of instructions to follow and provided a second, longer duration to complete the instructions.
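A proficiency-dependent display duration for training steps, as in the example above (where the higher score maps to the longer allotted duration), could be a simple linear mapping. The base and per-point values below are illustrative assumptions.

```python
# Sketch of proficiency-dependent timing for training steps, following
# the example above where a higher score yields a longer duration. The
# base and per-point values are illustrative assumptions.

def step_duration(score, base_s=20.0, per_point_s=2.0):
    """Allot more display time per procedural step as the score grows."""
    return base_s + per_point_s * score
```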
- user guidance may include contextual user guidance and/or real-time user guidance.
- Contextual user guidance includes reference elements such as images, videos, etc. that provide the user with reference information that is not time dependent, and that may be consulted to aid the user in acquiring ultrasound images.
- real-time user guidance includes time-dependent visual elements such as probe adjustment cues, textual instructions, prompts, checklists, AI triggered automated images acquired in a live session, and/or other visual elements that provide aid to a user at a specific moment in time based on the real-time display of ultrasound images.
- FIGS. 4 A, 4 B, and 4 C show examples of contextual user guidance
- FIGS. 5 A and 5 B show examples of real-time user guidance.
- example contextual guidance displays are shown that may be displayed to a user on a display device of an ultrasound imaging system such as the ultrasound imaging system 100 of FIG. 1 .
- the example contextual guidance displays may be displayed concurrently with ultrasound images acquired in real time via an ultrasound probe (e.g., the ultrasound probe 106 of ultrasound system 100 of FIG. 1 ), such that the user may consult the example contextual guidance displays while adjusting the ultrasound probe to acquire a suitable image.
- the contextual guidance displays may be generated automatically by an AI algorithm running in a processor of the ultrasound imaging system (e.g., the processor 116 of the ultrasound system 100 of FIG. 1 ).
- FIG. 4 A shows an example contextual reference guidance display 400 .
- the contextual guidance display 400 may include a pre-acquired reference ultrasound image 402 , which may be acquired from a prior patient examination.
- the reference ultrasound image 402 may be a target image of an anatomical feature that the user is attempting to acquire, which the user may compare to an image generated in real time as a result of adjusting a position and/or orientation of an ultrasound probe (e.g., the ultrasound probe 106 of the ultrasound imaging system 100 of FIG. 1 ).
- FIG. 4 B shows an example contextual guidance display 430 , which may include a reference anatomical illustration 432 .
- the reference anatomical illustration 432 may include a graphical illustration of an anatomical feature of which the user is attempting to acquire ultrasound images, taken from an external source such as a textbook, online reference materials, 3-D model, etc.
- the reference anatomical illustration may include a full representation of the target anatomical feature, or the reference anatomical illustration 432 may include a portion of the target anatomical feature (e.g., a relevant section or cross-section, a partial view, etc.).
- the reference anatomical illustration 432 may be used by the user to identify and/or locate one or more anatomical structures of the target anatomical feature. For example, a user attempting to acquire ultrasound images of a spleen may be displayed a graphical illustration of a spleen with component anatomical structures identified via labels, to aid the user in orienting the ultrasound probe.
- FIG. 4 C shows an example contextual guidance display 460 , which may include a video feed 462 that provides guidance to the user.
- the video feed 462 may be a training or educational video that provides instructions regarding how an ultrasound device may be adjusted relative to a body of a patient in order to acquire ultrasound images.
- the video feed 462 may be a live video feed from another operator (e.g., a more experienced operator, a trainer, etc.) that provides customized instruction and/or other information relevant to acquiring ultrasound images.
- the video feed 462 may include audio, which the user may receive via a peripheral device (e.g., speakers, headphones, etc.), or the video feed 462 may include other forms of presenting information, such as slideshows, animations, multimedia presentations, etc.
- example real-time guidance displays are shown that may be displayed to a user on a display device of an ultrasound imaging system such as the ultrasound imaging system 100 of FIG. 1 .
- the example real-time reference guidance displays may be displayed concurrently with ultrasound images acquired in real time via an ultrasound probe (e.g., the ultrasound probe 106 of ultrasound system 100 of FIG. 1 ), such that the user may receive aid in the form of textual or visual instructions that relate to the specific ultrasound images acquired as the user adjusts a position and/or orientation of the ultrasound probe to acquire a suitable image.
- the real-time guidance displays may be generated automatically by an AI algorithm running in a processor of the ultrasound imaging system (e.g., the processor 116 of the ultrasound system 100 of FIG. 1 ).
- the real-time visual guidance display 500 may include a guidance cue 502 , which may indicate to a user a suggested adjustment to the ultrasound device in order to more accurately acquire an ultrasound image.
- the guidance cue 502 may include a visual representation of an ultrasound device with a visual indication of a direction to move the ultrasound device in, an adjustment to be made to the pressure applied to the ultrasound device, etc.
- the guidance cue 502 may be superimposed on a contextual guidance display that includes a reference image.
- the guidance cue 502 may be superimposed on a view of a 3-D rendering of a target anatomical feature.
- the guidance cue 502 may be superimposed on a graphical illustration of a target anatomical feature (e.g., the reference anatomical illustration 432 of contextual guidance display 430 of FIG. 4 B ), or a reference ultrasound image (e.g., the reference ultrasound image 402 of contextual guidance display 400 of FIG. 4 A ), or any other type of image relevant to the acquisition of ultrasound images.
- FIG. 5 B shows an example real-time reference guidance display 560 , which may include an ultrasound image 562 that may be automatically acquired by an AI algorithm running in a processor of the ultrasound imaging system and displayed for reference purposes to the user.
- the ultrasound image 562 may be an image of a target anatomical feature that the user is attempting to acquire, which the user may compare to an image generated in real time as a result of adjusting a position and/or orientation of the ultrasound device.
- a user attempting to achieve a target scan plane for acquiring a suitable ultrasound image of a spleen may acquire ultrasound images that oscillate between the target scan plane and one or more adjacent scan planes, and may encounter difficulty distinguishing the target scan plane from the one or more adjacent scan planes.
- an AI algorithm running in the processor of the ultrasound imaging system may identify the ultrasound image corresponding to the target scan plane, and display the ultrasound image to the user as a guide.
- textual guidance may be provided in terms of written instructions that are displayed on a screen of the display device. In one embodiment, the textual guidance may be displayed on a small and/or unused portion of the screen of the display. In another embodiment, the textual guidance may be superimposed on other elements of the user guidance. For example, textual guidance may be superimposed on the ultrasound image 562 . In one example, textual guidance may be superimposed on the ultrasound image 562 to label one or more anatomical structures. In another example, a plurality of steps of a procedure may be displayed as a reference to the user.
- example user interface views 600 , 620 , 640 , and 660 of an ultrasound imaging system display device are shown.
- the ultrasound display device may be the same as or similar to the display device 118 of ultrasound imaging system 100 of FIG. 1 .
- a display device 602 is shown in the form of a smart phone, which is communicatively coupled to an ultrasound probe such that images acquired by the ultrasound probe are displayed on the display device 602 .
- the display device 602 may be in the form of a computer tablet, a computer monitor, a computer screen of an ultrasound imaging system, a screen of a handheld ultrasound imaging device, or any other type of display screen on which ultrasound images may be displayed.
- the display device 602 includes a display screen portion 604 on which visual display elements may be displayed in real-time, including ultrasound images being acquired, reference images, guidance cues, text, graphical display elements, etc.
- the visual display elements may be generated upon request by a user and/or generated automatically based on an AI algorithm.
- a first indicator field 607 may be displayed on the screen 604 .
- the first indicator field 607 may include elements that are unrelated to the proficiency score of the user.
- the first indicator field 607 may include a power level indicator 605 and/or a pre-set indicator 606 .
- the pre-set indicator 606 may indicate the pre-set in which images are being acquired.
- the pre-set indicator 606 may be a graphical display element such as an icon depicting a target anatomical feature. For example, for a liver pre-set, the pre-set indicator 606 may be an icon depicting a liver.
- the pre-set indicator may be an identifier (e.g., a letter, a code, a number, etc.), or the pre-set indicator may be another type of symbol that indicates to the user the current pre-set loaded onto the display device 602 .
- the screen 604 may further include display of a second indicator field 611 .
- the second indicator field 611 may also include elements that are unrelated to the proficiency score of the user.
- the second indicator field 611 may include an identifying label 610 that may be used to identify an examination being performed and/or an operator.
- the identifying label 610 may include an ID number of the exam, an ID number of the operator, an image of the operator's face, the name of the operator, or any other information that may be used to identify the examination or the operator.
- the second indicator field 611 may include an icon that corresponds to the user or the exam or the pre-set.
- Other visual display elements may include information relating to the proficiency score of the user, or may be displayed as a result of the proficiency score of the user.
- the screen 604 may include a proficiency indicator 608 that indicates the user's proficiency on the current pre-set.
- the proficiency indicator 608 may be a proficiency score of the user, where the proficiency score of the user is calculated as described above in relation to FIG. 2 .
- the proficiency indicator may be a representation of a proficiency score of the user (e.g., expressed as a percentage of a total proficiency, or as a pie/bar chart, etc.), and/or a representation of a proficiency level of the user, where the proficiency level of the user is determined by a mapping function applied to the proficiency score of the user.
- the proficiency indicator 608 may use colors to indicate a proficiency level of the operator, whereby a color may indicate that the user has achieved a proficiency score on a loaded pre-set above a threshold proficiency value associated with the color (e.g., green if the user's proficiency score is above a first threshold proficiency value, blue if the user's proficiency score is above a second threshold proficiency value, red if the user's proficiency score is above a third threshold proficiency value, etc.).
- the proficiency indicator 608 may be an orange bar to indicate that the user's proficiency score is above a first threshold corresponding to a novice ultrasound operator, while the proficiency indicator 608 may be a green bar to indicate that the user's proficiency score is above a second threshold corresponding to an ultrasound operator of an intermediate level. It should be appreciated that the examples included herein are for illustrative purposes and the proficiency indicator 608 may use another type of display element to indicate a proficiency score and/or proficiency level of the user without departing from the scope of this disclosure.
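The color-coded indicator described above amounts to picking the color of the highest threshold band the score reaches. The sketch below follows the orange-novice/green-intermediate example; the numeric thresholds and the "blue" top band are assumptions for illustration.

```python
# Sketch of a color-coded proficiency indicator: return the color of the
# highest band the score reaches. Threshold values and the top "blue"
# band are illustrative assumptions; orange/green follow the example.

def indicator_color(score, bands=((7.0, "blue"), (4.0, "green"), (0.0, "orange"))):
    """Return the indicator color for the highest band the score reaches."""
    for threshold, color in bands:       # bands sorted high to low
        if score >= threshold:
            return color
    return "gray"                        # below all bands (no score yet)
```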
- the screen 604 may include an image display area 612 , in which ultrasound images acquired via an ultrasound probe (e.g., the ultrasound probe 106 of ultrasound system 100 of FIG. 1 ) may be displayed.
- the ultrasound images are displayed in real time as the user adjusts a position, an orientation, and/or a pressure of the ultrasound probe on a body of a patient.
- ultrasound images of the uterus acquired via the ultrasound probe may be displayed in the display area 612 of the display device 602 .
- the ultrasound images displayed in the display area 612 may shift to reflect a new position, orientation, or pressure of the ultrasound probe.
- the ultrasound images displayed in the display area 612 may rotate in accordance with the adjustment made by the user.
- the image displayed in the display area 612 may be adjusted horizontally in the same or an opposite direction.
- the display screen 604 may include display of one or more additional elements 613 , 615 , and 617 within a third indicator field 609 that may relate to a current operation of the ultrasound probe.
- the additional elements 613 , 615 , and 617 may be icons that indicate a current scan mode, or may represent controls for enabling the user to perform one or more operations, such as freeze screen or capture image, for example.
- These additional elements 613 , 615 , and 617 may not be related to the proficiency score of the user, and may be displayed on the screen 604 in addition to elements in the first and second indicator fields 607 and 611 , which are also not related to the proficiency score of the user.
- the display screen portion 604 may include display of visual elements that are related to a current proficiency score of the user (e.g., proficiency indicator 608 ) while also including display of one or more visual elements that are unrelated to the proficiency score (e.g., first, second, and third indicator fields 607 , 611 , and 609 , and visual display elements therein).
- FIG. 6 A shows an example configuration of the display device 602 in which user guidance is turned OFF and no user guidance cues or reference images are displayed to the user.
- user interface view 620 shows an example configuration of the display device 602 in which user guidance is turned ON.
- a guidance display 614 is displayed on the screen 604 , on top of and partially obscuring the display area 612 where the ultrasound images being acquired via the ultrasound probe are displayed.
- the guidance display 614 may be displayed as a partially transparent or translucent overlay over a portion of the ultrasound image.
- the guidance display 614 may include one or more contextual guidance reference images.
- the guidance display 614 may include a reference anatomical image 616 of the target anatomical feature (e.g., a liver, spleen, lung, etc.) being scanned, to aid the operator in identifying anatomical structures of the target anatomical feature.
- the reference anatomical image 616 may be a view of a 3-D rendering of the target anatomical feature.
- the reference anatomical image 616 may be a reference ultrasound image acquired previously by the operator or by another operator from the same patient or from a different patient (e.g., the reference ultrasound image 402 of contextual reference guidance display 400 of FIG. 4 A ), or the reference anatomical image 616 may be an image from another source, or any other type of image that may be used to aid the operator in identifying anatomical structures of the target anatomical feature.
- the guidance display 614 may include one or more real-time visual guidance cues 618 .
- a visual guidance cue 618 in the form of an image of an ultrasound probe is superimposed upon a reference image displayed in the guidance display 614 , where a field of view of the probe is indicated along with two arrows indicating a ground plane in reference to which the position of the ultrasound probe may be adjusted.
- the visual guidance cue 618 depicted in FIG. 6 B may aid the operator by indicating a suggested direction in which the position of the ultrasound probe may be adjusted, for example, to achieve a target scan plane, or to bring a target anatomical feature into view, in real-time, while ultrasound images are being acquired and displayed in the display area 612 .
- the guidance display 614 may not be displayed (e.g., as depicted in FIG. 6 A ), and the visual guidance cue 618 may be superimposed on the ultrasound images displayed in the display area 612 (not depicted in FIGS. 6 A- 6 D ).
- the visual guidance cue 618 may be a label with an arrow indicating a specific anatomical feature, where the arrow points to the relevant anatomical feature in the ultrasound images displayed in the display area 612 .
- a plurality of visual display elements may be displayed collectively on a transparent or partially transparent guidance display 614 .
- user interface view 640 shows a configuration of the display device 602 in which a guidance display 614 is displayed on the screen 604 , where the guidance display 614 includes a plurality of textual and visual elements positioned on a partially transparent background.
- FIG. 6 C shows the guidance display portion 614 displaying one or more indications based on a current performance of the user during the current imaging session.
- the guidance display 614 includes a warning label 642 , with text that indicates that the user is taking too long to acquire a suitable ultrasound image.
- User interface view 640 also includes a timer symbol 644 , in the form of an icon representing a clock, and a corresponding timer notification 646 , which indicate to the user that user guidance (e.g., guidance cues, reference images, etc.) will appear within a duration.
- the user may be notified that user guidance will be displayed within a second duration to aid the user in acquiring a suitable ultrasound image.
- the second duration may be pre-established, or the duration may be adjusted based on the user's proficiency score, or other factors.
- user interface view 660 shows a configuration of the display device 602 in which a guidance display 614 is displayed on the screen 604 , where the guidance display 614 of user interface view 640 of FIG. 6 C additionally includes a confirmation instruction 662 that prompts the user to confirm whether to turn on user guidance.
- the timer notification 646 of user interface view 640 of FIG. 6 C is replaced by a confirm button 664 and a cancel button 666 , whereby the user may select to confirm the display of user guidance in the guidance display 614 via the confirm button 664 , or select not to display user guidance in the guidance display 614 via the cancel button 666 .
- different display and/or control elements may be used to prompt the user to request, enable, disable, confirm, or cancel the display of user guidance, such as checkboxes, radio buttons, etc.
- the user may not be prompted to request, enable, disable, confirm, or cancel the display of user guidance, and user guidance may be automatically displayed, for example, if a proficiency score of the user is below a threshold proficiency value, as described above in relation to FIG. 3 B .
- a user proficiency plot 700 is shown illustrating an improvement in a proficiency of a user of an ultrasound device as a function of practicing duration.
- User proficiency plot 700 includes an example learning curve 702 and an example learning curve 704 .
- learning curves 702 and 704 show the proficiency of a user on a first pre-set and a second pre-set, respectively, where the first pre-set and the second pre-set may correspond to anatomical features for which ultrasound images are being acquired.
- pre-set 1 may correspond to a liver pre-set
- pre-set 2 may correspond to a spleen pre-set.
- other or additional learning curves 702 and 704 corresponding to other pre-sets may be depicted on plot 700 .
- the totality of learning curves depicted on plot 700 may collectively describe an ultrasound operator's proficiency across a plurality of pre-sets covering a variety of different anatomical regions and target features, spanning a time period corresponding to the user's experience, whereby the plot 700 may provide a comprehensive representation of the user's proficiency (e.g., a snapshot).
- Learning curves 702 and 704 are plotted on coordinate axes where a vertical axis 720 corresponds to the user's proficiency score, and a horizontal axis 722 corresponds to a practicing duration in time.
- the learning curves 702 and 704 of user proficiency plot 700 show the user's growth in proficiency as a function of time spent practicing, whereby as practice increases, proficiency grows slowly during a first duration (e.g., when the user is a novice), increases more rapidly during a second duration (e.g., as the user gains experience), and grows more slowly after a threshold proficiency value is achieved (e.g., indicated in FIG. 7 A by proficiency threshold line 710 ), until a point at which no further growth in proficiency is achieved.
- learning curve 702 may depict the user's growing proficiency on a liver pre-set, characterized by an early slow-growth stage, a subsequent faster-growth stage, and a final slow-growth stage after which the user has obtained a high level of proficiency in ultrasound examinations of a patient's liver.
- learning curve 704 may depict the user's growing proficiency on a spleen pre-set, characterized by a similar early slow-growth stage, a subsequent faster-growth stage, and a final slow-growth stage after which the user has obtained a high level of proficiency in ultrasound examinations of a patient's spleen. It should be appreciated that the duration of the slow growth and faster growth stages may be different for different pre-sets, such that the learning curves 702 and 704 may not have the same shape or be described by the same function.
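The slow-fast-slow growth pattern described above is characteristic of a logistic (S-shaped) curve, which can serve as one plausible model of such a learning curve. The sketch below assumes a logistic form; the parameter values (maximum score, growth rate, midpoint) are illustrative, not taken from the patent.

```python
import math

def logistic_proficiency(t, max_score=120.0, rate=0.15, midpoint=40.0):
    """Proficiency score after t practice hours under an assumed logistic
    learning curve: slow early growth, faster intermediate growth, and a
    final slow-growth plateau. All parameter values are illustrative."""
    return max_score / (1.0 + math.exp(-rate * (t - midpoint)))
```

With these parameters, hourly growth is small near t = 0, largest near the midpoint (t = 40), and small again near t = 100, reproducing the three stages of curves 702 and 704; different pre-sets would be modeled with different rate and midpoint values.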
- the proficiency threshold line 710 may indicate a score at which the user may be determined to be proficient.
- the proficiency threshold line 710 may correspond to a user proficiency score of 100, where a user score above 100 indicates proficiency, and a user score below 100 indicates a lack of proficiency.
- the proficiency threshold line 710 may divide the area of the plot into a proficiency stage portion 706 and an early stage portion 708 .
- a user score point 712 on the learning curve 702 may represent a user score of 110, which is above the proficiency threshold line 710 (e.g., representing 100 points), indicating that the user falls within the proficiency stage portion 706 (e.g., indicating that a user is proficient on a pre-set associated with learning curve 702 ).
- a user score point 714 on the learning curve 702 may represent a user score of 15, which is below the proficiency threshold line 710 , indicating that the user falls within the early stage portion 708 (e.g., indicating that the user is not proficient on the pre-set associated with learning curve 702 ).
- the proficiency threshold 710 may be a common threshold that may be applied across all pre-sets. For example, the same proficiency threshold 710 may be utilized to determine whether or not a user is proficient for the first pre-set having learning curve 702 and the second pre-set having corresponding learning curve 704 . Similarly, for additional different pre-sets, that have the same learning curve or different learning curves, the same proficiency threshold may be applied.
- each pre-set may have its own proficiency threshold.
- proficiency threshold 710 may be utilized to determine if the user is proficient with respect to imaging for the first pre-set having learning curve 702
- a second different proficiency threshold 709 may be utilized to determine if the user is proficient for the second pre-set having learning curve 704 .
- different threshold proficiencies may be applied for additional different pre-sets having different learning curves.
- the threshold may be based on a practice duration required to achieve a steady plateau (that is, a final slow-growth phase) in the learning curve, where the steady plateau phase follows an exponential growth phase (that is, an intermediate fast-growth phase).
- the threshold for a given pre-set may be based on a corresponding learning curve for the pre-set.
- the proficiency threshold may be lower for pre-sets that have a smaller duration to reach the final slow-growth phase, and the proficiency threshold may be greater for pre-sets that require a longer duration to reach the final slow-growth phase.
- their corresponding proficiency thresholds may be the same.
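One plausible reading of the per-pre-set thresholding described above can be sketched as follows: detect when a learning curve's hourly growth drops below a small value after its fast-growth peak (the start of the plateau), and scale the pre-set's threshold with that duration. The logistic curve, the `eps` cutoff, and the `score_per_hour` scaling are all illustrative assumptions.

```python
import math

def example_curve(t, max_score=120.0, rate=0.15, midpoint=40.0):
    # Assumed logistic learning curve; parameter values are illustrative.
    return max_score / (1.0 + math.exp(-rate * (t - midpoint)))

def plateau_time(curve, eps=0.05, t_max=500):
    """First practice hour after the fast-growth peak at which hourly
    growth falls below eps, i.e., the start of the final slow-growth
    plateau."""
    growth = [curve(t + 1) - curve(t) for t in range(t_max)]
    peak = max(range(t_max), key=growth.__getitem__)
    for t in range(peak, t_max):
        if growth[t] < eps:
            return t
    return t_max

def preset_threshold(curve, score_per_hour=1.0):
    """Hypothetical per-pre-set threshold that scales with the practice
    duration needed to reach the plateau, so quicker-to-learn pre-sets
    receive lower thresholds, per the text above."""
    return score_per_hour * plateau_time(curve)
```

Under this sketch, a pre-set whose curve has a steeper growth rate reaches its plateau sooner and is therefore assigned a lower threshold than a slower-to-learn pre-set.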
- the horizontal distance between learning curves 702 and 704 may indicate an order and an amount of time between gaining proficiency in a first pre-set and gaining proficiency in a second pre-set.
- learning curve 702 starts at an earlier time than learning curve 704 (e.g., learning curve 702 is further to the left on the horizontal axis 722 than learning curve 704 ), which may indicate that the user began gaining proficiency on the first pre-set prior to gaining proficiency on the second pre-set; that the user began to gain proficiency on the second pre-set at a time when the user's proficiency on the first pre-set was growing steadily; that the user achieved a high level of proficiency on the first pre-set prior to achieving a high level of proficiency on the second pre-set; and that the user ultimately achieved the same level of proficiency on both pre-sets (e.g., above the proficiency threshold line 710 ).
- a degree of similarity between the learning curves displayed in plot 700 may correspond to a degree to which the skills developed in a first pre-set transfer to a second pre-set.
- the similarity of learning curves 702 and 704 in plot 700 may indicate that the proficiency gained in the first pre-set did not significantly transfer to the second pre-set, since the user was not able to achieve a high level of proficiency on the second pre-set in a shorter duration than it took to achieve a high level of proficiency on the first pre-set (e.g., the learning curves 702 and 704 both extend for an equal horizontal length).
- the user did not achieve steady growth in proficiency on the second pre-set faster (e.g., within a shorter duration) than it took the user to achieve steady growth in proficiency on the first pre-set (e.g., the initial stage of the learning curve 702 has the same shape as the initial stage of the learning curve 704 ).
- the user proficiency plot 700 provides a way to represent the development of an ultrasound operator's proficiency across a plurality of diagnostic ultrasound procedures, over the time that the operator spent practicing.
- This graphical representation may be used to determine what, when, and how long automated user guidance may be provided to operators of varying levels of proficiency, by determining a set of functions that describe growth in proficiency in acquiring ultrasound images over time, and determining how a proficiency growth function that describes proficiency improvement on one pre-set relates to other proficiency growth functions that describe proficiency improvement on other pre-sets.
- an ultrasound operator's proficiency growth function for a liver pre-set may be estimated and/or extrapolated based on a comparison of the operator's historical proficiency scores with the historical proficiency scores of a plurality of operators in performing ultrasound examinations of livers of patients.
- the operator's proficiency growth function for the liver pre-set may allow a processor of an ultrasound system (e.g., the ultrasound system 100 of FIG. 1 ) to extrapolate and estimate feature proficiency scores of the operator as a function of practice time, in order to notify the operator how many estimated hours they might practice to achieve a level of proficiency.
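The extrapolation described above, estimating how many practice hours remain before a target proficiency is reached, can be sketched by inverting an assumed logistic growth function. The logistic form and its parameter values are illustrative assumptions; a real system would fit the function to the operator's historical scores.

```python
import math

def hours_to_threshold(current_score, threshold, max_score=120.0, rate=0.15):
    """Estimate remaining practice hours by inverting an assumed logistic
    learning curve. Scores must lie strictly between 0 and max_score."""
    def t_of(score):
        # practice time, relative to the curve midpoint, at which `score` occurs
        return -math.log(max_score / score - 1.0) / rate
    return max(0.0, t_of(threshold) - t_of(current_score))
```

For example, an operator at a score of 15 on this illustrative curve would be notified of roughly two dozen estimated practice hours to reach a threshold of 100, while an operator already above the threshold would be shown zero remaining hours.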
- the processor may select which forms of user guidance to display to the operator based on where the operator's current proficiency score is positioned on the operator's extrapolated learning curve.
- for example, if an operator's proficiency score indicates that the operator is in an early slow-growth stage, the processor may display user guidance designed to accelerate the acquisition of basic skills. If an operator's proficiency score indicates that the operator is moving beyond an early slow growth stage, the processor may stop displaying user guidance designed to accelerate the acquisition of basic skills, and may display other forms of user guidance designed to improve reproducibility (e.g., an intermediate skill).
- a selection and display of guidance based on a user's position along an estimated learning curve for the user may lead to a more precise customization of guidance than a selection and display of guidance based on a user's proficiency score alone.
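The stage-dependent selection of guidance described above can be sketched as a simple mapping from score to guidance forms. The stage boundaries and the guidance labels are illustrative assumptions standing in for whatever cues, reference images, and reproducibility aids a given system provides.

```python
def select_guidance(score, early_end=30.0, threshold=100.0):
    """Hypothetical mapping from a proficiency score to guidance forms:
    basic-skill aids for novices, reproducibility aids for intermediate
    users, and no guidance once the user is proficient."""
    if score < early_end:
        return ["guidance cues", "reference images", "textbook anatomical images"]
    if score < threshold:
        return ["reproducibility tips"]
    return []
```

A curve-aware system would replace the fixed `early_end` boundary with the operator's position on an extrapolated learning curve, which is the more precise customization the preceding paragraph describes.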
- the plot 700 may be used in the development of AI algorithms to determine when and what guidance to display to ultrasound operators.
- historical data collected over a plurality of ultrasound operators for a given pre-set may allow for the determination of a generalized learning curve for that pre-set.
- the historical data may be used to train a machine learning algorithm (e.g., a neural network, etc.) to determine when a specific type of guidance may be displayed in order to maximize a growth in proficiency.
- a neural network trained on historical data may determine that displaying reference ultrasound images is most effective not when a user achieves a certain proficiency score, but rather when a user is approaching an inflection point of the user's learning curve, and may further determine the precise point at which it is most helpful to the user to display a reference ultrasound image.
- a neural network trained on historical data may determine that displaying textbook anatomical images as user guidance is effective at the beginning of an initial slow-growth stage, but is not effective at the end of the initial slow-growth stage, in a way that may not be determined by a user's proficiency score alone (e.g., because the initial slow growth stage may last longer for some users than for others).
- the neural network may determine the precise point at which it is most helpful to display textbook anatomical images to new users.
- the historical data may also be used to generate general guidelines for displaying user guidance that is applicable to all users. For example, it may be determined from historical data collected from a plurality of users that learning curves for one pre-set tend to be shorter than learning curves for a different pre-set (e.g., that users tend to develop proficiency in acquiring images of a uterus faster than acquiring images of a lung, or that users tend to develop proficiency in acquiring images of a spleen faster than acquiring images of a liver, and so forth). As a result, it may be determined that users may gain proficiency across a plurality of pre-sets faster if the users first gain proficiency in pre-sets associated with shorter learning curves.
- the user may obtain a benefit in terms of growth in proficiency by practicing on a plurality of pre-sets in a specified order, for example, where pre-sets with shorter estimated learning curves are practiced before pre-sets with longer estimated learning curves.
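The ordering guideline above, practicing pre-sets with shorter estimated learning curves first, amounts to a sort by estimated curve length. The pre-set names and hour values in the sketch below are illustrative assumptions.

```python
def practice_order(estimated_hours):
    """Order pre-sets so those with shorter estimated learning curves
    are practiced first. `estimated_hours` maps each pre-set name to its
    estimated hours-to-proficiency (illustrative values)."""
    return sorted(estimated_hours, key=estimated_hours.get)
```

For instance, with illustrative estimates of 40 hours for a uterus pre-set, 50 for spleen, 80 for liver, and 120 for lung, the recommended practice order would place uterus first and lung last.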
- the proficiency assessments carried out as disclosed herein, based on proficiency scores assigned in accordance with rubrics established for different pre-sets corresponding to specific anatomical regions and target anatomical features, which are stored and used to generate user-specific learning curves for the different pre-sets, provide a robust framework for dynamically scoring user performance on ultrasound examinations.
- the resulting scores may be used in turn as the basis for automatically generating customized user guidance dynamically, where the selection and timing of the user guidance may be adjusted over time to maximize the impact of the user guidance on increased operator proficiency.
- the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements.
- the terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements.
- references to “one embodiment” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN202041041436 | 2020-09-24 | ||
Publications (2)
Publication Number | Publication Date |
---|---|
US20220087644A1 US20220087644A1 (en) | 2022-03-24 |
US12133764B2 true US12133764B2 (en) | 2024-11-05 |
Family
ID=80741299
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/148,376 Active 2042-02-18 US12133764B2 (en) | 2020-09-24 | 2021-01-13 | Systems and methods for an adaptive interface for an ultrasound imaging system |
Country Status (2)
Country | Link |
---|---|
US (1) | US12133764B2 (en) |
CN (1) | CN114246611B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11716531B2 (en) * | 2021-03-22 | 2023-08-01 | International Business Machines Corporation | Quality of multimedia |
GB202306231D0 (en) * | 2023-04-27 | 2023-06-14 | Intelligent Ultrasound Ltd | Method and apparatus for assisting an ultrasound operator |
CN116843237B (en) * | 2023-09-04 | 2023-11-21 | 贵州惠智电子技术有限责任公司 | Office platform application assessment statistical analysis system |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
NL1036517C2 (en) * | 2009-02-05 | 2010-08-10 | Holding Prodim Systems B V | DEVICE AND METHOD FOR PLACING CONTOURS OR WORKS AND A MEASURING DEVICE AND DIRECTION DEVICE FURNISHED FOR USE HEREIN. |
US20110301460A1 (en) * | 2010-06-04 | 2011-12-08 | Doris Nkiruka Anite | Self-administered medical ultrasonic imaging systems |
JP2012019341A (en) * | 2010-07-07 | 2012-01-26 | Ricoh Co Ltd | Imaging device, and method and program for controlling the same |
JP5533915B2 (en) * | 2012-03-07 | 2014-06-25 | カシオ計算機株式会社 | Proficiency determination device, proficiency determination method and program |
JP2016015972A (en) * | 2014-07-04 | 2016-02-01 | 富士フイルム株式会社 | Ultrasonic diagnostic equipment and operation method of ultrasonic diagnostic equipment |
CN109475344B (en) * | 2016-03-09 | 2021-10-29 | 皇家飞利浦有限公司 | Fetal imaging system and method |
CN107647880B (en) * | 2016-07-26 | 2021-01-19 | 东芝医疗系统株式会社 | Medical image processing apparatus and medical image processing method |
EP3398519A1 (en) * | 2017-05-02 | 2018-11-07 | Koninklijke Philips N.V. | Determining a guidance signal and a system for providing a guidance for an ultrasonic handheld transducer |
US11464490B2 (en) * | 2017-11-14 | 2022-10-11 | Verathon Inc. | Real-time feedback and semantic-rich guidance on quality ultrasound image acquisition |
WO2020044770A1 (en) * | 2018-08-27 | 2020-03-05 | 富士フイルム株式会社 | Ultrasonic diagnostic device and ultrasonic diagnostic device control method |
JP7201404B2 (en) * | 2018-11-15 | 2023-01-10 | キヤノンメディカルシステムズ株式会社 | MEDICAL IMAGE PROCESSING APPARATUS, MEDICAL IMAGE PROCESSING METHOD, AND PROGRAM |
US10937545B2 (en) * | 2019-02-08 | 2021-03-02 | GE Precision Healthcare LLC | Method and system for centralized patient monitoring management |
CN110464462B (en) * | 2019-08-29 | 2020-12-25 | 中国科学技术大学 | Image navigation registration system for abdominal surgical intervention and related device |
CN111544036A (en) * | 2020-05-12 | 2020-08-18 | 上海深至信息科技有限公司 | Ultrasonic navigation system and method |
- 2021
- 2021-01-13 US US17/148,376 patent/US12133764B2/en active Active
- 2021-09-24 CN CN202111125620.4A patent/CN114246611B/en active Active
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050119568A1 (en) * | 2003-10-14 | 2005-06-02 | Salcudean Septimiu E. | Method for imaging the mechanical properties of tissue |
US8831314B1 (en) | 2013-03-15 | 2014-09-09 | Heartflow, Inc. | Methods and systems for assessing image quality in modeling of patient anatomic or blood flow characteristics |
US10885151B2 (en) * | 2015-04-13 | 2021-01-05 | Johnson & Johnson Surgical Vision, Inc. | System and methods for a graphical user interface for conducting ophthalmic surgery |
CN105138250A (en) | 2015-08-03 | 2015-12-09 | 科大讯飞股份有限公司 | Human-computer interaction operation guide method, human-computer interaction operation guide system, human-computer interaction device and server |
US20170360402A1 (en) | 2016-06-20 | 2017-12-21 | Matthew de Jonge | Augmented reality interface for assisting a user to operate an ultrasound device |
US20170360401A1 (en) | 2016-06-20 | 2017-12-21 | Alex Rothberg | Automated image acquisition for assisting a user to operate an ultrasound device |
US10628001B2 (en) | 2017-06-16 | 2020-04-21 | General Electric Company | Adapting user interfaces based on gold standards |
US20180365025A1 (en) | 2017-06-16 | 2018-12-20 | General Electric Company | Systems and methods for adaptive user interfaces |
US20190206562A1 (en) * | 2017-12-28 | 2019-07-04 | Ethicon Llc | Method of hub communication, processing, display, and cloud analytics |
US20210082567A1 (en) * | 2018-01-22 | 2021-03-18 | Vuno, Inc. | Method for supporting viewing of images and apparatus using same |
US20200113542A1 (en) | 2018-10-16 | 2020-04-16 | General Electric Company | Methods and system for detecting medical imaging scan planes using probe position feedback |
CN111053573A (en) | 2018-10-16 | 2020-04-24 | 通用电气公司 | Method and system for detecting medical imaging scan planes using probe position feedback |
US20200160511A1 (en) * | 2018-11-21 | 2020-05-21 | Enlitic, Inc. | Medical scan artifact detection system |
US20200372714A1 (en) * | 2019-05-21 | 2020-11-26 | At&T Intellectual Property I, L.P. | Augmented reality medical diagnostic projection |
US20200397511A1 (en) * | 2019-06-18 | 2020-12-24 | Medtronic, Inc. | Ultrasound image-based guidance of medical instruments or devices |
US20200402651A1 (en) * | 2019-06-24 | 2020-12-24 | Carefusion 303, Inc. | Adaptive control of medical devices based on clinician interactions |
US20230181148A1 (en) * | 2020-04-23 | 2023-06-15 | Koninklijke Philips N.V. | Vascular system visualization |
US20230210491A1 (en) * | 2020-06-11 | 2023-07-06 | Koninklijke Philips N.V. | Method for estimating hemodynamic parameters |
Non-Patent Citations (8)
Title |
---|
"Augmented Reality Acquisition software with the Butterfly iQ," YouTube Website, Available Online at https://www.youtube.com/watch?v=dIIOTFyKMVU, Oct. 28, 2017, 4 pages. |
Aminlari, A. et al., "A Case of COVID-19 Diagnosed at Home With Portable Ultrasounds and Confirmed With Home Serology Test," The Journal of Emergency Medicine, vol. 60, No. 3, Oct. 12, 2020, 3 pages. |
CN application 202111125620.4 filed Sep. 24, 2021—Office Action issued Aug. 22, 2023; 14 pages. |
CN105138250 English Abstract; Espacenet search Nov. 22, 2023; 1 page. |
Kramers, M. K. (2014). Evaluating human performance for image-guided surgical tasks (Order No. 29242526). Available from ProQuest Dissertations and Theses Professional. (2701128441). Retrieved from https://dialog.proquest.com/professional/docview/2701128441?accountid=131444 (Year: 2014). * |
Pivetta, E. et al., "Self-Performed Lung Ultrasound for Home Monitoring of a Patient Positive for Coronavirus Disease 2019," CHEST Journal, vol. 158, No. 3, Sep. 1, 2020, 5 pages. |
Sultan, L. et al., "A Review of Early Experience in Lung Ultrasound in the Diagnosis and Management of COVID-19," Ultrasound in Medicine and Biology, vol. 46, No. 9, Sep. 2020, 17 pages. |
Zeldovich, L., "Handheld Ultrasound Devices are Speeding Diagnosis of COVID-19," Scientific American Website, Available Online at https://www.scientificamerican.com/article/handheld-ultrasound-devices-are-speeding-diagnosis-of-covid-19/, Jun. 11, 2020, 8 pages. |
Also Published As
Publication number | Publication date |
---|---|
CN114246611A (en) | 2022-03-29 |
US20220087644A1 (en) | 2022-03-24 |
CN114246611B (en) | 2024-06-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: GE PRECISION HEALTHCARE LLC, WISCONSIN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PATIL, MAHENDRA MADHUKAR;HANDA, ASHISH;SREENIVASAIAH, RAVEESH REDDY;SIGNING DATES FROM 20200914 TO 20200915;REEL/FRAME:054911/0060 |
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | PATENTED CASE |