US20020171628A1 - Interactive control system having plural displays, and a method thereof - Google Patents

Interactive control system having plural displays, and a method thereof

Info

Publication number
US20020171628A1
Authority
US
United States
Prior art keywords
display
computer
operator
hand
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/192,726
Inventor
Masayuki Tani
Kimiya Yamaashi
Koichiro Tanikoshi
Masato Horita
Masayasu Futakawa
Harumi Uchigasaki
Atsuhiko Nishikawa
Atsuhiko Hirota
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to US10/192,726
Publication of US20020171628A1
Priority to US10/692,808 (US7057602B2)
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/23Pc programming
    • G05B2219/23125Switch display to show different things, test or normal state
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/23Pc programming
    • G05B2219/23131Select on large display part of pictogram to show on display of used workstation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/23Pc programming
    • G05B2219/23152Large and several smaller displays for each workstation, each own cursor on large display
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/23Pc programming
    • G05B2219/23157Display process, synoptic, legend, pictogram, mimic
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S715/00Data processing: presentation processing of document, operator interface processing, and screen saver display processing
    • Y10S715/978Audio interaction as part of an operator interface

Definitions

  • the present invention relates to an interactive processing apparatus for interactive processing of plural displays, and a method thereof.
  • the operators monitor using both the large display and their display at hand.
  • the operators grasp entire system states by watching the overview information on the large display, and when an abnormal condition is detected, they examine more detailed data by using their own display at hand and perform necessary control operation.
  • One of the objects of the present invention is to provide a man-machine interface which is capable of presenting related detailed information simply by designating an object on the large display.
  • For example, a man-machine interface wherein detailed information on a warning and the control data related to it are displayed on a display at hand just by pointing to a blinking warning on the large display, and wherein control data and setting devices related to an apparatus are displayed on a display at hand merely by pointing to the apparatus in a system configuration map on the large display.
  • the large display is shared by a plurality of operators.
  • When a monitoring and controlling system is operated through the collaboration of plural operators, each operator is in charge of a different operation, such as an operator in charge of operation, an operator in charge of maintenance and inspection, and a chief on duty who controls the total operation.
  • the large display is shared by operators who perform different tasks simultaneously, which is different from a case of display at hand which is prepared for individual operators. Therefore, the above described interface must satisfy the following requirements:
  • Necessary information differs depending on contents of the charged task. For example, when a warning light indicating an abnormal condition of a boiler blinks, an operator in charge of operation examines control data such as a flow rate of fuel, while an operator in charge of maintenance examines an inspection record of the boiler. Accordingly, it is necessary for operators to be able to quickly retrieve information necessary for them without being distracted by information for others.
  • Commands used frequently and permission for operation differ depending on the task charged to respective operators. Accordingly, it is desirable that the operating environment such as a structure of menu and an operable range of operation can be customized for respective operators.
  • the object of the present invention is to provide a man-machine interface which satisfies the above requirements.
  • The above described objects can be realized by providing, to an interactive processing apparatus having a plurality of input means and a plurality of output means: a registering means for registering an attribute of a respective operator to an input means; a process selecting means for selecting process contents based on the attribute in response to a process request from the input means; and an executing means for executing a process selected by the process selecting means and outputting to an output means selected based on the attribute.
  • An operator registers his own attribute, for example, charged task, etc, to his operating input means using the registering means.
  • the process selecting means examines the operator's attribute which has been registered in the input means, and selects a process corresponding to the attribute.
  • the executing means executes the process selected by the process selecting means, and outputs a result of the execution to an output device matched to the attribute, for example, a display at hand of the operator.
  • the operator can execute a necessary process without disturbing other operators' operation by selecting an output device based on the operator's attribute.
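The registering, process-selecting, and executing means described above can be sketched as follows. This is an illustrative model only, not the patent's implementation; all identifiers ("mouse12", "display10", the process table) are hypothetical.

```python
# Hypothetical sketch of the claimed means, modeled with plain dictionaries.

input_attribute = {}    # registering means: input device ID -> operator attribute
attribute_output = {}   # attribute -> output device of that operator

# Process-selecting means: process contents chosen per attribute (charged task).
processes = {
    "operation":  lambda obj: f"control data for {obj}",
    "inspection": lambda obj: f"inspection record for {obj}",
}

def register(input_id, attribute, output_id):
    """Register an operator's attribute to his input means."""
    input_attribute[input_id] = attribute
    attribute_output[attribute] = output_id

def handle_request(input_id, pointed_object):
    """Select a process from the requester's registered attribute and route
    the result to that operator's own output device."""
    attribute = input_attribute[input_id]
    result = processes[attribute](pointed_object)
    return attribute_output[attribute], result

register("mouse12", "operation", "display10")
register("mouse32", "inspection", "display30")
# Pointing to the same object yields different contents on different displays:
print(handle_request("mouse12", "boiler"))
print(handle_request("mouse32", "boiler"))
```

Because the output device is chosen from the attribute, each operator's result lands on his own display at hand, leaving the shared large display and the other operators undisturbed.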
  • FIG. 1 is a schematic total overview indicating structure of the plant monitoring and controlling system 91 in accordance with the present invention.
  • FIG. 2 is an example of an image displayed on the large display 1 .
  • FIG. 3 is an example of an image displayed on the display at hand 10 .
  • FIG. 4 is an example of pointer movement between the display at hand 10 and the large display 1 .
  • FIG. 5 is an example of pointer movement between the display at hand 10 and the large display 1 .
  • FIG. 6 is an example of pointer movement between the display at hand 10 and the large display 1 .
  • FIG. 7 is an example of an image displayed when registering a charged task.
  • FIG. 8 is a problem analysis diagram (PAD) indicating steps for registering a charged task.
  • FIG. 9 is a drawing indicating a corresponding table of input device identification (ID) and registered charged task.
  • FIG. 10 is a drawing indicating a corresponding table of registered charged task and output device identification (ID).
  • FIG. 14 is an example of an image displayed on the display at hand 10 .
  • FIG. 15 is an example of an image displayed on the large display 1 and the display at hand 10 .
  • FIG. 16 is a drawing indicating a corresponding table of input events and executing processes.
  • FIG. 17 is an example of a format for designating an output device.
  • FIG. 18 is a flow chart indicating the process flow for a pointing operation.
  • FIG. 19 is a drawing indicating a method for realizing pointing on the large display 1 and the display at hand 10 .
  • FIG. 20 is a problem analysis diagram indicating the process flow of a method for realizing pointing on the display at hand 10 and the large display 1 .
  • FIG. 21 is an example of pointer movement between the display at hand 10 and the large display 1 .
  • FIG. 22 is a schematic drawing indicating an example of a system structure of the present invention.
  • FIG. 23 is a schematic drawing indicating another embodiment of the present invention.
  • FIG. 1 indicates a total structure of the plant monitoring and controlling system 91 which is one of the embodiments of the present invention.
  • the numeral 1 indicates a large display whereon overview information on a whole plant (system diagram, main warnings, important control data, main monitoring video image, etc.) is displayed.
  • the display on the large display 1 is performed by a workstation 2 .
  • Each of the displays 10 , 30 , 50 is placed at hand of a respective operator who is engaged in operation of the plant.
  • the displays 10 , 30 , 50 are referred to collectively as displays at hand.
  • the operators grasp status of the whole plant by watching the overview information displayed on the large display 1 , examine detailed data using the individual display at hand if an abnormal symptom is found, and perform a setting operation, if necessary.
  • the displays in the respective displays 10 , 30 , 50 are performed by workstations 11 , 31 , and 51 , respectively.
  • Mice 12 , 32 , 52 , key boards 13 , 33 , 53 , and headsets 14 , 34 , 54 are connected to the workstations 11 , 31 , and 51 , respectively.
  • Operators point to positions on the displays at hand 10 , 30 , 50 and on the large display 1 using the mice.
  • the headset is a headphone having a microphone.
  • the workstations 2 , 11 , 31 , 51 are mutually connected via a local area network 90 , and mutual information exchange is possible.
  • Since various control computers and apparatus controllers (not shown in the drawing) are connected directly or indirectly via other networks, the workstations 2 , 11 , 31 , 51 can access various plant control information through the local area network 90 .
  • In FIG. 1, three displays at hand 10 , 30 , 50 are illustrated; however, the number of displays can naturally be changed depending on the number of operators.
  • Although FIG. 1 indicates one large display, a plurality of large displays can also be used.
  • A large display composed by joining a plurality of displays in a seamless manner may also be used.
  • providing a speaker and a microphone instead of the headphone to each of the operators may be acceptable.
  • A concept of the present invention is explained referring to FIG. 22.
  • the workstation 200 and the workstation 250 are connected to the network 230 , and the two workstations can exchange information arbitrarily with each other.
  • To the workstation 250 , other workstations having the same structure as the workstation 250 are connected (not shown in FIG. 22).
  • the process executing part 201 of the workstation 200 outputs information to output apparatus of the other workstations connected to the network 230 as well as displaying information in the output apparatus 205 via the output processing part 202 .
  • the attribute registration processing part 257 displays a menu for the attribute registration in the output apparatus 256 via the input/output processing part 252 .
  • the attribute registration processing part 257 registers the selected attribute to the attribute memory part 253 , and further, stores a corresponding relationship between the selected attribute and the output apparatus 256 to the attribute memory part 203 of the workstation 200 via the network 230 .
  • Input information from the input apparatus 255 is transmitted to the process executing part 251 via the input/output processing part 252 .
  • When the input information designates a position in a display of the output apparatus 256 , the process executing part 251 executes the corresponding process and outputs the results of the execution to the output apparatus 256 .
  • the process executing part 251 transmits the input information with an attribute called out from the attribute memory part 253 to the process executing part 201 of the workstation 200 .
  • the process executing part 201 executes a process corresponding to the transmitted input information and the attribute.
  • the process executing part 201 transmits results of the execution to an output apparatus corresponding to the transmitted attribute which is called out from the attribute memory part 203 .
  • FIG. 2 illustrates an example of display manner on a large display 1 .
  • overview information such as a system diagram 5 and warnings 4 on the plant is displayed.
  • a warning 4 related to the abnormal condition blinks.
  • the abnormal condition is serious, warning sound alarms from the headsets 14 , 34 , 54 in addition to the blinking of the warnings 4 .
  • the numeral 3 designates a display controlling menu for controlling the display on the large display 1 .
  • the large display can display various images such as (1) prior displayed objects, (2) subsequent display objects, (3) weather information, (4) monitoring video image, (5) various system diagrams, etc.
  • Each of pointers 15 , 35 , 55 works with each of the mice respectively, and/or is colored with different colors so as to facilitate identification.
  • the pointers 15 , 35 , 55 naturally may have shapes different from one another, or may be added with information on operator's attribute, such as name of work, or personal name, instead of color coding.
  • the pointers 15 , 35 , 55 can be transferred continuously between the displays at hand 10 , 30 , 50 and the large display 1 , respectively. The transferring of the pointers will be explained in detail referring to FIGS. 4 - 6 later.
  • FIG. 3 illustrates an example of display manner on the display at hand 10 .
  • an embodiment is explained taking the display at hand 10 as an example, but the explanation can be applied to other displays at hand 30 , 50 if any exception is not mentioned especially.
  • FIG. 3 indicates an example of detailed information on the plant displayed on the display at hand 10 .
  • a plate hanger icon 17 and a person in charge icon 16 are displayed.
  • the plate hanger icon 17 provides a function to put a memorandum by voice to the display at hand 10 and the large display 1 .
  • the person in charge icon 16 provides a function to register charged task (operator, inspector, chief on duty, etc.) of an operator who uses input/output apparatus at hand, such as display at hand 10 , a mouse 12 , a keyboard 13 , and a headset 14 .
  • The registered charged task, that is, the task currently charged to the operator who uses the display 10 , is displayed.
  • “operator” is displayed on the icon 16 , which means that an operator in charge of operation is registered.
  • The pointer moves continuously from the display at hand to the large display 1 simply by moving the mouse 12 forward, as shown in FIG. 4. That is, with the pointer located on the display at hand 10 , the pointer 15 moves toward the upper portion of the display at hand 10 as the mouse 12 is moved forward, and finally reaches the uppermost point of the display at hand 10 . If the mouse 12 is moved forward further, the pointer 15 transfers to the lowest point of the large display 1 , and moves upward toward the top of the large display as the mouse 12 continues to be moved forward.
  • FIG. 5 illustrates a case when the large display is composed of two displays 6 , 7 .
  • X or x can be obtained by replacing X with (X1+d), and H with 2d.
  • In FIG. 4, a method for transferring the pointer 15 as if the top portion of the display at hand 10 joined the total span of the lower portion of the large display 1 has been explained.
  • a method for transferring the pointer 15 as if the top portion of the display at hand 10 joins the partial span of the lower portion of the large display can also be useful.
  • a range of the partial lower span of the large display to be joined to the display at hand 10 can be decided in consideration of a relative relationship of an arrangement between the large display 1 and the display at hand 10 .
  • X and H are substituted by (X − d1) and (H − d1 − d2), respectively, so that the pointer 15 transfers within the range between d 1 from the left side and d 2 from the right side of the lower portion of the large display.
  • d 2 is a distance from the right side of the large display 1 to right side of the display at hand 10 when the right side of the display at hand 10 is arranged at the right side of the large display 1 .
  • the pointer 15 is arranged so as to transfer at a right side range of the bottom of the large display 1 .
  • The image size (the number of pixels) of the pointer 15 may be changed on the large display 1 from that on the display at hand 10 . This is especially useful when the display at hand 10 and the large display are installed far apart, since making the display size of the pointer 15 larger facilitates identification. For instance, the pointer 15 is displayed with 16×16 pixels on the display at hand 10 , and with 36×36 pixels on the large display 1 .
  • The pointer thus becomes easily recognizable even on the far away large display.
  • A position on the display at hand 10 and the large display 1 can be designated continuously without changing grip of the pointing device.
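A minimal sketch of the implied horizontal mapping, assuming the proportional form X = x·H/h suggested by the substitutions above (the function and its integer rounding are illustrative, not taken from the patent):

```python
def map_x_to_large(x, h, H, d1=0, d2=0):
    """Map a horizontal pointer position x on the display at hand (h pixels wide)
    onto the large display (H pixels wide). With d1/d2 nonzero, only the span
    between d1 from the left edge and d2 from the right edge of the large
    display is joined, i.e. X and H are replaced by (X - d1) and (H - d1 - d2)."""
    usable = H - d1 - d2
    return d1 + x * usable // h

print(map_x_to_large(320, h=1280, H=3840))                  # full-span joining (FIG. 4)
print(map_x_to_large(320, h=1280, H=3840, d1=400, d2=400))  # partial-span joining
```

With d1 = d2 = 0 this reduces to the full-span case, so the same function covers both arrangements.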
  • an operator registers his own charged task to the system.
  • the system provides service based on the registered charged task.
  • the service includes, for example, arranging a suitable operation environment for the charged task, facilitating to retrieve information only necessary for the charged task, and setting a permission for operation for each of the charged task.
  • the operation environment means items in a menu, an order of its arrangement, setting of default, and setting a permission for operation, etc.
  • FIG. 7 is an example of image manner at registering charged task.
  • FIG. 8 is a problem analysis diagram (PAD) indicating steps for registering charged task.
  • Operation: selected when service for a person in charge of operation is desired.
  • Inspection: selected when service for a person in charge of inspection is desired.
  • Chief on duty: selected when service for the chief on duty is desired.
  • Supervisor: selected when service for a person responsible for all tasks is desired; for example, when adjusting the system, or when the system is operated by only one person.
  • General: selected for service within a range of tasks in which erroneous operation does not cause serious disturbance to the system; for example, when the system is operated for on-the-job training by a person who is not familiar with it.
  • a password input region 19 is displayed.
  • the charged task for the operator is registered to the system, and the registered charged task is displayed in the charged task icon 16 .
  • the same procedure as that of the registering (FIG. 8) is performed.
  • a correspondence of the charged task for the operator and input/output devices used by the operator is controlled using tables shown in FIGS. 9 and 10.
  • A table 120 , indicating the correspondence between input device IDs (mice 12 , 32 , 52 ; keyboards 13 , 33 , 53 ; headsets 14 , 34 , 54 ) and the charged task of the operator who uses those input devices, is used to retrieve the charged task of the operator with an input device ID as the key.
  • A table 121 , indicating the correspondence between the charged task of the operator and the output device IDs (displays at hand 10 , 30 , 50 ; headsets 14 , 34 , 54 ) used by the operator, is used to retrieve the operator's output devices with the charged task as the key.
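The two correspondence tables can be sketched as chained dictionary lookups; the table contents below are illustrative, following the reference numerals in the text:

```python
table_120 = {  # FIG. 9: input device ID -> charged task of its operator
    "mouse12": "operation", "keyboard13": "operation", "headset14": "operation",
    "mouse32": "inspection", "keyboard33": "inspection", "headset34": "inspection",
}
table_121 = {  # FIG. 10: charged task -> output device IDs used by that operator
    "operation":  ["display10", "headset14"],
    "inspection": ["display30", "headset34"],
}

def outputs_for(input_device_id):
    """Chain the two tables: which output devices belong to whichever
    operator is currently using the given input device?"""
    charged_task = table_120[input_device_id]
    return table_121[charged_task]

print(outputs_for("mouse12"))
```

Keeping the mapping in two tables, rather than one, lets a registration change (a new operator sits down at display 10) update a single entry in table 120 without touching the output routing.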
  • FIG. 11 illustrates a status wherein the display at hand 10 is used by an operator in charge of operation, and the display at hand 30 is used by an operator in charge of inspection.
  • monitoring video image of the boiler in the plant site is displayed on the display at hand 30 , and the condition of the boiler at the plant site can be inspected.
  • FIG. 11 shows an example of displaying information 22 in the displays at hand 10 , 30 , 50 responding to pointing on the large display 1 .
  • The information can also be output as sound. Even when the information is output as sound, it is output only to the person who needs it. For instance, when the operator in charge of operation points to a display on the large display 1 with the mouse 12 at hand, the information related to that display is output as sound to the headset 14 provided to that operator. Furthermore, not only information but also sound feedback for operations on the large display 1 is output only to that operator.
  • The feedback is output only to the headset 14 of the operator who pointed, not to the headsets of the other operators. That is, when the operator in charge of operation points to a display on the large display 1 with the mouse 12 at hand, a sound signal is output to the headset 14 provided to the operator in charge of operation.
  • An error message to an erroneous operation on the large display 1 is also output to the display at hand 10 or the headset 14 only for the operator who has operated.
  • the error message which must be referred to other operators is output to the other operators.
  • information related to the displayed information on the large display 1 can be referred easily by pointing out the display by the mouse 12 at hand. And, the information is output only to the output device for the operator who has pointed out the display, and consequently, the operation does not distract other operators.
  • The large display is used in common by many operators. Therefore, if information needed by only a specific operator were displayed on the large display 1 , it might hide information being watched by other operators. Likewise, if sound were output loudly enough to reach all operators, it might distract operators who do not need the information. Furthermore, by displaying only information selected to correspond to the charged task of the operator who pointed, that operator can easily access the information he alone needs without being distracted by information intended for others.
  • the large display 1 is used commonly by a plurality of operators who are in charge of different tasks. Because a suitable operation environment for performing each of the tasks differs, the plant monitoring and controlling system 91 provides an operation environment corresponding to the charged task for each of the operators who use the large display 1 for interactive operation.
  • FIG. 12 illustrates an example of changing an arranging order of the menu items corresponding to the charged task of the operator.
  • the numeral 22 indicates a menu displayed when the operator in charge of operation must point out any one of symbols in the system diagram 5 displayed on the large display 1 by the mouse 12 .
  • any one of the information related to the pointed symbol such as data setting, monitoring video image, and inspection record, is displayed on the display at hand 10 .
  • the items in the menu 22 are arranged from the top to the bottom in an order based on frequency of selection by the operator in charge of operation.
  • the menu 42 is displayed by pointing out any one of the symbols in the system diagram 5 by the inspector.
  • the items in the menu 42 are the same as those of the menu 22 , but the items are arranged from the top to the bottom in an order based on frequency of selection by the inspector, that is, an order of monitoring video image, inspection record, and data setting.
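The frequency-ordered menus 22 and 42 can be sketched as a single item list sorted per task by selection counts; the selection history below is invented for illustration:

```python
from collections import Counter

ITEMS = ["data setting", "monitoring video image", "inspection record"]
selections = {"operation": Counter(), "inspection": Counter()}

def record_selection(task, item):
    selections[task][item] += 1

def menu_for(task):
    """Same items for every charged task, most frequently selected first."""
    return sorted(ITEMS, key=lambda item: -selections[task][item])

# Illustrative history: the operator mostly sets data, the inspector
# mostly watches the monitoring video image.
for item in ["data setting", "data setting", "monitoring video image"]:
    record_selection("operation", item)
for item in ["monitoring video image", "inspection record", "monitoring video image"]:
    record_selection("inspection", item)

print(menu_for("operation"))
print(menu_for("inspection"))
```

Because Python's sort is stable, items with equal counts keep their base order, so a new task with no history still gets a sensible default menu.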
  • FIG. 13 illustrates an example of changing operation permission based on the charged task.
  • Permission to operate the display control icon 3 for controlling display contents on the large display 1 is given only to the chief on duty.
  • the display control menu 6 is displayed.
  • the chief on duty can change the contents of the display on the large display 1 by selecting the display control menu 6 .
  • If the operator in charge of operation or the operator in charge of inspection points to items in the display control icon 3 or the display control menu 6 , the pointing is ignored.
  • The so-called voice plate hanger means a voice memorandum attached to the display on the large display 1 or the displays at hand 10 , 30 , 50 .
  • the plate hanger menu 23 is displayed.
  • the icon 24 is displayed on the display at hand 10 .
  • When RECORD in the plate hanger menu 23 is selected, voice transmitted from the microphone of the headset 14 is recorded.
  • the recording is stopped and the display of the plate hanger menu 23 is erased.
  • the recorded voice can be regenerated by clicking the icon 24 by operating, for example, a right button of the mouse 12 .
  • the voice is regenerated at the headset of the operator who has made the clicking. For instance, when the operator in charge of operation clicks the icon 24 by the mouse 12 at hand, the recorded voice is output to the headset 14 . On the other hand, when the operator in charge of inspection clicks the icon 24 by the mouse 32 at hand, the recorded voice is regenerated at the headset 34 .
  • the icon 24 can be placed at an arbitrary position of the display at hand 10 and the large display 1 by dragging.
  • the dragging of the icon 24 can be performed by moving the pointer 15 on the icon 24 and subsequent moving of the mouse 12 with pushing, for example, a left button of the mouse 12 .
  • FIG. 15 illustrates a moving manner of the icon 24 to the boiler in the system diagram 5 on the large display 1 by dragging.
  • Referring to FIGS. 16 - 20, a method for realizing the system 91 is explained.
  • A program for realizing the system 91 can be composed so as to be executed on any one of the workstations 2 , 11 , 31 , 51 , or on any several or all of them.
  • the event can be divided into three categories, such as kind of operation, button number, and person in charge.
  • the kind of operation includes the following items, and designates a kind of event.
  • the button number designates the button or the key which has generated the event.
  • the person in charge designates charged task of the operator who has generated the event.
  • the executing process can be divided into two categories such as routine and output.
  • the routine stores a process to be executed when an event is generated, and output designates an output apparatus which is used by the operator who must receive the output.
  • the above designation of the operator is performed by designating a kind of task charged to the operator. That means, when an operator in charge of operation is designated as a destination of an output, the output is transferred to the output apparatus which is used by the operator in charge of operation.
  • FIG. 17 illustrates a format 131 for designating an output destination.
  • Each of the bits in the format 131 corresponds to a respective charged task.
  • a bit corresponding to a person in charge to receive the output is designated as “1”, and a bit corresponding to a person in charge not to receive the output is designated as “0”.
  • the second bit and the third bit in the format 131 are designated as “1”, and other bits are “0”.
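The output-destination format 131 can be sketched as a small bitmask, one bit per charged task; the bit assignment below is hypothetical (the text only states that each bit corresponds to a charged task):

```python
TASK_BIT = {"operation": 0, "inspection": 1, "chief on duty": 2,
            "supervisor": 3, "general": 4}

def make_destination(*tasks):
    """Set the bit of every charged task whose operator should receive the output."""
    mask = 0
    for task in tasks:
        mask |= 1 << TASK_BIT[task]
    return mask

def receives(mask, task):
    """Does the operator with this charged task receive the output?"""
    return bool((mask >> TASK_BIT[task]) & 1)

# Second and third bits set to "1", the others "0", as in the example above:
mask = make_destination("inspection", "chief on duty")
print(format(mask, "05b"))
print(receives(mask, "operation"))
```

A single integer per executing-process entry is enough to fan the output out to any subset of operators.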
  • In step 140 , the input event queues of the workstations 11 , 31 , 51 are examined. If an event is stored in an input event queue, the event is taken out.
  • the event includes information such as an input device ID which generates the event, a button number which generates the event, and a location where the event is generated.
  • a table 120 (FIG. 9) is searched using the input device ID of the taken out event as a key to retrieve charged task of the operator who has generated the event.
  • step 142 a displayed object in the location where the event is generated is searched based on the event generated location.
  • step 143 If no displayed object exists at the event generated location (step 143 ), the operation returns to the step 140 and continues to process the next input event. If any displayed object exists at the event generated location (step 143 ), the operation goes to step 144 .
  • step 144 the input event items in the corresponding table 130 of the event/executing process for the displayed object which is searched in step 142 is examined whether any input items are matched with kind of operation, button number, and person in charge of the input event. If there are any event items matched with the input event (step 145 ), an output destination of corresponding executing process items is taken out, the table 121 (FIG.
  • step 145 When no event item matching to the input event is found in step 144 (step 145 ), the operation returns to the step 142 , and other object at the event generated location is searched. The above described processing is repeated until the plant monitoring and controlling system 91 is ended (step 148 ).
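The loop of steps 140-145 can be sketched as follows. All data shapes here (the event tuple, the table keys, the hit-test function) are illustrative assumptions rather than the patent's actual structures; table 120 appears as `device_task_table` and table 130 as `event_table`.

```python
from collections import namedtuple

# Hypothetical event record: which device fired it, which button,
# where on screen, and what kind of operation (press, release, ...).
Event = namedtuple("Event", "device_id button position kind")

def process_events(events, device_task_table, objects_at, event_table):
    """Sketch of steps 140-145: for each input event, look up the
    operator's charged task from the input device ID (table 120), find
    displayed objects at the event location (steps 142-143), and execute
    the first entry of the event/process table (table 130) matching the
    kind of operation, button number, and person in charge (steps 144-145)."""
    executed = []
    for event in events:                                   # step 140
        task = device_task_table[event.device_id]          # table 120 lookup
        for obj in objects_at(event.position):             # steps 142-143
            match = event_table.get((obj, event.kind, event.button, task))
            if match is not None:                          # steps 144-145
                executed.append(match)                     # run the process
                break                                      # next input event
    return executed
```

A real system would poll the queues until the system 91 ends (step 148); a finite event list stands in for that loop here.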
  • H is the number of pixels in a horizontal direction of the large display 1
  • V is the number of pixels in a vertical direction
  • h is the number of pixels in a horizontal direction of the display at hand 10
  • v is the number of pixels in a vertical direction of the display at hand 10 ; and q is a threshold of one pixel or several pixels.
  • referring to FIG. 20, a process flow of a method for realizing pointing by the mouse 12 on the display at hand 10 and the large display 1 is explained hereinafter.
  • as an initial setting, q ≦ cury ≦ v
  • the pointer is displayed at a position (curx, cury) on the display at hand 10 (step 162 ).
  • while cury > q (step 160 ), the pointer remains on the display at hand 10 .
  • the event processing is executed for the displayed object on the display at hand 10 (step 163 ).
  • when the mouse 12 is moved forward and cury becomes less than q, that is, cury < q, the pointer transfers to the large display 1 .
  • the event processing is executed for the displayed object on the large display 1 (step 168 ).
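The FIG. 20 flow can be sketched in a simplified form, assuming `curx`/`cury` are the mouse coordinates in the hand display's pixel space, that crossing the threshold q hands the pointer over to the large display, and that the horizontal position is scaled by the proportional mapping x:h = X:H of FIG. 4. Function and parameter names are hypothetical.

```python
def pointer_target(cury, curx, h, H, V, q=1):
    """Decide where the pointer is drawn (simplified sketch of FIG. 20).
    While cury > q (step 160) the pointer stays on the display at hand;
    once cury < q it transfers to the large display, entering at the
    bottom edge with the horizontal position scaled so that x:h = X:H."""
    if cury > q:
        return ("hand", curx, cury)        # steps 160, 162
    X = curx * H // h                      # proportional horizontal transfer
    # enter at the lowest point of the large display
    # (assuming y grows downward, the bottom row is V - 1)
    return ("large", X, V - 1)
```

For example, with a 1280-pixel-wide hand display and a 2560-pixel-wide large display, a pointer leaving at the horizontal midpoint enters the large display at its midpoint.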
  • the charged task of the operator is registered first. Then, by managing the correspondence between the registered charged task and the input/output devices, information corresponding to the charged task is displayed and the operation environment is set.
  • any attribute of the operator other than the charged task can also be used. For instance, name, age, order, class, rank, sex, mother language, or skillfulness can be used for the registration. Further, not only a single attribute but several attributes combined by logical expressions can be used for the registration.
  • the service can thus be provided with contents matched to various attributes of the operator.
  • a method of registering the attribute of the operator by selecting a menu is used.
  • alternatively, the attribute of the operator may be recognized by the plant monitoring and controlling system 91 itself.
  • for example, the operator sitting in front of the display 10 may be recognized by the operator's face, or by the operator's voiceprint input from a microphone.
  • in the above embodiment, the attribute of the operator is registered at the beginning of the operation.
  • alternatively, the operator may be asked for the attribute (a menu for selecting the attribute is displayed), or a process for recognizing the attribute may be started, at the moment when the system needs to know the attribute of the operator.
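Combining several attributes by logical expressions, as mentioned above, can be sketched as a small predicate evaluator. The condition encoding here (nested tuples with "and"/"or" over attribute = value tests) is purely an illustrative assumption.

```python
def matches(operator, condition):
    """Evaluate a simple attribute condition against an operator record.
    A condition is either a leaf (attribute, required_value) or a tuple
    ('and'/'or', cond1, cond2, ...) combining sub-conditions."""
    op, *terms = condition
    if op == "and":
        return all(matches(operator, t) for t in terms)
    if op == "or":
        return any(matches(operator, t) for t in terms)
    key, value = condition          # leaf: (attribute, required value)
    return operator.get(key) == value
```

A service rule could then be selected only for, say, operators whose charged task is maintenance and whose mother language is Japanese.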
  • in the above embodiment, a mouse is used for pointing on the large display 1 .
  • a laser beam pointer can also be used.
  • in this case, a pointing position on the large display 1 is determined by capturing video with a video camera in front of or behind the large display screen and processing the video image to determine the position of the laser beam.
  • when a plurality of laser pointers are used, the device ID is recognized by giving each laser pointer a beam of a different color and determining the color of the detected laser beam.
  • infrared pointers can also be used. In this case, the devices can be distinguished by using a different infrared frequency for each pointer.
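Distinguishing laser pointers by beam colour can be sketched as a nearest-colour lookup over the detected beam spot. The reference colours, tolerance, and distance metric below are assumptions; the patent only states that each pointer uses a beam of a different colour.

```python
def identify_pointer(rgb, device_colors, tol=60):
    """Map a detected beam-spot colour (an RGB triple) to a laser-pointer
    device ID by nearest reference colour; return None when no reference
    colour is within the tolerance."""
    best, best_dist = None, None
    for device_id, ref in device_colors.items():
        dist = sum((a - b) ** 2 for a, b in zip(rgb, ref))
        if best_dist is None or dist < best_dist:
            best, best_dist = device_id, dist
    if best_dist is not None and best_dist <= tol ** 2:
        return best
    return None
```

The same shape of lookup would apply to infrared pointers, with the detected modulation frequency in place of the colour.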
  • in the above embodiment, the pointer 15 moves between the display at hand 10 and the large display 1 as if the upper side of the display at hand 10 and the whole or a part of the lower side of the large display 1 were connected.
  • alternatively, the pointer 15 may be arranged so as to move from a lateral side (left side or right side) of the display at hand 10 .
  • the moving manner of the pointer 15 between the display at hand 10 and the large display 1 may be set depending upon the relative positions of the large display 1 and the display at hand 10 . Therefore, the operator can operate as if the large display 1 were located on an extension of the display at hand 10 , and a natural interface for the operator can be realized.
  • in the above embodiment, a conventional display apparatus is used as the display at hand 10 .
  • alternatively, a see-through display apparatus can be used as the display at hand 10 .
  • the see-through display is a translucent display; information displayed on it is visible superimposed on the background behind the display.
  • as an example of a see-through display apparatus, there is a see-through head-mounted display described in Proceedings of the ACM Symposium on User Interface Software and Technology, November 1991, ACM Press, pp. 9-17.
  • referring to FIG. 23, the second embodiment of the present invention, which uses the see-through display, is explained hereinafter.
  • an operator A uses a see-through display 1100 and another operator B uses a see-through display 1200 .
  • On the large display 1000 , only information which is shared by the operators A and B is displayed.
  • information necessary for only the operator A is displayed on the see-through display 1100
  • other information necessary for only the operator B is displayed on the see-through display 1200 .
  • a pointer 1110 which is moved by operating a mouse at hand of the operator A is displayed on the see-through display 1100
  • a pointer 1210 which is moved by operating a mouse at hand of the operator B is displayed on the see-through display 1200
  • a menu 1120 which is displayed when the operator A presses a button of the mouse is displayed on the see-through display 1100 .
  • the pointer, the menu, detailed information, and other items displayed on a see-through display are visible superimposed on the objects displayed on the large display. That is, an object displayed on the see-through display appears to the operator as if it were displayed on the large display. The operator can arbitrarily point at a displayed object on the large display with the pointer displayed on the see-through display.
  • the relationship between the display coordinates of the see-through displays 1100 , 1200 and those of the large display 1000 is maintained constant. That is, in a case such as mounting a see-through display on the head of the operator, a 3D tracking system is used for tracking the position and orientation of the see-through display, and the display coordinates of the see-through display are corrected according to its position relative to the large display 1000 .
  • a see-through display and a conventional display can be used together as displays at hand. That is, information which should be superimposed on the information displayed on the large display 1000 , such as a pointer for designating a position on the large display 1000 or a menu for operating a displayed object on the large display 1000 , is displayed on the see-through display, while other information which need not be superimposed on the objects displayed on the large display 1000 may be displayed on the conventional display.
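The display-routing rule of this embodiment (shared information on the large display 1000, operator-private items such as pointers, menus, and detailed data on that operator's see-through display) can be sketched as below; the item fields and display keys are illustrative assumptions.

```python
def route(item, displays):
    """Route a display item per the second embodiment: shared information
    goes to the large display, operator-private items (pointer, menu,
    detailed information) go to that operator's see-through display."""
    if item["shared"]:
        return displays["large"]
    return displays[item["operator"]]
```

For example, operator A's pointer 1110 and menu 1120 would route to the see-through display 1100, while the shared system diagram routes to the large display 1000.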
  • a process matched to the operator who has operated can be executed.
  • An attribute of the operator is registered in correspondence with the input means used by the operator; when the operator operates the input means, a process matched to that operator can be executed by examining the registered attribute of the operator and selecting and executing a process matched to the attribute.
  • An output destination can be selected corresponding to the operator who has operated, so as not to distract the other operators from their tasks.
  • An attribute of the operator is registered in correspondence with the input means used by the operator; when the operator operates the input means, a result can be output without distracting the other operators by examining the registered attribute of the operator and selecting an output destination for the processing result matched to the attribute.
  • An operating environment matched to the operator can be set.
  • An attribute of the operator is registered in correspondence with the input means used by the operator; when the operator operates the input means, an operating environment matched to that operator can be provided by examining the registered attribute of the operator and setting the operating environment matched to the attribute.


Abstract

An interactive computer system with plural displays includes a first computer, having an input device and an output including a display, and a second computer, coupled to the first computer, having an output including a display.
A voice recorder is provided to create an icon representing a recorded voice, and the icon is displayed on the display of the first computer. The input device of the first computer, which may be a mouse, is adapted to drag the icon from the display of the first computer to the display of the second computer, and to reproduce the recorded voice in response to a pointer displayed on the displays and controlled by the input device.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This is a continuation of application Ser. No. 09/619,647 filed Jul. 19, 2000, which is a continuation of application Ser. No. 09/374,263 filed Aug. 16, 1999, U.S. Pat. No. 6,100,857, which is a continuation of application Ser. No. 08/969,313 filed Nov. 13, 1997, U.S. Pat. No. 5,969,697, which is a continuation of application Ser. No. 08/230,369 filed Apr. 20, 1994, abandoned.[0001]
  • BACKGROUND OF THE INVENTION
  • (1) Field of the Invention [0002]
  • The present invention relates to an interactive processing apparatus for interactive processing of plural displays, and a method thereof. [0003]
  • (2) Description of the Prior Art [0004]
  • Current monitoring and controlling systems have a large display installed in front of the operators in order to display overview information, such as a system configuration map of the total system and alarms indicating that something unusual is occurring, allowing all the operators to grasp the condition of the system at a glance at any time. On the other hand, a display at hand prepared for each operator displays more detailed information. The amount of detailed information displayed on each display at hand is enormous; it is not rare for it to reach hundreds of images in a large-scale system. [0005]
  • The operators monitor using both the large display and their displays at hand. The operators grasp the entire system state by watching the overview information on the large display, and when an abnormal condition is detected, they examine more detailed data using their own displays at hand and perform the necessary control operations. [0006]
  • However, because information displayed on the large display and information shown on the displays at hand are controlled independently, a conventional system requires a complex operation to present related information on both in connection with each other. For instance, when a warning lamp blinks on the large display, the operators must retrieve the image displaying control data for the warning from hundreds of images by selecting menus repeatedly. Therefore, there has been a problem of delayed response to an emergency such as the occurrence of an abnormal condition or an accident. [0007]
  • SUMMARY OF THE INVENTION
  • (1) Objects of the Invention: [0008]
  • One of the objects of the present invention is to provide a man-machine interface which is capable of referring to related detailed information just by designating an object on the large display. For instance, a man-machine interface wherein detailed information on a warning and control data related to it are displayed on a display at hand just by pointing at a blinking warning on the large display, and control data and setting devices related to an apparatus are displayed on a display at hand just by pointing at the apparatus in a system configuration map on the large display. [0009]
  • When realizing such a man-machine interface as described above, an important point to consider is that the large display is shared by a plurality of operators. A monitoring and controlling system is operated by the collaboration of plural operators, each of whom is in charge of a different task, such as an operator in charge of operation, an operator in charge of maintenance and inspection, and a chief on duty controlling the total operation. Accordingly, the large display is shared by operators performing different tasks simultaneously, unlike a display at hand, which is prepared for an individual operator. Therefore, the above described interface must satisfy the following requirements: [0010]
  • (1) No Disturbance to Other Operator's Operation: [0011]
  • There is a possibility of hiding information being watched by other operators when information necessary for only a specific operator is displayed arbitrarily on the large display. [0012]
  • (2) Simple Retrieval of Information Necessary for Individual Tasks by Respective Operators: [0013]
  • Necessary information differs depending on contents of the charged task. For example, when a warning light indicating an abnormal condition of a boiler blinks, an operator in charge of operation examines control data such as a flow rate of fuel, while an operator in charge of maintenance examines an inspection record of the boiler. Accordingly, it is necessary for operators to be able to quickly retrieve information necessary for them without being distracted by information for others. [0014]
  • (3) An Operating Environment Suitable for Tasks Assigned to each Operator: [0015]
  • Commands used frequently and permission for operation differ depending on the task charged to respective operators. Accordingly, it is desirable that the operating environment such as a structure of menu and an operable range of operation can be customized for respective operators. [0016]
  • The object of the present invention is to provide a man-machine interface which satisfies the above requirements. [0017]
  • (2) Methods of Solving the Problems: [0018]
  • In accordance with the present invention, the above described objects can be realized by providing, to an interactive processing apparatus having a plurality of input means and a plurality of output means, a registering means for registering an attribute of a respective operator to an input means, a process selecting means for selecting process contents based on the attribute in response to a process request from the input means, and an executing means for executing a process selected by the process selecting means and outputting to an output means selected based on the attribute. [0019]
  • An operator registers his own attribute, for example, his charged task, to his input means using the registering means. When the operator requests a process, such as displaying related information or a menu, from the input means, the process selecting means examines the operator's attribute registered for the input means and selects a process corresponding to the attribute. The executing means executes the process selected by the process selecting means and outputs the result of the execution to an output device matched to the attribute, for example, the display at hand of the operator. By executing processes based on the operator's attribute, displaying only the images necessary for the operator and providing a convenient operating environment for the operator can be realized. Furthermore, the operator can execute a necessary process without disturbing the other operators' operation, because the output device is selected based on the operator's attribute. [0020]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic overall view indicating the structure of the plant monitoring and controlling system 91 in accordance with the present invention. [0021]
  • FIG. 2 is an example of an image displayed on a large display 1. [0022]
  • FIG. 3 is an example of an image displayed on a display at hand 10. [0023]
  • FIG. 4 is an example of the moving manner of a pointer between the display at hand 10 and the large display 1. [0024]
  • FIG. 5 is an example of the moving manner of a pointer between the display at hand 10 and the large display 1. [0025]
  • FIG. 6 is an example of the moving manner of a pointer between the display at hand 10 and the large display 1. [0026]
  • FIG. 7 is an example of an image at registration of a charged task. [0027]
  • FIG. 8 is a problem analysis diagram (PAD) indicating steps for registering a charged task. [0028]
  • FIG. 9 is a drawing indicating a corresponding table of input device identification (ID) and registered charged task. [0029]
  • FIG. 10 is a drawing indicating a corresponding table of registered charged task and output device identification (ID). [0030]
  • FIG. 11 is an example of images displayed on the large display 1 and the displays at hand 10, 30. [0031]
  • FIG. 12 is an example of an image displayed on the large display 1. [0032]
  • FIG. 13 is an example of an image displayed on the large display 1. [0033]
  • FIG. 14 is an example of an image displayed on the display at hand 10. [0034]
  • FIG. 15 is an example of images displayed on the large display 1 and the display at hand 10. [0035]
  • FIG. 16 is a drawing indicating a corresponding table of input events and executing processes. [0036]
  • FIG. 17 is an example of a format for designating an output device. [0037]
  • FIG. 18 is a flow chart indicating a process flow at a pointing operation. [0038]
  • FIG. 19 is a drawing indicating a realizing method for pointing on the large display 1 and the display at hand. [0039]
  • FIG. 20 is a problem analysis diagram indicating a process flow of a method for realizing pointing on the display at hand 10 and the large display 1. [0040]
  • FIG. 21 is an example of the moving manner of a pointer between the display at hand 10 and the large display 1. [0041]
  • FIG. 22 is a schematic drawing indicating an example of a system structure of the present invention. [0042]
  • FIG. 23 is a schematic drawing indicating another embodiment of the present invention. [0043]
  • DETAILED DESCRIPTION
  • Embodiments of the present invention are explained hereinafter with reference to the drawings. [0044]
  • FIG. 1 indicates the total structure of the plant monitoring and controlling system 91, which is one of the embodiments of the present invention. The numeral 1 indicates a large display whereon overview information on the whole plant (system diagram, main warnings, important control data, main monitoring video image, etc.) is displayed. The display on the large display 1 is driven by a workstation 2. Each of the displays 10, 30, 50 is placed at hand of a respective operator who is engaged in operation of the plant. Hereinafter, the displays 10, 30, 50 are called in general displays at hand. The operators grasp the status of the whole plant by watching the overview information displayed on the large display 1, examine detailed data using their individual displays at hand if an abnormal symptom is found, and perform a setting operation if necessary. The displays on the respective displays 10, 30, 50 are driven by workstations 11, 31, and 51, respectively. Mice 12, 32, 52, keyboards 13, 33, 53, and headsets 14, 34, 54 are connected to the workstations 11, 31, and 51, respectively. The operators point at positions on the displays at hand 10, 30, 50 and on the large display 1 using the mice. The headset is a headphone having a microphone; the operator hears sound output from the system and inputs sound signals to the system using it. Furthermore, the workstations 2, 11, 31, 51 are mutually connected via a local area network 90, and mutual information exchange is possible. Various computers for controlling and controllers for apparatus (not shown in the drawing) are connected to the local area network 90 directly or indirectly via other networks, so the workstations 2, 11, 31, 51 can access various control information on the plant through the local area network 90. [0045]
  • In FIG. 1, three displays at hand 10, 30, 50 are illustrated. However, the number of displays can naturally be changed depending on the number of operators. Although FIG. 1 indicates one large display, a plurality of large displays can also be used. A large display composed by joining a plurality of displays in a seamless manner may also be used. Furthermore, providing a speaker and a microphone to each of the operators instead of the headphone may be acceptable. [0046]
  • A concept of the present invention is explained referring to FIG. 22. In FIG. 22, the workstation 200 and the workstation 250 are connected to the network 230, and the two workstations can exchange information arbitrarily with each other. To the workstation 250, other workstations having the same structure as the workstation 250 are connected (not shown in FIG. 22). The process executing part 201 of the workstation 200 outputs information to the output apparatus of the other workstations connected to the network 230, as well as displaying information on the output apparatus 205, via the output processing part 202. [0047]
  • To the workstation 250, an attribute is registered at the start of the operation. The attribute registration processing part 257 displays a menu for the attribute registration on the output apparatus 256 via the input/output processing part 252. When a menu item is selected with the input apparatus 255, the attribute registration processing part 257 registers the selected attribute in the attribute memory part 253 and, further, stores the corresponding relationship between the selected attribute and the output apparatus 256 in the attribute memory part 203 of the workstation 200 via the network 230. [0048]
  • Input information from the input apparatus 255 is transmitted to the process executing part 251 via the input/output processing part 252. When the input information designates a position on a display of the output apparatus 256, the process executing part 251 executes a responding process and outputs the results of the execution to the output apparatus 256. On the other hand, when the input information designates a position on a display of the output apparatus 205, the process executing part 251 transmits the input information, together with an attribute read out from the attribute memory part 253, to the process executing part 201 of the workstation 200. The process executing part 201 executes a process corresponding to the transmitted input information and the attribute, and transmits the results of the execution to an output apparatus corresponding to the transmitted attribute, which is read out from the attribute memory part 203. [0049]
  • FIG. 2 illustrates an example of the display on the large display 1. On the large display 1, overview information on the plant, such as a system diagram 5 and warnings 4, is displayed. When an abnormal condition of the plant is detected, a warning 4 related to the abnormal condition blinks. Especially when the abnormal condition is serious, a warning sound is output from the headsets 14, 34, 54 in addition to the blinking of the warnings 4. The numeral 3 designates a display controlling menu for controlling the display on the large display 1. By selecting the display controlling menu 3, the large display can display various images such as (1) prior displayed objects, (2) subsequent displayed objects, (3) weather information, (4) monitoring video images, and (5) various system diagrams. Each of the pointers 15, 35, 55 moves with its respective mouse and is colored differently from the others so as to facilitate identification. The pointers 15, 35, 55 may naturally have shapes different from one another, or may carry information on the operator's attribute, such as the name of the task or a personal name, instead of color coding. The pointers 15, 35, 55 can be transferred continuously between the displays at hand 10, 30, 50 and the large display 1, respectively. The transferring of the pointers will be explained in detail referring to FIGS. 4-6 later. [0050]
  • FIG. 3 illustrates an example of the display on the display at hand 10. Hereinafter, the embodiment is explained taking the display at hand 10 as an example, but the explanation applies to the other displays at hand 30, 50 unless an exception is especially mentioned. FIG. 3 indicates an example of detailed information on the plant displayed on the display at hand 10. On the display at hand 10, a plate hanger icon 17 and a person in charge icon 16 are displayed. The plate hanger icon 17 provides a function to attach a memorandum by voice to the display at hand 10 and the large display 1. The person in charge icon 16 provides a function to register the charged task (operator, inspector, chief on duty, etc.) of the operator who uses the input/output apparatus at hand, such as the display at hand 10, the mouse 12, the keyboard 13, and the headset 14. On the person in charge icon 16, the registered charged task, that is, the task charged to the operator who is using the display 10 at the time, is displayed. In the case shown in FIG. 3, “operator” is displayed on the icon 16, which means an operator in charge of operation is registered. [0051]
  • Next, a method for transferring the pointer 15 between the large display 1 and the display at hand 10 is explained referring to FIGS. 4-6. In the present embodiment, the pointer moves continuously from the display at hand to the large display 1 just by moving the mouse 12 forward, as shown in FIG. 4. That is, when the pointer is located on the display at hand 10, the pointer 15 moves toward the upper portion of the display at hand 10 as the mouse 12 is moved forward, and finally reaches the uppermost point of the display at hand 10. If the mouse 12 is moved forward further, the pointer 15 transfers to the lowest point of the large display 1, and moves upward toward the top of the large display as the mouse 12 is moved forward still further. On the contrary, if the mouse 12 is moved backward while the pointer 15 is displayed on the large display 1, the pointer 15 moves downward to the lowest point of the large display 1. If the mouse 12 is moved backward further, the pointer 15 transfers to the uppermost point of the display at hand 10. The display position of the pointer 15 at the moment when it transfers from the display at hand 10 to the large display 1 is decided as shown in FIG. 4. That is, putting x for the horizontal position of the pointer 15 at the moment of transferring from the display at hand 10 to the large display 1, h for the number of pixels in the horizontal direction of the display at hand 10, X for the horizontal position of the pointer 15 at the moment of entering the large display 1, and H for the number of pixels in the horizontal direction of the large display 1, X is decided so that x:h=X:H. Similarly, when the pointer transfers from the large display 1 to the display at hand 10, x is decided so that x:h=X:H. [0052]
  • When the large display is composed of a plurality of displays, the display position of the pointer 15 is decided in the same manner as explained referring to FIG. 4, on the assumption that the plurality of displays are joined in a seamless manner. FIG. 5 illustrates a case where the large display is composed of two displays 6, 7. Putting d for the width of each of the two displays, X or x can be obtained by replacing X with (X1+d) and H with 2d. [0053]
  • Referring to FIG. 4, a method for transferring the pointer 15 as if the top portion of the display at hand 10 were joined to the total span of the lower portion of the large display 1 has been explained. However, a method for transferring the pointer 15 as if the top portion of the display at hand 10 were joined to a partial span of the lower portion of the large display can also be useful. The range of the partial lower span of the large display to be joined to the display at hand 10 can be decided in consideration of the relative arrangement of the large display 1 and the display at hand 10. That is, when the display at hand 10 is arranged toward the left side of the large display 1, X and H are substituted by (X−d1) and (H−d1−d2), respectively, so that the pointer 15 transfers within the corresponding span of the lower portion of the large display. Here, d1 is the distance from the left side of the large display 1 to the left side of the display at hand 10, and d2 is the distance from the right side of the large display 1 to the right side of the display at hand 10. On the contrary, when the display at hand 10 is arranged at the right side of the large display 1, the pointer 15 is arranged so as to transfer within a right side range of the bottom of the large display 1. [0054]
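The horizontal handover described above can be sketched as a single mapping function. With d1 = d2 = 0 it reduces to the full-span rule x:h = X:H; non-zero d1/d2 implement the partial-span substitution of (X−d1) and (H−d1−d2). The integer pixel arithmetic is an assumption.

```python
def transfer_x(x, h, H, d1=0, d2=0):
    """Horizontal pointer position on entering the large display.
    x is the position on the display at hand (h pixels wide); H is the
    large display's width. d1/d2 are the left/right margins excluded
    from the joined span; solving (X - d1) : (H - d1 - d2) = x : h
    for X gives the expression below."""
    return d1 + x * (H - d1 - d2) // h
```

For instance, the midpoint of a 1280-pixel hand display maps to the midpoint of a 2560-pixel large display when the full span is joined, and to a point inside the partial span when margins are set.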
  • The image size (the number of pixels) of the pointer 15 may be changed on the large display 1 from that on the display at hand 10. Especially when the display at hand 10 and the large display 1 are installed far apart, making the display size of the pointer 15 larger facilitates identification. For instance, the pointer 15 is displayed with 16×16 pixels on the display at hand 10, and with 36×36 pixels on the large display 1. [0055]
  • With the above selection, the pointer becomes easily recognizable even on the far away large display. [0056]
  • Advantages of the above described method are as follows: [0057]
  • (1) a position on the display at hand 10 and the large display 1 can be designated continuously without changing grip on the pointing device; [0058]
  • (2) interactive operation of the large display can be performed with the same feeling as operation of the display at hand 10; [0059]
  • (3) owing to advantage (2) above, the operation is easy to learn. [0060]
  • At the start of operation of the system of the present embodiment, an operator registers his own charged task to the system. The system provides services based on the registered charged task. The services include, for example, arranging an operation environment suitable for the charged task, making it easy to retrieve only the information necessary for the charged task, and setting a permission for operation for each charged task. Here, the operation environment means the items in a menu, the order of their arrangement, the setting of defaults, the setting of permissions for operation, etc. [0061]
  • A method for registering the charged task is explained hereinafter with reference to FIGS. 7 and 8. FIG. 7 is an example of the image displayed at registration of a charged task. FIG. 8 is a problem analysis diagram (PAD) indicating the steps for registering a charged task. When the charged task icon 16 on the display at hand 10 is indicated by the mouse 12 (step 100), the charged task selecting menu 18 is displayed on the display at hand 10. The items in the charged task selecting menu are as follows: [0062]
  • Operation: Selected when a service for a person in charge of operation is desirable. [0063]
  • Inspection: Selected when a service for a person in charge of inspection is desirable. [0064]
  • Chief on duty: Selected when a service for a person in charge of chief on duty is desirable. [0065]
  • Supervisor: Selected when a service for a person responsible for all tasks is desirable. This item is selected, for example, when adjusting the system, or when the system is operated by only one person. [0066]
  • General: Selected for a service within a range of tasks which does not cause serious disturbance to the system even if an erroneous operation is executed. This item is selected, for example, when the system is operated for on-the-job training by a person who is not familiar with it. [0067]
  • When a desired item in the charged task selecting menu 18 is selected by the mouse 12 (step 101), a password input region 19 is displayed. When the password assigned to the charged task is input (step 102), the charged task for the operator is registered to the system, and the registered charged task is displayed in the charged task icon 16. When the charged task must be changed, the same procedure as that of the registering (FIG. 8) is performed. [0068]
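The registration flow above (steps 100 to 102) can be sketched as follows. All passwords, device IDs, and function names here are illustrative assumptions for the sketch, not details taken from the embodiment:

```python
# Hypothetical sketch of the charged-task registration flow of FIG. 8.

PASSWORDS = {                      # password assigned to each charged task (assumed)
    "operation": "op-pass",
    "inspection": "insp-pass",
    "chief on duty": "chief-pass",
    "supervisor": "super-pass",
    "general": "",                 # assumed: no password for the general task
}

# stands in for table 120: input device ID -> charged task of its operator
input_device_task = {}

def register_charged_task(input_device_id, selected_task, password):
    """Register the operator's charged task after verifying the password."""
    if PASSWORDS.get(selected_task) != password:
        return False               # wrong password: registration rejected
    input_device_task[input_device_id] = selected_task
    return True                    # afterwards shown in charged task icon 16
```

Changing the charged task simply repeats the same call, overwriting the entry for that input device.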
  • In the system of the present embodiment, the correspondence between the charged task of an operator and the input/output devices used by that operator is controlled using the tables shown in FIGS. 9 and 10. In FIG. 9, a table 120 indicating the correspondence between input device IDs, such as mice 12, 32, 52, keyboards 13, 33, 53, and headsets 14, 34, 54, and the charged tasks of the operators who use them is utilized for retrieving the charged task of an operator, with the input device ID as a key. [0069]
  • In FIG. 10, a table 121 indicating the correspondence between the charged tasks and the output device IDs, such as displays at hand 10, 30, 50, and headsets 14, 34, 54, used by the operators is utilized for retrieving the output devices used by an operator, with the charged task as a key. [0070]
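A minimal sketch of tables 120 and 121 as dictionaries; the device IDs and task names are assumed for illustration:

```python
# Table 120 (FIG. 9): input device ID -> charged task of the operator using it
table_120 = {
    "mouse-12": "operation", "keyboard-13": "operation", "headset-14": "operation",
    "mouse-32": "inspection", "keyboard-33": "inspection", "headset-34": "inspection",
}

# Table 121 (FIG. 10): charged task -> output device IDs used by that operator
table_121 = {
    "operation": ["display-10", "headset-14"],
    "inspection": ["display-30", "headset-34"],
}

def task_of(input_device_id):
    """Retrieve the charged task, using the input device ID as a key."""
    return table_120[input_device_id]

def output_devices_for(task):
    """Retrieve the output devices, using the charged task as a key."""
    return table_121[task]
```

Composing the two lookups routes the result of any input event back to the devices of the operator who caused it.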
  • Referring to FIG. 11, an example of how the plant monitoring and controlling system 91 is used is explained hereinafter. [0071]
  • When an operator points out an image displayed on the large display 1 by a mouse at hand, detailed information related to the pointed image and necessary for the task charged to the operator is displayed on the operator's display at hand 10, 30, 50. FIG. 11 illustrates a status wherein the display at hand 10 is used by an operator in charge of operation, and the display at hand 30 is used by an operator in charge of inspection. When the operator in charge of operation points out a boiler in a system diagram displayed on the large display 1 by the mouse 12, a display 20 for setting control data for the boiler is displayed on the display at hand 10, and operation of the boiler becomes possible. On the other hand, when the operator in charge of inspection points out the boiler in the same system diagram displayed similarly on the large display 1, a monitoring video image of the boiler at the plant site is displayed on the display at hand 30, and the condition of the boiler at the plant site can be inspected. [0072]
  • FIG. 11 shows an example of displaying information 22 on the displays at hand 10, 30, 50 in response to pointing on the large display 1. However, the information can also be output as sound. Even when the information is output as sound, it is output only to the person who needs it. For instance, when the operator in charge of operation points out a display on the large display 1 by the mouse 12 at hand, the information related to the display is output as sound to the headset 14 provided to the operator in charge of operation. Furthermore, not only information, but also sound feedback to operations on the large display 1 is output only to the operator concerned. For instance, when a sound signal is fed back at every pointing, the feedback is output only to the headset 14 of the operator who has pointed, and not to the headsets of the other operators. That is, when the operator in charge of operation points at a display on the large display 1 by the mouse 12 at hand, a sound signal is output to the headset 14 provided to the operator in charge of operation. [0073]
  • An error message for an erroneous operation on the large display 1 is also output to the display at hand 10 or the headset 14 only of the operator who performed the operation. Of course, an error message which must be referred to other operators is output to the other operators as well. [0074]
  • As explained above, information related to the information displayed on the large display 1 can be referred to easily by pointing out the display with the mouse 12 at hand. And, since the information is output only to the output device of the operator who pointed, the operation does not distract the other operators. The large display is shared by many operators. Therefore, if information necessary only for a specific operator were displayed on the large display 1, it might hide information being watched by other operators. In the case of sound output, if the sound were output loudly enough to reach all operators, it might distract operators who do not need the information. Furthermore, by displaying only information selected to correspond to the charged task of the operator who pointed, the operator can easily access the information necessary only for himself without being distracted by information for other operators. [0075]
  • The large display 1 is used commonly by a plurality of operators who are in charge of different tasks. Because the operation environment suitable for performing each task differs, the plant monitoring and controlling system 91 provides an operation environment corresponding to the charged task of each operator who uses the large display 1 for interactive operation. [0076]
  • FIG. 12 illustrates an example of changing the arrangement order of menu items corresponding to the charged task of the operator. The numeral 22 indicates a menu displayed when the operator in charge of operation points out any one of the symbols in the system diagram 5 displayed on the large display 1 by the mouse 12. By selecting from the menu 22, any one of the pieces of information related to the pointed symbol, such as data setting, monitoring video image, and inspection record, is displayed on the display at hand 10. Here, the items in the menu 22 are arranged from top to bottom in order of frequency of selection by the operator in charge of operation. On the other hand, the menu 42 is displayed by pointing out any one of the symbols in the system diagram 5 by the inspector. The items in the menu 42 are the same as those of the menu 22, but they are arranged from top to bottom in order of frequency of selection by the inspector, that is, in the order of monitoring video image, inspection record, and data setting. [0077]
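The per-task menu ordering of FIG. 12 can be sketched as a sort over selection frequencies; the counts below are made-up numbers used only to illustrate the idea:

```python
# Same menu items for every task; only the order differs per charged task.
MENU_ITEMS = ["data setting", "monitoring video image", "inspection record"]

# Hypothetical per-task selection counts (assumed values).
selection_counts = {
    "operation":  {"data setting": 40, "monitoring video image": 25, "inspection record": 5},
    "inspection": {"monitoring video image": 50, "inspection record": 30, "data setting": 3},
}

def menu_for(task):
    """Return the menu items ordered top-to-bottom by that task's selection frequency."""
    counts = selection_counts[task]
    return sorted(MENU_ITEMS, key=lambda item: counts[item], reverse=True)
```

With these counts, the operator in charge of operation sees data setting first, while the inspector sees monitoring video image first, matching the two orders described for menus 22 and 42.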
  • FIG. 13 illustrates an example of changing operation permission based on the charged task. Permission to operate the display control icon 3 for controlling the display contents on the large display 1 is given only to the chief on duty. When the chief on duty, who has registered his charged task at the display at hand 10, points out the display control icon 3 by the mouse 12 at hand, the display control menu 6 is displayed. The chief on duty can change the contents of the display on the large display 1 by selecting from the display control menu 6. However, even if the operator in charge of operation or the operator in charge of inspection points out items in the display control icon 3 and display control menu 6, the pointing is ignored. [0078]
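The permission rule of FIG. 13 amounts to a per-object set of permitted charged tasks; the names below are assumptions for the sketch:

```python
# Displayed object -> set of charged tasks permitted to operate it (assumed names).
PERMITTED_TASKS = {"display-control-icon-3": {"chief on duty"}}

def handle_pointing(displayed_object, operator_task):
    """Return the resulting action, or None when the pointing is ignored."""
    allowed = PERMITTED_TASKS.get(displayed_object, set())
    if operator_task in allowed:
        return "show display control menu 6"
    return None  # pointing by a non-permitted operator is neglected
```

The same check applies to any object whose operation is restricted: unknown objects have an empty permitted set, so all pointing at them is ignored.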
  • In the above description, a case is explained in which information is output to the operator in correspondence with the operator's task. However, there may be cases wherein information, such as a warning, is output to the operator by the system on its own. Even in this case, the information is output only to the operator in charge of the task which needs the information. For example, a warning sound for a warning which can be handled only by the operator in charge of operation is output only to the headset 14 of the operator in charge of operation. [0079]
  • In a case when the large display 1 is too large to fit within the operator's visual field, there is a possibility that the operator fails to notice information which is displayed outside his visual field. To prevent such a case from occurring, a sound is output simultaneously with the display on the large display 1 so as to indicate the display position. The operator becomes aware that new information is displayed by the sound, without watching the large display 1. Further, because the sound is output so as to indicate the display position, the operator can be aware of the approximate position of the displayed information. When the sound is output simultaneously with the display, the sound is output only to the operator in charge of the task which requires the displayed information. For instance, when information relating to operation is displayed, the sound is output only to the headset 14 of the operator in charge of operation. [0080]
  • Referring to FIGS. 14 and 15, a plate hanger by voice is explained hereinafter. The plate hanger by voice, as it is called here, is a memorandum by voice hung over a display object on the large display 1 or the displays at hand 10, 30, 50. When the plate hanger icon 17 is pointed at by the mouse 12 as shown in FIG. 14, the plate hanger menu 23 is displayed. By selecting the item GENERATE of the plate hanger menu 23, the icon 24 is displayed on the display at hand 10. Subsequently, by selecting the item RECORD of the plate hanger menu 23, voice transmitted from a microphone of the headset 14 is recorded. By selecting the item END of the plate hanger menu 23 after finishing the voice input, the recording is stopped and the display of the plate hanger menu 23 is erased. The recorded voice can be reproduced by clicking the icon 24 with, for example, the right button of the mouse 12. The voice is reproduced at the headset of the operator who made the click. For instance, when the operator in charge of operation clicks the icon 24 by the mouse 12 at hand, the recorded voice is output to the headset 14. On the other hand, when the operator in charge of inspection clicks the icon 24 by the mouse 32 at hand, the recorded voice is reproduced at the headset 34. [0081]
  • The icon 24 can be placed at an arbitrary position on the display at hand 10 or the large display 1 by dragging. The dragging of the icon 24 is performed by moving the pointer 15 onto the icon 24 and then moving the mouse 12 while pressing, for example, the left button of the mouse 12. FIG. 15 illustrates the manner of moving the icon 24 onto the boiler in the system diagram 5 on the large display 1 by dragging. [0082]
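The behavior of the voice plate hanger in FIGS. 14 and 15 can be sketched as a small object; the class, display names, and the use of a list as a stand-in headset are assumptions of this sketch:

```python
# Sketch of the voice plate hanger: record a voice memo into an icon,
# drag the icon to any display, and play it back only at the headset
# of whichever operator clicked it.

class PlateHanger:
    def __init__(self):
        self.voice = None                        # recorded audio data
        self.position = ("display-10", 0, 0)     # (display, x, y)

    def record(self, audio_from_microphone):
        self.voice = audio_from_microphone       # item RECORD of menu 23

    def drag_to(self, display, x, y):
        self.position = (display, x, y)          # e.g. onto the boiler symbol

    def play(self, clicking_operator_headset):
        # reproduce the voice only at the clicker's own headset
        clicking_operator_headset.append(self.voice)

memo = PlateHanger()
memo.record("check boiler pressure")             # assumed memo content
memo.drag_to("large-display-1", 320, 200)
headset_14 = []                                  # stands in for headset 14
memo.play(headset_14)
```

Because `play` receives the clicker's headset, the same icon plays at headset 14 for the operator in charge of operation and at headset 34 for the inspector, as the text describes.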
  • Referring to FIGS. 16-20, a method for realizing the system 91 is explained. [0083]
  • A program for realizing the system 91 can be composed so as to be executed in any one of the workstations 2, 11, 31, 51, or in any several or all of the workstations 2, 11, 31, 51. [0084]
  • In a corresponding table 130 of events/executing processes, which is controlled for every displayed object as shown in FIG. 16, an event is described by three fields: kind of operation, button number, and person in charge. The kind of operation includes the following items, and designates the kind of event. [0085]
  • (1) Button push down: the kind of event generated by pushing down a mouse button. [0086]
  • (2) Button release: the kind of event generated by releasing a mouse button. [0087]
  • (3) Key push down: the kind of event generated by pushing down a key on the keyboard. [0088]
  • (4) Key release: the kind of event generated by releasing a key on the keyboard. [0089]
  • The button number designates the button or the key which has generated the event. The person in charge designates charged task of the operator who has generated the event. [0090]
  • The executing process is divided into two fields, routine and output. The routine field stores a process to be executed when the event is generated, and the output field designates the output apparatus used by the operator who must receive the output. The designation of the operator is performed by designating the kind of task charged to the operator. That is, when an operator in charge of operation is designated as the destination of an output, the output is transferred to the output apparatus which is used by the operator in charge of operation. [0091]
  • FIG. 17 illustrates a format 131 for designating an output destination. Each bit in the format 131 corresponds to a respective charged task. A bit corresponding to a person in charge who is to receive the output is set to “1”, and a bit corresponding to a person in charge who is not to receive the output is set to “0”. For example, when an output must be transmitted to both an operator and an inspector, the second bit and the third bit in the format 131 are set to “1”, and the other bits to “0”. [0092]
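The one-bit-per-task destination format 131 can be sketched with a simple bitmask. The bit ordering below is an assumption inferred from the text, which places the operator and inspector at the second and third bits:

```python
# Assumed bit assignment for format 131: one bit per charged task.
TASK_BITS = {"chief on duty": 0, "operation": 1, "inspection": 2,
             "supervisor": 3, "general": 4}

def encode_destinations(tasks):
    """Set the bit of every charged task that must receive the output."""
    mask = 0
    for task in tasks:
        mask |= 1 << TASK_BITS[task]
    return mask

def decode_destinations(mask):
    """Recover the set of charged tasks whose bit is set to 1."""
    return {task for task, bit in TASK_BITS.items() if mask & (1 << bit)}
```

Encoding the operator and inspector together yields `0b110`: the second and third bits are 1 and all other bits are 0, as in the example in the text.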
  • Referring to FIG. 18, a process flow when a displayed object on the [0093] large display 1 is pointed is explained hereinafter.
  • In the event input step 140, the input event queues of the workstations 11, 31, 51 are examined. If an event is stored in an input event queue, the event is taken out. The event includes information such as the ID of the input device which generated the event, the button number which generated the event, and the location where the event was generated. In step 141, the table 120 (FIG. 9) is searched using the input device ID of the taken-out event as a key to retrieve the charged task of the operator who generated the event. In step 142, a displayed object at the location where the event was generated is searched for based on the event generation location. [0094]
  • If no displayed object exists at the event generation location (step 143), the operation returns to step 140 and continues to process the next input event. If a displayed object exists at the event generation location (step 143), the operation goes to step 144. In step 144, the event items in the corresponding table 130 of events/executing processes for the displayed object found in step 142 are examined to determine whether any of them match the kind of operation, button number, and person in charge of the input event. If there is an event item matching the input event (step 145), the output destination of the corresponding executing process item is taken out, the table 121 (FIG. 10) is searched using the charged task stored in the output destination as a key, an output device ID is taken out, and that output device is set as the output destination for routine execution (step 146). Subsequently, the routine stored in the routine item of the executing process item is executed (step 147). [0095]
  • When no event item matching the input event is found in step 144 (step 145), the operation returns to step 142, and another object at the event generation location is searched for. The above described processing is repeated until the plant monitoring and controlling system 91 is ended (step 148). [0096]
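The flow of steps 140 to 148 can be condensed into a dispatch function. The tables, the object lookup, and the routine below are stand-ins assumed for the sketch, not the embodiment's actual data:

```python
# Condensed sketch of the event-processing flow of FIG. 18 (steps 140-148).

table_120 = {"mouse-12": "operation"}          # input device ID -> charged task
table_121 = {"operation": "display-10"}        # charged task -> output device ID

def find_objects_at(location):                 # step 142 (stand-in for the search)
    return [{"event": ("button push down", 1, "operation"),
             "routine": lambda out: out.append("boiler data setting"),
             "output": "operation"}]           # one entry of table 130

def process_event(event, outputs):
    """event = (input_device_id, kind, button, location); returns True if handled."""
    device_id, kind, button, location = event
    task = table_120[device_id]                # step 141: who generated the event?
    for obj in find_objects_at(location):      # steps 142-145: match the event spec
        if obj["event"] == (kind, button, task):
            dest = table_121[obj["output"]]    # step 146: resolve the output device
            obj["routine"](outputs.setdefault(dest, []))  # step 147: run the routine
            return True
    return False                               # no match: fall through (step 145)

outputs = {}
process_event(("mouse-12", "button push down", 1, (100, 200)), outputs)
```

Because the matching key includes the person in charge, the same click on the same object can dispatch to different routines, and route its output to different devices, depending on who clicked.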
  • Referring to FIGS. 19, 20, a method for realizing the pointing by the [0097] mouse 12 on the display at hand 10 and the large display 1 is explained hereinafter.
  • The explanation takes the display at hand 10 as an example, but the cases of the other displays at hand 30, 50 are entirely the same. In FIG. 19, H is the number of pixels in the horizontal direction of the large display 1, V is the number of pixels in the vertical direction, h is the number of pixels in the horizontal direction of the display at hand 10, v is the number of pixels in the vertical direction, and q is one pixel or several pixels. [0098]
  • The coordinate values renewed in the workstation 11 corresponding to input by the mouse 12 are expressed as (curx, cury). When the mouse 12 is moved, the amount of movement (dx, dy) is reported to the workstation 11, and (curx, cury) is renewed by the following equation: [0099]
  • (curx, cury)=(curx, cury)+(dx, dy)  (1)
  • where, [0100]
  • 0 ≤ curx < h, 0 ≤ cury ≤ v  (2)
  • That is, if the result of the renewal exceeds the region defined by equation (2), (curx, cury) is set to a value at the boundary. For example, if −2 is obtained for cury by executing equation (1), cury is set to 0. The origin of the coordinates on the large display 1 and the display at hand 10 is assumed to be located at the top-left. [0101]
  • Referring to FIG. 20, a process flow of the method for realizing pointing by the mouse 12 on the display at hand 10 and the large display 1 is explained hereinafter. At the start of the processing, the initial setting is q<cury<v, and the pointer is displayed at the position (curx, cury) on the display at hand 10 (step 162). As long as cury>q (step 160), when any event, such as pressing a button of the mouse 12, occurs, the event processing is executed for the displayed object on the display at hand 10 (step 163). When the mouse 12 is moved forward and cury becomes less than q, that is, cury<q, the pointer transfers to the large display 1. That is, the value of cury is set to hV/H (step 164), and the pointer is displayed (step 167) at the position (curx, cury)=(Hcurx/h, Hcury/h−1) (step 166). As long as cury<hV/H (step 165), when any event, such as pressing a button of the mouse 12, occurs, the event processing is executed for the displayed object on the large display 1 (step 168). [0102]
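The coordinate handling of FIGS. 19 and 20 can be sketched as follows. The pixel counts are assumed values, and the functions only illustrate equations (1) and (2) and the transfer rule of steps 164 and 166:

```python
# Sketch: the mouse coordinate (curx, cury) lives in the display-at-hand
# coordinate system; crossing the top edge (cury < q) transfers the pointer
# to the bottom of the large display, scaled by the resolution ratio H/h.

H, V = 2048, 1024      # large display 1 pixels (assumed values)
h, v = 1024, 768       # display at hand 10 pixels (assumed values)
q = 1                  # threshold band at the top edge, in pixels

def clamp(value, low, high):
    return max(low, min(high, value))          # equation (2): stay in range

def move(curx, cury, dx, dy):
    """Equation (1) with boundary clamping: renew (curx, cury)."""
    return clamp(curx + dx, 0, h - 1), clamp(cury + dy, 0, v)

def display_position(curx, cury):
    """Where the pointer is drawn: the display at hand, or the large display."""
    if cury < q:                               # crossed the top edge
        cury = h * V // H                      # step 164: transfer to large display
        return ("large display 1", H * curx // h, H * cury // h - 1)
    return ("display at hand 10", curx, cury)
```

With these values, crossing the top edge at curx = 512 draws the pointer at (1024, 1023), i.e. the horizontally scaled position on the bottom row of the large display.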
  • In the above embodiment, the charged task of the operator is registered first. And, by controlling the corresponding relationship between the registered charged task and the input/output devices, information corresponding to the charged task is displayed and the operation environment is set. However, any attribute of the operator other than the charged task can be used. For instance, name, age, position, class, rank, sex, mother language, or skill level can be used as the registration for this control. Further, not only a single attribute, but several attributes connected by logical expressions can be used for the registration. In accordance with the above variation, services can be provided with contents matched to various attributes of the operator. [0103]
  • Further, in the above embodiment, a method of registering the attribute of the operator by selecting from the menu is used. However, the attribute of the operator may be recognized by the plant monitoring and controlling system 91 itself. For instance, the operator sitting in front of the display 10 may be recognized from the operator's face, or from the operator's voiceprint input from a microphone. [0104]
  • Furthermore, in the above embodiment, the attribute of the operator is registered at the beginning of the operation. However, the operator may be asked for the attribute (a menu for selecting the attribute is displayed), or a process for recognizing the attribute may start, at the moment when the system needs to know the attribute of the operator. [0105]
  • In the above embodiment, a mouse is used for pointing on the large display 1, but a laser beam pointer can also be used. The pointing position on the large display 1 is determined by taking video with a video camera in front of or behind the large display screen and processing the video image to determine the position of the laser beam. When a plurality of laser pointers are used, the device ID is recognized by using laser pointers each having a laser beam of a different color, and determining the color of the laser beam. Similarly, infrared pointers can be used. In this case, the devices can be recognized by using infrared beams of mutually different frequencies. [0106]
  • In the above embodiment, the pointer 15 moves between the display at hand 10 and the large display 1 as if the upper side of the display at hand 10 and the whole or a part of the lower side of the large display 1 were connected. However, as shown in FIG. 21, the pointer 15 may be arranged so as to move from a lateral side (left side or right side) of the display at hand 10. Further, the manner of moving the pointer 15 between the display at hand 10 and the large display 1 may be set depending upon the relative positions of the large display 1 and the display at hand 10. Thus, the operator can operate as if the large display 1 were located on an extension of the display at hand 10, and a natural interface for the operator can be realized. [0107]
  • In the above embodiment, a conventional display apparatus is used as the display at hand 10. However, a see-through display apparatus can be used as the display at hand 10. The see-through display is a translucent display, and information displayed on it is visible superimposed on the background of the display. As one example of such a see-through display apparatus, there is a see-through head-mounted display described in Proceedings Of The ACM Symposium on User Interface Software And Technology, November, (1991) ACM Press pp. 9-17. [0108]
  • Referring to FIG. 23, the second embodiment of the present invention, using the see-through display, is explained hereinafter. In FIG. 23, it is assumed that an operator A uses a see-through display 1100 and another operator B uses a see-through display 1200. On the large display 1000, only information which is shared by the operators A and B is displayed. In contrast, information necessary only for the operator A is displayed on the see-through display 1100, and other information necessary only for the operator B is displayed on the see-through display 1200. For instance, a pointer 1110 which is moved by operating a mouse at the hand of the operator A is displayed on the see-through display 1100, and a pointer 1210 which is moved by operating a mouse at the hand of the operator B is displayed on the see-through display 1200. Further, a menu 1120 which is displayed when the operator A presses a button of the mouse is displayed on the see-through display 1100. The pointer, the menu, detailed information, and other items displayed on the see-through display are visible superimposed on the displayed objects on the large display. That is, a displayed object on the see-through display appears to the operator as if it were displayed on the large display. The operator can point at any displayed object on the large display with the pointer displayed on the see-through display. [0109]
  • Naturally, the relationship between the display coordinates of the see-through displays 1100, 1200 and those of the large display 1000 is maintained constant. That is, in a case such as mounting a see-through display on the head of the operator, a 3D tracking system is used for tracking the position and orientation of the see-through display, and the display coordinates of the see-through display are corrected according to its position relative to the large display 1000. [0110]
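The correction above can be sketched in its simplest form: as the tracked position of the head-mounted see-through display shifts, the overlay is translated the opposite way so a point anchored on the large display stays visually fixed. This flat 2D offset model is an assumption of the sketch; a real system would use the full tracked pose (position and orientation):

```python
# Minimal sketch of keeping see-through overlay coordinates registered
# with the large display under a pure-translation model (assumed).

def overlay_position(anchor_on_large, head_offset):
    """Map a large-display anchor point to see-through display coordinates."""
    ax, ay = anchor_on_large
    ox, oy = head_offset            # tracked displacement of the display
    return (ax - ox, ay - oy)       # compensate so the anchor appears fixed
```

When the head does not move, the overlay coordinates equal the large-display coordinates; any tracked displacement is subtracted out.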
  • Furthermore, a see-through display and a conventional display can be used together as displays at hand. That is, information which should be displayed superimposed on information displayed on the large display 1000, such as a pointer for designating a position on the large display 1000 and a menu for operating a displayed object on the large display 1000, is displayed on the see-through display, while other information which need not be seen superimposed on the displayed objects on the large display 1000 may be displayed on the conventional display. [0111]
  • Advantages of using the see-through display for a display at hand are as follows: [0112]
  • (1) Interference between operators can be completely eliminated. Although displaying a pointer or a menu directly on the large display distracts other operators, displaying the pointer or the menu on an operator's own display at hand does not interfere with other operators' work, because the display on his own see-through display is not visible to other operators. For instance, when many pointers are displayed on the large display simultaneously, it becomes difficult to identify one's own pointer among the many pointers, which is a problem to be solved. However, if the pointer for each operator is displayed only on his own see-through display, this problem is eliminated because each operator sees only his own pointer at any time. [0113]
  • (2) Information displayed on the display at hand and information displayed on the large display can be integrated visually. When a conventional display is used as the display at hand, it is necessary to move the line of sight in order to refer to both the information displayed on the large display and the information displayed on the display at hand, and it is difficult to see both pieces of information simultaneously. In contrast, information displayed on the see-through display is visible superimposed on the information displayed on the large display, and both pieces of information can be referred to simultaneously. Furthermore, related information can be displayed next to each other. [0114]
  • In accordance with the present invention, the following advantages are achieved: [0115]
  • (1) A process matched to the operator who performed an operation can be executed. An attribute of the operator is registered in correspondence with the input means used by the operator, and when the operator operates the input means, a process matched to that operator can be executed by examining the registered attribute of the operator and selecting a process matched to the attribute for execution. [0116]
  • (2) An output destination can be selected corresponding to the operator who performed the operation, so as not to distract the other operators' tasks. An attribute of the operator is registered in correspondence with the input means used by the operator, and when the operator operates the input means, a result can be output without distracting other operators by examining the registered attribute of the operator and selecting an output destination for the processing result matched to the attribute. [0117]
  • (3) An operating environment can be set matching the operator. An attribute of the operator is registered in correspondence with the input means used by the operator, and when the operator operates the input means, an operating environment matching that operator can be provided by examining the registered attribute of the operator and setting the operating environment matched to the attribute. [0118]

Claims (9)

What is claimed is:
1. An interactive computer system with plural displays comprising:
a first computer having input means and output means including a display;
a second computer having output means including display connected to said first computer;
a voice recording means for recording voice and creating an icon representing said recorded voice, said icon displayed on the display of said first computer; and
a dragging means for dragging said icon with said input means of the first computer from the display of said first computer to the display of said second computer, said recorded voice being reproduced when said icon is pointed by said input means of the first computer.
2. An interactive computer system according to claim 1 wherein said input means comprises a mouse.
3. An interactive computer system according to claim 2 wherein said dragging means includes a pointer responsive to movement of said mouse.
4. An interactive computer system with plural displays comprising:
a first computer, having an input device and an output including a display;
a second computer having an output including display coupled to said first computer;
a voice recorder to create an icon representing a recorded voice, said icon displayed on the display of said first computer; and
said input device of the first computer adapted to drag said icon from the display of said first computer to the display of said second computer, and to reproduce said recorded voice in response to a pointer displayed on said displays and controlled by said input device.
5. An interactive computer system according to claim 1 wherein said input comprises a mouse.
6. An interactive computer system according to claim 1 wherein said pointer is responsive to movement of said mouse and carries said icon with it to drag said icon from said first display to said second display.
7. A method comprising:
recording a voice;
displaying an icon on the display of a first computer representing said recorded voice;
displaying a representation of a physical system on the display of a second computer;
dragging said icon from the display of said first computer to the display of said second computer;
reproducing said recorded voice by pointing to said icon.
8. The method according to claim 7 wherein said dragging and pointing is done by providing an input to the first computer and generating a pointer that can be displayed upon and moved between the displays of the first and second computers.
9. The method according to claim 8 wherein said dragging and pointing is done by providing a mouse input to the first computer.
US10/192,726 1993-04-28 2002-07-11 Interactive control system having plural displays, and a method thereof Abandoned US20020171628A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/192,726 US20020171628A1 (en) 1993-04-28 2002-07-11 Interactive control system having plural displays, and a method thereof
US10/692,808 US7057602B2 (en) 1993-04-28 2003-10-27 Interactive control system having plural displays, and a method thereof

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP5102159A JPH06314181A (en) 1993-04-28 1993-04-28 Interactive control system by plural display and control method therefor
JP5-102159 1993-04-28
US23036994A 1994-04-20 1994-04-20
US08/969,313 US5969697A (en) 1993-04-28 1997-11-13 Interactive control system having plural displays, and a method thereof
US09/374,263 US6100857A (en) 1993-04-28 1999-08-16 Interactive control system having plural displays, and a method thereof
US09/619,647 US6441802B1 (en) 1993-04-28 2000-07-19 Interactive control system having plural displays, and a method thereof
US10/192,726 US20020171628A1 (en) 1993-04-28 2002-07-11 Interactive control system having plural displays, and a method thereof

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/619,647 Continuation US6441802B1 (en) 1993-04-28 2000-07-19 Interactive control system having plural displays, and a method thereof

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/692,808 Continuation US7057602B2 (en) 1993-04-28 2003-10-27 Interactive control system having plural displays, and a method thereof

Publications (1)

Publication Number Publication Date
US20020171628A1 true US20020171628A1 (en) 2002-11-21

Family

ID=14319953

Family Applications (5)

Application Number Title Priority Date Filing Date
US08/969,313 Expired - Fee Related US5969697A (en) 1993-04-28 1997-11-13 Interactive control system having plural displays, and a method thereof
US09/374,263 Expired - Fee Related US6100857A (en) 1993-04-28 1999-08-16 Interactive control system having plural displays, and a method thereof
US09/619,647 Expired - Fee Related US6441802B1 (en) 1993-04-28 2000-07-19 Interactive control system having plural displays, and a method thereof
US10/192,726 Abandoned US20020171628A1 (en) 1993-04-28 2002-07-11 Interactive control system having plural displays, and a method thereof
US10/692,808 Expired - Fee Related US7057602B2 (en) 1993-04-28 2003-10-27 Interactive control system having plural displays, and a method thereof

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US08/969,313 Expired - Fee Related US5969697A (en) 1993-04-28 1997-11-13 Interactive control system having plural displays, and a method thereof
US09/374,263 Expired - Fee Related US6100857A (en) 1993-04-28 1999-08-16 Interactive control system having plural displays, and a method thereof
US09/619,647 Expired - Fee Related US6441802B1 (en) 1993-04-28 2000-07-19 Interactive control system having plural displays, and a method thereof

Family Applications After (1)

Application Number Title Priority Date Filing Date
US10/692,808 Expired - Fee Related US7057602B2 (en) 1993-04-28 2003-10-27 Interactive control system having plural displays, and a method thereof

Country Status (2)

Country Link
US (5) US5969697A (en)
JP (1) JPH06314181A (en)

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1992017875A1 (en) * 1991-04-08 1992-10-15 Hitachi, Ltd. Method and apparatus for image or data processing, and monitoring method and apparatus using the same
US6195176B1 (en) * 1992-08-31 2001-02-27 Canon Kabushiki Kaisha TV conference system and terminal equipment for use in the same
JPH08147243A (en) * 1994-11-21 1996-06-07 Nec Corp Multimedia data communication system
US6118888A (en) * 1997-02-28 2000-09-12 Kabushiki Kaisha Toshiba Multi-modal interface apparatus and method
JPH11249729A (en) * 1998-03-03 1999-09-17 Mitsubishi Electric Corp Operation monitoring operating device
JP3627791B2 (en) 1998-08-10 2005-03-09 富士通株式会社 Other terminal operation device
JP2000284941A (en) * 1999-03-31 2000-10-13 Mitsubishi Electric Corp Cursor display device for multi-display system
US6597376B1 (en) * 1999-07-30 2003-07-22 Grass Valley (U.S.) Inc. Mix/effect status display
CA2420081C (en) * 1999-08-19 2011-10-11 Deep Video Imaging Limited Control of depth movement for visual display with layered screens
JP3478192B2 (en) * 1999-08-20 2003-12-15 日本電気株式会社 Screen superimposed display type information input / output device
US6701665B1 (en) * 2000-10-23 2004-03-09 Phytech Ltd. Remote phytomonitoring
US6704033B2 (en) 2000-12-05 2004-03-09 Lexmark International, Inc. Goal-oriented design for the printer property's graphical user interface
ITBO20010030A1 (en) * 2001-01-23 2002-07-23 Gd Spa METHOD AND UNIT FOR PERFORMING A CONFIGURATION CHANGE IN AN AUTOMATIC OPERATING MACHINE
US6642947B2 (en) * 2001-03-15 2003-11-04 Apple Computer, Inc. Method and apparatus for dynamic cursor configuration
US7068294B2 (en) * 2001-03-30 2006-06-27 Koninklijke Philips Electronics N.V. One-to-one direct communication
US7102643B2 (en) 2001-11-09 2006-09-05 Vibe Solutions Group, Inc. Method and apparatus for controlling the visual presentation of data
US7042469B2 (en) * 2002-08-13 2006-05-09 National Instruments Corporation Multiple views for a measurement system diagram
US7142192B2 (en) * 2002-12-12 2006-11-28 Nvidia Corporation Cursor locator for multi-monitor systems
KR100968457B1 (en) * 2003-05-14 2010-07-07 삼성전자주식회사 Computer System And Display Method Thereof
NZ525956A (en) 2003-05-16 2005-10-28 Deep Video Imaging Ltd Display control system for use with multi-layer displays
US7458029B2 (en) * 2004-01-15 2008-11-25 Microsoft Corporation System and process for controlling a shared display given inputs from multiple users using multiple input modalities
FR2868559B1 (en) * 2004-04-01 2006-07-14 Airbus France Sas PRODUCTION MANAGEMENT SYSTEM AND CORRESPONDING ALERT METHOD
JP2006245689A (en) * 2005-02-28 2006-09-14 Nippon Telegr & Teleph Corp <Ntt> Information presentation device, method and program
EP1929396A2 (en) * 2005-09-19 2008-06-11 Helmut Schröder Computer mouse, computer mouse system and method for controlling said mouse
US7869900B2 (en) * 2005-12-12 2011-01-11 Balcones Fuel Technology, Inc. Integrated cuber management system
JP4040060B2 (en) * 2005-12-28 2008-01-30 株式会社コナミデジタルエンタテインメント GAME DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM
US20080208752A1 (en) * 2007-02-23 2008-08-28 Microsoft Corporation Content communication and purchase using a computer-based media component
US8144123B2 (en) * 2007-08-14 2012-03-27 Fuji Xerox Co., Ltd. Dynamically controlling a cursor on a screen when using a video camera as a pointing device
US20100013764A1 (en) * 2008-07-18 2010-01-21 Wei Gu Devices for Controlling Computers and Devices
US20110227810A1 (en) * 2010-03-19 2011-09-22 Mckinney Susan Portable communication device with secondary peripheral display
JP5765999B2 (en) * 2011-04-21 2015-08-19 三菱電機株式会社 Supervisory control system
JP2012234341A (en) * 2011-04-28 2012-11-29 Toshiba Corp Video display device and menu screen display method
JP5076013B1 (en) * 2011-06-07 2012-11-21 株式会社東芝 Information processing apparatus, information processing method, and program
US8711091B2 (en) * 2011-10-14 2014-04-29 Lenovo (Singapore) Pte. Ltd. Automatic logical position adjustment of multiple screens
US20130146084A1 (en) * 2011-12-07 2013-06-13 Caterpillar Inc. System and method for removing objects from surfaces
US10795535B2 (en) * 2012-08-28 2020-10-06 Eizo Corporation Management of multiple display areas
WO2015145559A1 (en) * 2014-03-25 2015-10-01 三菱電機株式会社 Plant monitor control system
US20150355611A1 (en) * 2014-06-06 2015-12-10 Honeywell International Inc. Apparatus and method for combining visualization and interaction in industrial operator consoles
JP6292672B2 (en) * 2014-06-26 2018-03-14 Kddi株式会社 Operation support apparatus, operation support method, and operation support program
US10343067B2 (en) * 2014-12-23 2019-07-09 King.Com Ltd. Computer system and method for selecting and displaying in-gaming options based on user selection weight criteria
CN108255364A (en) * 2016-12-28 2018-07-06 深圳市巨烽显示科技有限公司 The method and apparatus that cursor is switched fast are realized in multihead display environment
FI130442B (en) 2019-10-10 2023-09-01 Valmet Automation Oy Follow-up mode
EP3805882B1 (en) * 2019-10-10 2022-06-08 Siemens Aktiengesellschaft Control system for a technical installation with a trend curve diagram
EP3968112A1 (en) * 2020-09-11 2022-03-16 ABB Schweiz AG Visual operator interface for a technical system
US12105581B2 (en) 2020-10-07 2024-10-01 Mitsubishi Electric Corporation Failure symptom detection system, failure symptom detection method, and recording medium

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS54121379A (en) 1978-03-13 1979-09-20 Hitachi Ltd Method of monitoring hierarchy process state
GB8425827D0 (en) * 1984-10-12 1984-11-21 Gec Avionics Position indicating apparatus
DE3632601A1 (en) * 1985-09-27 1987-04-23 Olympus Optical Co DEVICE FOR DISPLAYING A POSITION BRAND ON SEVERAL SCREENS
JPH083779B2 (en) 1985-09-27 1996-01-17 オリンパス光学工業株式会社 Coordinate pointing device
JP2574755B2 (en) 1986-04-23 1997-01-22 株式会社日立製作所 Personal authentication system
DE3709400A1 (en) 1987-03-21 1988-09-29 Subklew Gmbh CONTROL CENTER
JPS63308681A (en) 1987-06-10 1988-12-16 Toshiba Corp Personal identifying device
JPH01103759A (en) 1987-10-16 1989-04-20 Nec Corp Password detecting device
US4942514A (en) * 1987-11-17 1990-07-17 Hitachi, Ltd. Process monitoring and control system and method of process monitoring and control
JPH0239307A (en) 1988-07-29 1990-02-08 Toshiba Corp Plant monitor device
JP2539502B2 (en) 1988-12-09 1996-10-02 株式会社日立製作所 Driving support device
JPH02273857A (en) 1989-04-14 1990-11-08 Fujitsu Ltd Computer system utilizing multimedia
JPH0345991A (en) 1989-07-14 1991-02-27 Hitachi Ltd Picture display method and computer system using there for
JPH03156557A (en) 1989-08-08 1991-07-04 Mitsubishi Electric Corp Computer system
JPH0371187A (en) 1989-08-11 1991-03-26 Toshiba Corp Plant state supervisory and supporting device
JP2834205B2 (en) 1989-08-18 1998-12-09 株式会社日立製作所 Screen display method and device
DE3930581A1 (en) 1989-09-13 1991-03-21 Asea Brown Boveri Work station for process control personnel - has display fields with windows accessed by mouse selection
WO1991006960A1 (en) 1989-11-02 1991-05-16 Combustion Engineering, Inc. Advanced nuclear plant control complex
JPH04491A (en) * 1990-04-18 1992-01-06 Toshiba Corp Information display device
JP2530050B2 (en) * 1990-07-20 1996-09-04 富士通株式会社 Cursor movement control device
JP2754919B2 (en) 1990-12-18 1998-05-20 富士ゼロックス株式会社 Multi-color display EL display
JP3466630B2 (en) 1991-04-12 2003-11-17 株式会社東芝 Information communication device
JPH04335698A (en) 1991-05-13 1992-11-24 Yokogawa Electric Corp Process monitoring device
JP3006730B2 (en) 1991-08-13 2000-02-07 富士ゼロックス株式会社 Information processing apparatus for joint work and information processing method for joint work
JP2892877B2 (en) * 1991-10-30 1999-05-17 富士写真フイルム株式会社 Digital line terminal device and operation method thereof
US5491743A (en) * 1994-05-24 1996-02-13 International Business Machines Corporation Virtual conference system and terminal apparatus therefor

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050093817A1 (en) * 2003-11-03 2005-05-05 Pagan William G. Apparatus method and system for improved feedback of pointing device event processing
US7542026B2 (en) * 2003-11-03 2009-06-02 International Business Machines Corporation Apparatus method and system for improved feedback of pointing device event processing
US20080100531A1 (en) * 2005-03-31 2008-05-01 Sega Corporation Display control program executed in game machine
US7948449B2 (en) 2005-03-31 2011-05-24 Sega Corporation Display control program executed in game machine
US20100149101A1 (en) * 2008-12-13 2010-06-17 Yan-Liang Guo Computer keyboard
US8411039B2 (en) * 2008-12-13 2013-04-02 Silitek Electronic (Guangzhou) Co., Ltd. Computer keyboard
US20140173458A1 (en) * 2012-12-18 2014-06-19 Sony Corporation System and method for sharing event information using icons
US9374429B2 (en) 2012-12-18 2016-06-21 Sony Corporation System and method for sharing event information using icons

Also Published As

Publication number Publication date
US6100857A (en) 2000-08-08
JPH06314181A (en) 1994-11-08
US7057602B2 (en) 2006-06-06
US5969697A (en) 1999-10-19
US6441802B1 (en) 2002-08-27
US20040085257A1 (en) 2004-05-06

Similar Documents

Publication Publication Date Title
US7057602B2 (en) Interactive control system having plural displays, and a method thereof
US5010500A (en) Gesture-modified diagram for retrieval of image resembling diagram, with parts selectable for further interactive retrieval
KR100404918B1 (en) Computerized Interactor Systems and Method for Providing Same
EP0764898B1 (en) Page turning apparatus for use with computer system
CN101695125A (en) Method and system for realizing video intelligent track navigation
JP3608940B2 (en) Video search and display method and video search and display apparatus
CN107257506A (en) Many picture special efficacy loading methods and device
DE4414360A1 (en) Interactive control system with a plurality of displays and method for operating such a system
JPH11134086A (en) Monitoring device and checking method using the same
EP1754216B1 (en) Surveillance system workstation
EP3321843A1 (en) A centralized traffic control system, and a method in relation with the system
Haunold et al. A keystroke level analysis of manual map digitizing
JP2004062904A (en) Interactive control system and control method by a plurality of displays
JPH01258098A (en) Crt monitor system for plant monitor device
JP3263008B2 (en) Disaster prevention display device
JPH0693172B2 (en) Plant monitoring control device
JP2004139234A (en) Operation device
JP2006163468A (en) Monitoring device, and method for displaying monitoring screen by operation history reduced screen
JP2785920B2 (en) Man-machine screen display device and method, and industrial equipment control device and method using the same
Tanifuji et al. Hyperplant: Interaction with Plant through Live Video
JPH06124291A (en) Handwritten character inputting method for plant operation monitor system and plant operation monitor system
JPH11258383A (en) Information providing method for plant operation
JPH02157922A (en) Man machine interface device
JPH083770B2 (en) Process monitoring operation device
JPH1145386A (en) Disaster prevention display device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION