WO2015053266A1 - Plant operation training apparatus, control method, program, and plant operation training system - Google Patents

Info

Publication number
WO2015053266A1
Authority
WO
WIPO (PCT)
Prior art keywords
plant
interface
model data
virtual space
avatar
Prior art date
Application number
PCT/JP2014/076814
Other languages
French (fr)
Japanese (ja)
Inventor
一浩 武多
賢士 坂本
琢磨 前田
Original Assignee
三菱重工業株式会社 (Mitsubishi Heavy Industries, Ltd.)
Priority date
Filing date
Publication date
Application filed by 三菱重工業株式会社 (Mitsubishi Heavy Industries, Ltd.)
Publication of WO2015053266A1

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B25/00 Models for purposes not provided for in G09B23/00, e.g. full-sized devices for demonstration purposes
    • G09B25/02 Models for purposes not provided for in G09B23/00, e.g. full-sized devices for demonstration purposes of industrial processes; of machinery

Definitions

  • the present invention relates to a plant operation training apparatus, a control method, a program, and a plant operation training system.
  • Patent Literature 1 describes a simulation device that can learn a route to a destination by displaying an avatar in a 3D virtual space in a plant and moving the avatar by an operator's operation. According to this technology, it is possible to learn the route to the destination while displaying the scenery seen when a person actually walks while moving to the destination and feeling the sense of real distance and realism.
  • However, the apparatus of Patent Document 1 is for learning a route by moving an avatar; although it allows the user to learn, for example, where equipment is located by walking around the plant, it does not allow the user to learn the operating procedures of the equipment in the plant.
  • In addition, many conventional plant operation training simulation apparatuses require the user to select a device provided in the plant by mouse operation and to perform operation training on that device, which makes it difficult to grasp the positional relationships that an actual operator experiences, so the training lacks a sense of presence and realism even when a 3D image of the plant equipment is displayed.
  • the present invention provides a plant operation training apparatus, a plant operation training method, and a program capable of solving the above-described problems.
  • According to a first aspect of the present invention, a plant operation training apparatus includes: an operation means that acquires human model data indicating a person in a virtual space and operates, based on the human model data, the motion of an avatar displayed in the virtual space; a positional relationship calculation unit that acquires plant model data indicating a plant in the virtual space and calculates, based on the plant model data, the positional relationship between the avatar and devices included in the plant displayed in the virtual space; and a detailed screen display control unit that acquires interface detail screen model data indicating an interface in the virtual space and displays, based on the interface detail screen model data, the interface detail screens of the devices that satisfy a predetermined condition with respect to the positional relationship.
  • According to a second aspect of the present invention, a control method of a plant operation training apparatus includes: acquiring human model data indicating a person in a virtual space and operating, based on the human model data, the motion of an avatar displayed in the virtual space; acquiring plant model data indicating a plant in the virtual space and calculating, based on the plant model data, the positional relationship between the avatar and devices included in the plant displayed in the virtual space; and acquiring interface detail screen model data indicating an interface in the virtual space and displaying, based on the interface detail screen model data, the interface detail screens of the devices that satisfy a predetermined condition with respect to the positional relationship.
  • According to another aspect of the present invention, a program causes a computer of a plant operation training apparatus to function as means for: acquiring human model data indicating a person in a virtual space and operating, based on the human model data, the motion of an avatar displayed in the virtual space; acquiring plant model data indicating a plant in the virtual space and calculating, based on the plant model data, the positional relationship between the avatar and devices included in the plant displayed in the virtual space; and acquiring interface detail screen model data indicating an interface in the virtual space and displaying, based on the interface detail screen model data, the interface detail screens of the devices that satisfy a predetermined condition with respect to the positional relationship.
  • According to another aspect of the present invention, a plant operation training system includes: a plant operation training apparatus having an operation means that acquires human model data indicating a person in a virtual space and operates, based on the human model data, the motion of an avatar displayed in the virtual space, a positional relationship calculation unit that acquires plant model data indicating a plant in the virtual space and calculates, based on the plant model data, the positional relationship between the avatar and devices included in the plant displayed in the virtual space, a detailed screen display control unit that acquires interface detail screen model data indicating an interface in the virtual space and displays, based on the interface detail screen model data, the interface detail screens of the devices that satisfy a predetermined condition with respect to the positional relationship, and a parameter acquisition unit that, when the detailed screen display control unit starts the display, requests a parameter value to be displayed on the interface; and a plant simulation device that calculates a parameter value indicating the state of the devices included in the plant and transmits the calculated parameter value in response to the request.
  • According to the present invention, operation training closer to actual operation can be performed while grasping, in the 3D virtual space, the positional relationship with the equipment provided in the plant, so that a higher training effect can be obtained.
  • FIG. 1 is a functional block diagram of a plant operation training apparatus according to first to fifth embodiments of the present invention.
  • FIG. 2 is a diagram showing regions on the operation screen of the plant operation training apparatus according to the first to fifth embodiments of the present invention. FIG. 3 is a diagram showing the operation screen of the plant operation training apparatus according to the first embodiment of the present invention. FIG. 4 is a diagram showing the processing flow of the plant operation training apparatus according to the first embodiment of the present invention. FIG. 5A is a first diagram used to explain the processing flow of the plant operation training apparatus according to the first embodiment of the present invention. FIG. 5B is a second diagram used to explain the processing flow of the plant operation training apparatus according to the first embodiment of the present invention.
  • FIG. 1 is a schematic functional block diagram of a plant operation training apparatus according to the first to fifth embodiments.
  • the configurations described in this block diagram are the same in the second to fifth embodiments described later.
  • symbol 1 represents the plant operation training apparatus.
  • The plant operation training apparatus 1 is a device that reproduces the plant and an avatar, which serves as the user's alter ego, in a 3D virtual space using their respective 3D model data, and provides the user with functions for simulated operation training of the equipment provided in the plant.
  • the plant operation training device 1 is, for example, a PC or a server device that can execute a program that realizes this function.
  • Reference numeral 2 denotes a plant simulation apparatus.
  • The plant simulation device 2 is a device that simulates the behavior of the plant by calculating state quantities of the plant, such as the output of a gas turbine, the pressure in a compressor, and temperatures.
  • the plant simulation apparatus is connected to the plant operation training apparatus 1 via a network.
  • The plant simulation device 2 receives, from the plant operation training apparatus 1, a signal indicating the operation content performed by the avatar on the devices, and simulates the behavior of the plant based on that content.
  • the signal indicating the operation content includes, for example, information indicating opening / closing of a valve and turning on / off of a certain control device.
  • the plant simulation device 2 transmits the calculated state quantity and the like to the plant operation training device 1.
  • On the plant operation training apparatus 1, the user performs simulated operations such as having the avatar check an indicator provided in the plant, and learns what results are obtained from the operations the user has performed through the avatar. In this way, the user can learn how to operate the equipment provided in the plant.
  • the plant simulation apparatus 2 is also connected to a training simulation apparatus (not shown) that simulates the central control room.
  • the plant simulation apparatus 2 receives a user operation using the simulation apparatus in the central control room and similarly simulates the behavior of the plant.
  • the plant operation training apparatus 1 is a part of the plant operation training system configured as described above.
  • the plant operation training apparatus 1 includes a display unit 10, an operation unit 20, a virtual space generation unit 30, a storage unit 40, and a communication unit 50.
  • The display unit 10 is, for example, a PC monitor, a screen projected by a projector, or the like.
  • the user looks at the 3D image of the equipment in the plant output to the display unit 10 and performs an operation necessary for training.
  • the operation means 20 is a mouse, a game controller, a joystick or the like.
  • the user controls the operation of the avatar using these operation means.
  • an operation using not only a mouse but also a game controller or a joystick is possible. As a result, the user can smoothly move through the plant represented in the 3D virtual space and can receive training with a more realistic feeling.
  • In the 3D image provided by the plant operation training apparatus 1, the interface screen of the operation target is displayed without the user having to select an object with the operation means 20, so that the apparatus is easy to operate even when a large screen is used, and operation with a game controller is also facilitated.
  • the virtual space generation unit 30 reads the 3D plant model data and the 3D human model data from the storage unit 40, and reproduces the state of the plant and the avatar in the 3D virtual space. In addition, an operation signal is acquired from the operation unit 20, the avatar is operated according to the user's operation, a 3D image of the place where the avatar has moved is generated, and is output to the display unit 10.
  • The virtual space generation unit 30 includes a positional relationship calculation unit 31, a detailed screen display control unit 32, a corresponding part display control unit 33, a detailed screen display selection unit 34, a detailed screen selection setting unit 35, and a parameter acquisition unit 36.
  • the positional relationship calculation unit 31 calculates the positional relationship between the avatar and the devices provided in the plant.
  • the equipment provided in the plant refers to large facilities such as turbines and boilers, and operating devices and indicators provided in these facilities.
  • the positional relationship refers to the distance between the avatar and the devices, whether the avatar exists in a closed space (such as a room) provided with the devices, or whether the avatar is facing the devices.
  • the detailed screen display control unit 32 displays an interface detailed screen of an operation device or a display device in a positional relationship that satisfies the predetermined condition at a predetermined position.
  • the interface detail screen is an image enlarged so that the user can read the interface of the operation device or the display device.
  • the values of various parameters displayed on the interface are values acquired from the plant simulation apparatus 2.
  • the model data of the interface detail screen is stored in the storage unit 40 in association with the position information of the position where the device having the interface is provided.
  • The corresponding part display control unit 33 displays the part where the display device or operation device whose interface is shown on the interface detail screen displayed by the detailed screen display control unit 32 is provided, in a manner different from the other devices. Displaying in a manner different from the other devices means, for example, blinking only that part or displaying it in a conspicuous color.
  • the corresponding part display control unit 33 may display the large facility equipment itself provided with a display device and an operation device in a different manner.
  • The detailed screen display selection unit 34 provides the user with a means for selecting the interface detail screens to be displayed by the detailed screen display control unit 32. Specifically, the detailed screen display selection unit 34 displays the interface detail screens of a plurality of devices overlapping one another at a predetermined position on the display unit 10, and prompts the user to select, one by one starting from the detail screen displayed in the foreground, whether or not to display each screen. The detailed screen display selection unit 34 thus provides a function for preventing unnecessary interface detail screens from being displayed when there are many instruments at a certain location in the plant.
  • the detailed screen selection setting unit 35 provides a function of specifying an interface detail screen of selectable devices displayed by the detailed screen display selection unit 34 and setting the result.
  • the detail screen selection setting unit 35 provides a function for setting not to display the interface detail screen of the device that is not used by the user.
  • The equipment not used by the user is, for example, equipment that a training instructor determines to be unnecessary according to the level of the user receiving plant operation training, or equipment that should be used only in an emergency.
  • the parameter acquisition unit 36 requests the parameter value to be displayed on the interface from the plant simulation apparatus 2 when the detailed screen display control unit 32 starts displaying the interface detailed screen.
  • the parameter acquisition unit 36 requests a parameter value via the communication unit 50 and acquires the parameter value from the plant simulation apparatus 2.
  • the parameter value is a state quantity such as the pressure and temperature of the plant calculated by the plant simulation apparatus 2.
  • The plant simulation apparatus 2 calculates the pressure, temperature, and the like at the positions where the gauges and thermometers are provided in the plant, and provides these state quantities in response to requests from the parameter acquisition unit 36.
  • the detail screen display control unit 32 displays the interface detail screen of each device in which the parameter value acquired by the parameter acquisition unit 36 is set.
  • When the user operates an operation device through the avatar, the parameter acquisition unit 36 transmits a signal indicating the operation content to the plant simulation apparatus 2 via the communication unit 50.
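  • As a concrete illustration of the two roles of the parameter acquisition unit 36 described above (requesting a parameter value when an interface detail screen is opened, and forwarding an operation signal when the user operates a device through the avatar), the following is a minimal Python sketch. The class names, the TAG values, and the in-memory stand-in for the plant simulation apparatus 2 are illustrative assumptions and not part of the patent; in the real system the exchange goes over a network via the communication unit 50.

```python
class FakePlantSimulator:
    """Stand-in for plant simulation device 2: holds one state quantity per TAG."""
    def __init__(self):
        self._state = {"FI-101": 52.3, "V-103": 0.0}  # e.g. flow value, valve opening

    def latest_value(self, tag):
        return self._state[tag]

    def apply_operation(self, tag, content):
        # e.g. content = {"opening": 40.0} for a valve open/close operation
        self._state[tag] = content.get("opening", self._state[tag])


class ParameterAcquisitionUnit:
    def __init__(self, simulator):
        self._simulator = simulator  # reached via the communication unit in reality

    def request_parameter(self, tag):
        """Called when the detailed screen display control unit starts displaying
        the interface detail screen of the device identified by `tag`."""
        return self._simulator.latest_value(tag)

    def send_operation(self, tag, content):
        """Called when the user operates an operation device through the avatar."""
        self._simulator.apply_operation(tag, content)


if __name__ == "__main__":
    unit = ParameterAcquisitionUnit(FakePlantSimulator())
    print(unit.request_parameter("FI-101"))   # value shown on the detail screen
    unit.send_operation("V-103", {"opening": 40.0})
    print(unit.request_parameter("V-103"))    # the simulator reflects the operation
```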
  • the storage unit 40 stores plant 3D model data, human 3D model data, equipment interface detail screen model data, and the like. In addition, the setting contents by the detailed screen selection setting unit 35 are also stored. Note that the model data on the device interface detail screen may be 2D image data.
  • The communication unit 50 communicates with other devices. For example, the communication unit 50 transmits and receives parameter value information of plant state quantities.
  • the virtual space generation unit 30 is a function provided when a CPU (Central Processing Unit) provided in the plant operation training apparatus 1 executes a program.
  • FIG. 2 is a diagram showing each display area on the operation screen of the plant operation training apparatus 1 according to the present embodiment.
  • the regions described here are the same in the second to fifth embodiments.
  • the position and meaning of displaying the interface detail screen will be described with reference to FIG.
  • Reference numeral 104 denotes an area where the detailed screen display control unit 32 displays the interface details screen of the devices. Hereinafter, this area is referred to as an operable area.
  • the device interface detail screen displayed in the operable area 104 is a user's operation target.
  • Reference numeral 105 denotes an area where the detail screen display selection unit 34 displays the interface detail screen in an overlapping manner. Hereinafter, this area is referred to as a selection area.
  • the detailed screen display control unit 32 displays only the interface image of the device selected by the user from the device options displayed in the selection area 105 by the detailed screen display selection unit 34 in the operable area.
  • Reference numeral 106 denotes an area for displaying interface detail screens that are not selected by the user. Hereinafter, this area is referred to as the non-operable area.
  • Reference numeral 107 denotes an interface details screen. In the interface image, an interface of a display device or an operation device is displayed. The user can know the state of the plant by looking at the interface detail screen 107 of the display. The user can operate the device using the operation means 20 on the interface detail screen 107 of the operation device.
  • FIG. 3 is a diagram showing an operation screen of the plant operation training apparatus 1 according to the present embodiment.
  • An operation in which the detailed screen display control unit 32 displays the interface detail screen 107 of the operation device or the display device according to the positional relationship calculated by the positional relationship calculation unit 31 will be described with reference to FIG.
  • Reference numeral 100 is a 3D image simulating a certain place inside the plant.
  • Reference numeral 101 denotes an avatar. It is assumed that the avatar 101 has reached the place 100 in the plant by the operation of the operation means 20 by the user.
  • When the virtual space generation unit 30 acquires an operation signal from the operation unit 20, it calculates the movement distance based on the position of the avatar 101 before the operation.
  • the movement distance may be calculated, for example, so as to advance a predetermined distance every time a predetermined button is pressed.
  • the virtual space generation unit 30 calculates position information of the avatar in the plant every time the avatar 101 moves and outputs it to the memory. In addition, the virtual space generation unit 30 reproduces the scenery of the plant near the position based on the position information of the movement destination of the avatar 101.
  • the place 100 is a 3D image generated by the virtual space generation unit 30 in this way.
  • the plant 3D model data stored in the storage unit 40 includes position information of the equipment in the plant.
  • the positional relationship calculation unit 31 calculates the distance between the avatar 101 and the devices provided in the plant using the respective position information in the plant.
  • In FIG. 3, it is assumed that a high-pressure feed water flow meter 102 and a high-pressure feed valve rear valve 103 are provided in the vicinity of the place 100, and that the distances between the avatar 101 and the high-pressure feed water flow meter 102 and the high-pressure feed valve rear valve 103 are each within a predetermined distance (the first threshold).
  • the detailed screen display control unit 32 acquires the model data of the interface detailed screen of the high-pressure feed valve rear valve and the high-pressure feed flow meter from the storage unit 40 and displays the interface detail screen 107 in the operable area 104.
  • the interface detail screen 107 of the high-pressure feed water flow meter 102 and the interface detail screen 107 of the high-pressure feed valve rear valve 103 are displayed in the operable area 104 so as to overlap the front of the plant 3D image.
  • the user can grasp what value the high-pressure feed water flow meter 102 indicates from the interface detail screen 107.
  • the user can adjust the opening / closing degree of the high-pressure water supply valve rear valve 103 and change the flow rate by operating the interface detail screen 107 using the operation means 20.
  • When a valve opening/closing operation is performed with a mouse, the mouse must first be moved to the vicinity of the 3D image of the valve, the valve must be selected by a click operation, and a valve opening/closing instruction must then be given by a further click operation or the like. When the operation is completed, the window must be closed by clicking the X mark at the upper right corner of the valve image window.
  • In the present embodiment, on the other hand, the user only moves the avatar 101, and the detailed screen display control unit 32 displays the interface detail screen of a device as soon as the avatar approaches the device that is the subject of the operation training; no special operation is required. Therefore, the user can grasp the positional relationship with the devices and can engage in operation training with a more realistic feeling.
  • When a game controller is used, it is difficult for the user to perform an operation of selecting the device to be operated. However, since the interface detail screen 107 of a device is displayed only when the avatar 101 approaches the device as described above, no selection operation is necessary. To move the avatar 101 with the game controller, for example, a joystick is used.
  • With reference to FIG. 3, the operation when the avatar 101 approaches a device has been described.
  • the detailed screen display control unit 32 hides the interface detailed screen 107 when the avatar 101 leaves the device.
  • the positional relationship calculation unit 31 calculates the distance between them.
  • the detailed screen display control unit 32 hides the interface detailed screen 107 displayed in the operable area 104 when the distance between the avatar 101 and the device is equal to or greater than the second threshold.
  • the user can hide the interface detail screen 107 that is not used simply by moving the avatar 101 away from the device, and does not need to perform complicated operations.
  • Next, the operation of an operation device displayed on an interface detail screen 107 will be briefly described. In FIG. 3, it is assumed that the interface detail screen 107 of the high-pressure feed water flow meter 102 is currently active as a result of a predetermined operation by the user. Active means that the screen is in a state of accepting user operations.
  • the user switches the active interface detail screen 107 by pressing the right key of the cross key of the game controller, for example. Then, the interface detail screen 107 of the high pressure water supply valve rear valve 103 becomes active.
  • the user opens and closes the high-pressure water supply valve rear valve 103 using the up and down buttons of the cross key, for example. According to the present embodiment, it is possible to move the avatar, display / hide the device interface, and operate the operation device with a simple operation.
  • In the above description, whether to display or hide the interface detail screen 107 is determined by the distance between the avatar 101 and the device, but the orientation of the avatar 101 may also be taken into account. For example, even if a device is located near the avatar 101, a device located behind the user is not displayed. Further, even for a device whose screen is displayed in the operable area 104, the interface detail screen 107 may be hidden when the avatar 101 turns to face another direction. As a result, the user can learn the installation positions of the devices more accurately. In addition, when devices are provided in a room, the position information of the avatar 101 may be compared with the position information of the room, and when the avatar 101 enters the room, the interface detail screens 107 of the devices provided in that room may be displayed in the operable area 104. The interface detail screens 107 may then be hidden when the avatar 101 leaves the room.
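  • The display/hide criteria just described (distance, orientation of the avatar, and room containment) can be combined as in the following sketch. The threshold values, the vector mathematics, and the room representation are assumptions added for illustration; the patent only names the criteria themselves.

```python
import math

FIRST_THRESHOLD = 5.0    # display when closer than this (assumed units)
FACING_LIMIT_DEG = 90.0  # treat devices behind the avatar as "not faced"

def is_facing(avatar_pos, avatar_dir, device_pos, limit_deg=FACING_LIMIT_DEG):
    """True if the device lies within the avatar's forward field of view."""
    to_dev = [d - a for a, d in zip(avatar_pos, device_pos)]
    dev_norm = math.sqrt(sum(c * c for c in to_dev)) or 1.0
    fwd_norm = math.sqrt(sum(c * c for c in avatar_dir)) or 1.0
    cos_angle = sum(f * t for f, t in zip(avatar_dir, to_dev)) / (dev_norm * fwd_norm)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle)))) <= limit_deg

def should_display(avatar_pos, avatar_dir, device_pos,
                   avatar_room=None, device_room=None):
    """Combine the distance, orientation, and room criteria described above."""
    if device_room is not None:                    # room-based criterion
        return avatar_room == device_room
    if math.dist(avatar_pos, device_pos) >= FIRST_THRESHOLD:
        return False                               # distance criterion
    return is_facing(avatar_pos, avatar_dir, device_pos)  # orientation criterion

# Example: a device 3 units ahead of the avatar is shown; one behind it is not.
print(should_display((0, 0, 0), (1, 0, 0), (3, 0, 0)))   # True
print(should_display((0, 0, 0), (1, 0, 0), (-3, 0, 0)))  # False
```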
  • FIG. 4 is a diagram showing a processing flow of the plant operation training apparatus 1 according to the present embodiment.
  • FIG. 5A is a first diagram used for explaining the processing flow of the plant operation training apparatus according to the first embodiment of the present invention.
  • FIG. 5B is a second diagram used for explaining the processing flow of the plant operation training apparatus according to the first embodiment of the present invention.
  • First, the positional relationship calculation unit 31 acquires, from the memory, combinations of the devices (each operation device and indicator) in the plant and their position information (step S1).
  • the position information is three-dimensional coordinate information.
  • The positional relationship calculation unit 31 may acquire the position information of the devices of the entire plant, or may successively acquire only the position information of the devices provided within a predetermined distance of the route along which the avatar 101 moves.
  • the positional relationship calculation unit 31 counts the total number of acquired devices and stores it in the memory.
  • FIG. 5A is a diagram illustrating an example of a table read by the positional relationship calculation unit 31 in step S1. This table is held in the storage unit 40.
  • From the table shown in FIG. 5A, the positional relationship calculation unit 31 acquires the value of the “TAG” field, which stores the identifier of each device, and the values of the “X coordinate”, “Y coordinate”, and “Z coordinate” columns, which indicate the position of the device in the plant, and stores them in the memory.
  • the virtual space generation unit 30 acquires user operation information from the operation means 20 and moves the avatar 101 to a place corresponding to the operation information.
  • the virtual space generation unit 30 reads the 3D model data of the plant from the storage unit 40 and generates a 3D image showing a scene entering the field of view from the position where the avatar 101 has moved. Then, the virtual space generation unit 30 outputs the newly generated 3D image to the display unit 10. In addition, the virtual space generation unit 30 outputs position information of a place where the avatar 101 exists to the memory.
  • Next, the positional relationship calculation unit 31 acquires the position information of the avatar 101 in the plant from the memory (step S2).
  • Next, using the position information acquired in steps S1 and S2, the positional relationship calculation unit 31 selects one of the devices (operation devices or displays) acquired in step S1 (device i) and calculates the distance i between the avatar 101 and the device i (step S3). Where (X1, Y1, Z1) is the position of the avatar 101 and (X2, Y2, Z2) is the position of the device i, the distance i can be obtained by calculating the square root of (X2 - X1)² + (Y2 - Y1)² + (Z2 - Z1)².
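  • Expressed in code, the distance calculation of step S3 is a plain Euclidean distance. The following is a small sketch with illustrative variable names:

```python
import math

def distance_to_device(avatar_pos, device_pos):
    """Distance i of step S3: avatar at (X1, Y1, Z1), device i at (X2, Y2, Z2)."""
    (x1, y1, z1), (x2, y2, z2) = avatar_pos, device_pos
    return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2 + (z2 - z1) ** 2)

print(distance_to_device((0.0, 0.0, 0.0), (3.0, 4.0, 0.0)))  # 5.0
```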
  • the positional relationship calculation unit 31 compares the distance i calculated in step S3 with a predetermined first threshold (step S4).
  • The first threshold is a value used to determine whether to display the interface detail screen 107 of the device i as the avatar 101 approaches it. When the distance i is smaller than the first threshold (step S4: Yes), the positional relationship calculation unit 31 outputs the identifier of the device i to the detailed screen display control unit 32, and the detailed screen display control unit 32 determines whether the interface detail screen 107 of the device i indicated by the acquired identifier is already displayed (step S5).
  • An example of the determination method in step S5 will be described with reference to FIG. 5B.
  • FIG. 5B shows an example of a table that is referred to when the detailed screen display control unit 32 determines whether the interface detailed screen 107 of the device i has already been displayed. In this figure, if the value of the “displayed flag” field is “0”, it indicates that the interface detail screen 107 of the device having the “TAG” field of the record as an identifier is not displayed in the operable area. A value of “1” indicates that the interface image of the device has already been displayed in the operable area.
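  • The tables of FIGS. 5A and 5B can be pictured together as a TAG-keyed structure, as in the following sketch. The concrete TAG values, coordinates, and image file names are placeholders; only the column meanings are taken from the description above.

```python
# Sketch of the data read in step S1 (FIG. 5A) plus the "displayed flag" of FIG. 5B.
device_table = {
    "A-101": {"x": 12.0, "y": 3.5, "z": 1.2,
              "interface_image": "a101_flowmeter.png", "displayed": 0},
    "B-101": {"x": 14.5, "y": 3.5, "z": 0.8,
              "interface_image": "b101_valve.png", "displayed": 0},
}

def is_displayed(tag):
    """Step S5: a 'displayed' flag of 1 means the detail screen is already shown."""
    return device_table[tag]["displayed"] == 1

def mark_displayed(tag, shown):
    device_table[tag]["displayed"] = 1 if shown else 0
```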
  • If the interface detail screen 107 of the device i is already displayed (step S5: Yes), the detailed screen display control unit 32 does not display the interface detail screen 107 of the device i again.
  • the case where the interface detail screen 107 of the device i is already displayed may be a case where the avatar 101 approaches the device i and moves back and forth (the range not exceeding the second threshold described later).
  • If the interface detail screen 107 of the device i is not yet displayed (step S5: No), the detailed screen display control unit 32 reads the image data of the interface detail screen 107 of the operation device or display corresponding to the identifier of the device i from the storage unit 40 and displays it in the operable area of the display unit 10 (step S6).
  • the detailed screen display control unit 32 reads the table of FIG. 5A using the identifier (“TAG”), and acquires the value of the “interface image data” column of the corresponding record. Then, the detailed screen display control unit 32 specifies the image data indicated by this value, and generates the interface detailed screen 107 of the device i based on the specified image data. When the detailed screen display control unit 32 completes these processes, it outputs a signal indicating that the process has been completed to the positional relationship calculation unit 31.
  • When the processing for the device i is completed, the positional relationship calculation unit 31 determines whether the processing has been performed for all devices (step S10); if not, the processing is repeated for the next device. The case where the distance between the position of the avatar 101 and the position of the device i is smaller than the first threshold in step S4 has been described so far. Next, the case where the distance between the avatar 101 and the next device i+1 is equal to or greater than the first threshold (step S4: No) will be described. In this case, the detailed screen display control unit 32 does not display the interface detail screen 107 of the device i+1.
  • the positional relationship calculation unit 31 compares the distance i + 1, which is the distance between the avatar 101 and the device i + 1, with the second threshold value (step S7).
  • the second threshold is a value used to determine whether or not to hide the interface detail screen 107 when the avatar 101 moves away from the device i.
  • When the distance i+1 is smaller than the second threshold (step S7: No), the processing for the device i+1 is completed (no particular processing is performed for the device i+1), and the determination in step S10 is made.
  • When the distance i+1 is equal to or greater than the second threshold (step S7: Yes), the positional relationship calculation unit 31 outputs information indicating the identifier of the device i+1 to the detailed screen display control unit 32.
  • the detailed screen display control unit 32 determines whether or not the interface detailed screen 107 of the device i + 1 indicated by the acquired identifier is already displayed in the operable area (step S8).
  • the determination method may be the same as in step S5.
  • If the interface detail screen 107 of the device i+1 is not displayed in the operable area (step S8: No), no processing is performed and the positional relationship calculation unit 31 is notified of the completion of the process. If it is displayed (step S8: Yes), the detailed screen display control unit 32 performs a process of hiding the interface detail screen 107 of the operation device or display device corresponding to the identifier of the device i+1 that is displayed in the operable area (step S9).
  • When the control for displaying or hiding the interface images of all the devices has been completed as described above (step S10), this processing flow ends.
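  • Putting steps S1 to S10 together, the per-device loop can be sketched as follows. The table layout, the threshold values, and the drawing callbacks are illustrative assumptions; the control structure (a first threshold to show a screen, a second threshold to hide it, and a displayed flag to avoid redisplaying it) follows the flow just described.

```python
import math

FIRST_THRESHOLD = 5.0    # show the detail screen when closer than this (assumed)
SECOND_THRESHOLD = 8.0   # hide it again when at least this far away (assumed)

def update_detail_screens(avatar_pos, devices, show, hide):
    """One pass of the FIG. 4 flow over all devices (steps S3 to S10).

    `devices` maps TAG -> {"pos": (x, y, z), "displayed": bool};
    `show`/`hide` stand in for the detailed screen display control unit.
    """
    for tag, dev in devices.items():                      # loop ends at step S10
        dist = math.dist(avatar_pos, dev["pos"])          # step S3
        if dist < FIRST_THRESHOLD:                        # step S4: Yes
            if not dev["displayed"]:                      # step S5: No
                show(tag)                                 # step S6
                dev["displayed"] = True
        elif dist >= SECOND_THRESHOLD:                    # step S7: Yes
            if dev["displayed"]:                          # step S8: Yes
                hide(tag)                                 # step S9
                dev["displayed"] = False
        # otherwise (between the thresholds) the current display state is kept

# Example run with stand-in display callbacks.
devs = {"A-101": {"pos": (3.0, 0.0, 0.0), "displayed": False},
        "B-101": {"pos": (20.0, 0.0, 0.0), "displayed": True}}
update_detail_screens((0.0, 0.0, 0.0), devs,
                      show=lambda t: print("show", t),
                      hide=lambda t: print("hide", t))
```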
  • FIG. 6 is a diagram showing an operation screen of the plant operation training apparatus 1 according to the present embodiment. The manner in which the detailed screen display selection unit 34 provides the user with a means for selecting the interface detail screen 107 will be described with reference to FIG. 6.
  • FIG. 6 is a 3D image when the avatar 101 is present at the plant location 100. In FIG. 6, it is assumed that the high pressure feed valve rear valve 103 and the high pressure feed water flow meter 102 are provided within a distance smaller than the first threshold value from the avatar 101.
  • the detail screen display selection unit 34 displays the interface detail screen 107 of the high-pressure feed water flow meter 102 and the interface detail screen 107 of the high-pressure feed valve rear valve 103 in the selection area 105 of the display unit 10 in an overlapping manner. Further, the detailed screen display selection unit 34 displays, for example, a red frame 111 indicating that it is a target for accepting an operation from the user on the outer frame of the interface detail screen 107 of the device displayed on the forefront.
  • When the user performs a predetermined operation to select the device whose interface detail screen 107 is displayed in the foreground, the detailed screen display control unit 32 moves that interface detail screen 107 to the operable area and displays it there. The predetermined operation may be, for example, pressing a predetermined button of the game controller or performing a click operation with a mouse. Further, when the user performs another predetermined operation indicating that the device is not selected, the detailed screen display selection unit 34 removes that interface detail screen 107 from the selection area 105 and displays the next interface detail screen 107 in the selection area 105 in the foreground. The result of the user's selection is stored in the storage unit 40 by the detailed screen display selection unit 34.
  • In this way, the number of interface detail screens 107 displayed in the operable area 104 can be reduced, and the user is spared the trouble of picking out the device to be operated from among the interface detail screens 107 displayed in the operable area 104.
  • Since the detailed screen display selection unit 34 displays the interface detail screens 107 one at a time in the selection area 105, showing the image of a different device at the same position each time a selection is completed, the user does not need to pick out the device to be selected with a mouse or the like.
  • In addition, when the user wants to make a selection again, the detailed screen display selection unit 34 may display the interface detail screens 107 in the selection area 105 again in response to a predetermined operation key being pressed. In the above description, the interface detail screen 107 of a device that is not selected is hidden; instead, it may be moved to and displayed in the non-operable area 106, and such screens may also be displayed there in a superimposed manner. In this case, by referring to the interface detail screens 107 displayed in the non-operable area 106, the user can grasp what other devices are provided. Furthermore, the operation may be such that, by selecting again a device once judged to be unnecessary, the user can display the interface detail screen 107 of that device in the operable area 104.
  • FIG. 7 is a diagram showing a processing flow of the plant operation training apparatus 1 according to the present embodiment.
  • The processing described with reference to FIG. 6 will now be described in detail using the processing flow of FIG. 7.
  • The positional relationship calculation unit 31 acquires the position information of the avatar 101 and the position information of the devices from the memory, calculates the distances between them, and identifies the devices whose distance from the avatar 101 is smaller than the first threshold. Then, the positional relationship calculation unit 31 outputs information indicating the identifiers of the identified devices to the detailed screen display selection unit 34.
  • the detailed screen display selection unit 34 reads the interface detailed screen from the storage unit 40 using the acquired identifier of the device, and displays the interface detailed screen on the selection area 105 (step S11).
  • the order of overlapping may be, for example, in the order of identifiers, or may be displayed on the front side closer to the avatar 101.
  • In step S11, the detailed screen display selection unit 34 uses, for example, the table shown in FIG. 8. In this table, a value is set for each device indicating whether “display in the operable area”, “do not display in the operable area”, or neither has been chosen for the device having that identifier. If the value of the “operable flag” field is “0”, it indicates that nothing has yet been set for the device. A value of “1” indicates that “display in the operable area” is set for the device. A value of “2” indicates that “do not display in the operable area” is set for the device. In step S11, the detailed screen display selection unit 34 reads this table, selects the devices whose “operable flag” value is “0”, and displays interface images in the selection area only for those devices.
  • For devices whose “operable flag” value is “1”, the detailed screen display selection unit 34 outputs the identifier information to the detailed screen display control unit 32, and the detailed screen display control unit 32 displays their interface images in the operable area. Devices whose flag value is “2” are not displayed.
  • the detailed screen display selection unit 34 displays the interface image of the device indicated by the TAG “A-101” in the selection area.
  • the detailed screen display control unit 32 displays the interface image of the device indicated by the TAG “B-101” in the operable area.
  • the interface image of the device indicated by the TAG “C-102” has already been set not to be displayed, so it is not displayed anywhere.
  • the detail screen display selection unit 34 performs control so that the interface detail screen 107 displayed at the forefront becomes the target of the user's selection operation. Specifically, the detailed screen display selection unit 34 focuses on the interface image to make it active (step S12). Further, the detailed screen display selection unit 34 may display a red frame on the outer peripheral portion of the interface detail screen 107 so that the user can easily understand. Next, the detailed screen display selection unit 34 detects a signal indicating the selection operation by the user from the operation unit 20 and determines whether or not the user has selected to display the interface image of the device i in the operable area ( Step S13).
  • When the detailed screen display selection unit 34 acquires a signal indicating that the device i is selected, it outputs the identifier of the device i to the detailed screen display control unit 32. Next, the detailed screen display control unit 32 moves the interface detail screen 107 of the device i from the selection area 105 to the operable area 104 and displays it there, based on the acquired identifier (step S14). Then, the detailed screen display selection unit 34 records in the storage unit 40 that “display” has been selected for the device i in the operable area 104. For example, in the case of the example of FIG. 8, “1” is set in the “operable flag” column of the record whose TAG value indicates the device i.
  • When the detailed screen display selection unit 34 acquires a signal indicating that the device i is not selected, it hides the interface image of the device i (step S15). Then, the detailed screen display selection unit 34 records in the storage unit 40 that “do not display” has been selected for the device i in the operable area. For example, in the case of the example of FIG. 8, “2” is set in the “operable flag” field of the record whose TAG value indicates the device i.
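  • The selection flow of steps S11 to S15, together with the “operable flag” values 0/1/2, can be sketched as below. The flag encoding matches the description; the function names and the way the user's answer is obtained are assumptions for illustration.

```python
def run_selection(candidates, flags, ask_user, show_in_operable, show_in_selection):
    """Sketch of steps S11-S15 for devices within the first threshold.

    flags[tag]: 0 = not yet set, 1 = display in operable area, 2 = do not display.
    ask_user(tag) stands in for the selection operation on the foreground screen.
    """
    for tag in candidates:
        flag = flags.get(tag, 0)
        if flag == 1:                 # already chosen: display directly
            show_in_operable(tag)
        elif flag == 2:               # already rejected: display nowhere
            continue
        else:                         # flag 0: offer the screen in the selection area
            show_in_selection(tag)                       # steps S11/S12
            if ask_user(tag):                            # step S13
                show_in_operable(tag)                    # step S14
                flags[tag] = 1
            else:
                flags[tag] = 2                           # step S15 (screen hidden)

# Example with canned answers instead of a real game-controller operation.
flags = {"A-101": 0, "B-101": 1, "C-102": 2}
run_selection(["A-101", "B-101", "C-102"], flags,
              ask_user=lambda t: t == "A-101",
              show_in_operable=lambda t: print("operable area:", t),
              show_in_selection=lambda t: print("selection area:", t))
print(flags)  # {'A-101': 1, 'B-101': 1, 'C-102': 2}
```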
  • FIG. 9 is a diagram showing a setting screen of the plant operation training apparatus 1 according to the present embodiment.
  • FIG. 9 shows an example of a setting screen on which a user who acts as an instructor for plant operation training can set a device for displaying the interface detail screen 107 in the selection area 105.
  • Reference numeral 110 indicates a setting screen.
  • Reference numeral 112 denotes a check box group that can set display / non-display in the selection area 105 for each device.
  • the device “operator 1” corresponding to the check box 112a checked by the user is a device to be displayed in the selection area by the detailed screen display selection unit 34.
  • The device “display 3”, corresponding to the check box 112b that is not checked by the user, is a device whose interface detail screen the detailed screen display selection unit 34 does not display in the selection area. Even if “operator 1” is checked, the detailed screen display selection unit 34 may be configured so that, once the user has set whether or not to display “operator 1” in the operable area 104, the interface detail screen 107 of the device is no longer displayed in the selection area 105.
  • Reference numeral 108 denotes an OK button.
  • When the user presses the OK button 108, the detailed screen selection setting unit 35 stores the content set by the user on this screen in the table shown in FIG. 8, and then closes the setting screen 110.
  • An example of the operation in which the detailed screen selection setting unit 35 stores the setting values in the table will be described using the table shown in FIG. 8.
  • For a device that the user has checked, values may be set as follows: when the value of the “operable flag” for the device is originally “0” or “1”, the value is kept as it is; when the value is “2”, the detailed screen selection setting unit 35 sets it to “0”.
  • If the value in the “operable flag” column is originally “2”, it means that the device has been set not to be displayed in the operable area. This is because, when the “display” check box is checked again on the setting screen 110 for such a device, it is considered appropriate to give the user another opportunity to select whether or not to display it in the selection area. Further, when the user removes the check from a device that has been checked up to now, the detailed screen selection setting unit 35 sets “2” in the “operable flag” field. By setting “2”, the interface image of the device is displayed in neither the selection area nor the operable area.
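  • The flag update performed when the settings are confirmed can be captured in a small rule, as sketched below. The rule itself (keep 0/1 when checked, reset 2 to 0 when checked, set 2 when unchecked) comes from the description above; the function name is an assumption.

```python
def updated_operable_flag(current_flag, checked):
    """Setting-screen rule applied per device when the settings are confirmed."""
    if checked:
        # A device set to "do not display" (2) is given a fresh chance (0);
        # values 0 and 1 are kept as they are.
        return 0 if current_flag == 2 else current_flag
    # Unchecking hides the device from both the selection and operable areas.
    return 2

assert updated_operable_flag(2, checked=True) == 0
assert updated_operable_flag(1, checked=True) == 1
assert updated_operable_flag(0, checked=False) == 2
```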
  • Numeral 109 indicates a cancel button.
  • When the cancel button 109 is pressed, the detailed screen selection setting unit 35 closes the setting screen 110 without updating the table in the storage unit 40.
  • This function can be used, for example, by an operation training instructor to select the devices to be displayed in accordance with the training content of the user being trained. For example, when the user being trained is a beginner, it is desirable to display only the interface images of the minimum necessary devices, and this is possible by using the settings of this embodiment. This function can also restrict the display of devices that are not normally used. Note that this embodiment can be used in combination with the first embodiment. In that case, for example, control is performed as follows.
  • The detailed screen display control unit 32 refers to the table in which the detailed screen selection setting unit 35 has recorded whether or not each device is to be displayed, and displays only the images of the devices for which “display” is set in this table.
  • In this way, the interface images of the necessary devices can be displayed even to a user who does not have the knowledge to select the necessary devices from the selection area.
  • FIG. 10 is a diagram showing an operation screen of the plant operation training apparatus 1 according to the present embodiment.
  • the operation in which the corresponding part display control unit 33 displays the part provided with the device corresponding to the interface image selected by the user in a manner different from other devices will be described with reference to FIG.
  • FIG. 10 is a 3D image showing that the interface detail screen 107 of the high-pressure water supply valve rear valve 103 is displayed in the operable area 104 when the avatar 101 moves to the location 100 in the plant by the user's operation.
  • The high-pressure water supply valve rear valve 103 is provided within a distance smaller than the first threshold from the avatar 101, and reference numeral 113 indicates the part where the high-pressure water supply valve rear valve 103 is provided.
  • The corresponding part display control unit 33 displays a red frame on the outer peripheral portion of the interface detail screen 107, and blinks the part 113 where the high-pressure water supply valve rear valve 103 is provided so that it is easy for the user to see. Moreover, when the avatar approaches within a predetermined distance of a large piece of equipment, such as a turbine, that is provided with many instruments, the corresponding part display control unit 33 may first display the large equipment prominently, for example in a different color. Then, when the user selects a device in the operable area 104, the corresponding part display control unit 33 may blink the part where that device is provided.
  • the user can know where the device to be operated is provided. With such an operation, it is possible to learn the operation method while grasping the installation location of the equipment, so that it is possible to expect the same high training effect as when performing a practical training using an actual plant.
  • FIG. 11 is a diagram showing a processing flow of the plant operation training apparatus 1 according to the present embodiment.
  • The processing described with reference to FIG. 10 will now be described in detail using the processing flow of FIG. 11.
  • It is assumed that the avatar 101 has moved in accordance with the user's instructions and is present at the location 100 in the plant, and that the interface detail screens 107 of the devices within a distance from the avatar 101 smaller than the first threshold are displayed in the operable area 104. When the user operates the operation means 20, one of the interface detail screens 107 displayed in the operable area 104 is selected.
  • the corresponding part display control unit 33 activates the interface detail screen 107 and performs control to display a red frame on the outer peripheral portion (step S31).
  • the corresponding part display control unit 33 blinks and displays the 3D image of the device corresponding to the selected interface detail screen 107.
  • the corresponding part display control unit 33 specifies a program for generating a 3D image of the device using the identifier of the device corresponding to the selected interface detail screen 107.
  • the corresponding part display control unit 33 instructs the specified program to change or blink the color of the part (step S32).
  • an RGB value indicating the changed color may be specified in a program for generating a 3D image, and the device may be blinked and the color may be changed.
  • it is preferable to select a conspicuous color so that the color to be changed or blinked is easily noticeable by the user.
  • the corresponding part display control unit 33 detects whether another interface display screen displayed in the operable area is selected by the user's selection operation (step S33). When another interface screen is selected, the processing from step S31 is repeated. In addition, for a non-selected device, an instruction is given to the program that generates a 3D image of the device to return to the original display mode.
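  • The highlighting flow of steps S31 to S33 can be sketched as follows. The red frame and the blinking are the display modes named in the description; how the 3D model is actually driven (the `set_highlight` callback) is an assumption.

```python
def handle_selection(selected_tag, displayed_tags, set_frame, set_highlight):
    """Sketch of steps S31-S33: highlight the selected device, revert the others.

    set_frame(tag, on)     -> draw or remove the red frame on the detail screen.
    set_highlight(tag, on) -> blink/recolor (or restore) the device's 3D image.
    """
    for tag in displayed_tags:
        is_selected = (tag == selected_tag)
        set_frame(tag, is_selected)       # step S31 for the selected screen
        set_highlight(tag, is_selected)   # step S32, and revert non-selected ones

# Example: switching the selection from the flow meter to the valve.
shown = ["FI-102", "V-103"]
handle_selection("V-103", shown,
                 set_frame=lambda t, on: print("frame", t, on),
                 set_highlight=lambda t, on: print("blink", t, on))
```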
  • FIG. 12A is a first diagram showing parameter values displayed on the interface details screen of the plant operation training apparatus 1 according to the present embodiment.
  • FIG. 12B is a second diagram showing parameter values displayed on the interface details screen of the plant operation training apparatus 1 according to the present embodiment.
  • FIGS. 12A and 12B show, for the case where the avatar 101 is present at the location 100, the list of devices provided at a distance from the avatar 101 smaller than the first threshold, together with the parameter values that the storage unit 40 holds for the interface detail screen of each device.
  • FIG. 12A shows the parameter values of each device before the interface detail screen 107 of each device is displayed in the operable area.
  • the storage unit 40 does not yet have a parameter value to be displayed on the interface of each device.
  • FIG. 12B shows parameter values of each device when the interface detail screen 107 of each device is displayed in the operable area.
  • the storage unit 40 has each parameter value acquired from the plant simulation apparatus 2.
  • If the parameter values of all the devices were constantly acquired from the plant simulation apparatus 2, the processing capacity of the CPU of the plant operation training apparatus 1 would be consumed by the communication processing and the calculation load would increase. As a result, the 3D image generation processing for operation training would be affected and the 3D images could not be displayed smoothly. In the present embodiment, therefore, only the necessary parameter values are extracted and communicated to reduce the communication load, thereby reducing the calculation load of the plant operation training apparatus 1 that displays the 3D virtual plant images. As a result, the 3D images can be displayed smoothly.
  • the plant operation training system which consists of the plant operation training apparatus 1 and the plant simulation apparatus 2 (and the training simulator apparatus of the central control room) is designed on the assumption that a plurality of users use it at the same time.
  • For example, a user A trains in the operation method of a device A using a plant operation training apparatus 1A, while another user B trains in the operation method of a device B using a plant operation training apparatus 1B. The plant operation training apparatuses 1A and 1B are both connected to the plant simulation apparatus 2 via a network. The plant simulation apparatus 2 calculates the influence of the operations of both users A and B on the state of the plant and transmits the result to the plant operation training apparatuses 1A and 1B. In this way, the system provides operation training in which a single plant is operated from a plurality of locations at the same time.
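  • A minimal way to picture this shared-simulator arrangement is the sketch below: one simulator object holds the single plant state, and both training apparatuses apply operations to it and read back the resulting values. Networking, the central control room simulator, and the state model are all simplifying assumptions here.

```python
class SharedPlantSimulator:
    """Stand-in for plant simulation device 2 serving several training apparatuses."""
    def __init__(self):
        self.state = {"V-A": 0.0, "V-B": 0.0}  # e.g. valve openings operated by A and B

    def apply_operation(self, tag, value):
        self.state[tag] = value     # both users' operations act on the same plant state

    def latest_value(self, tag):
        return self.state[tag]

simulator = SharedPlantSimulator()
# Apparatus 1A (user A) and apparatus 1B (user B) operate different devices...
simulator.apply_operation("V-A", 30.0)
simulator.apply_operation("V-B", 75.0)
# ...and each apparatus sees the combined effect when it requests values.
print(simulator.latest_value("V-A"), simulator.latest_value("V-B"))
```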
  • FIG. 13 is a diagram illustrating a processing flow of the plant operation training apparatus 1 according to the present embodiment.
  • The processing described with reference to FIGS. 12A and 12B will be described using the processing flow of FIG. 13.
  • This processing flow is executed immediately before the detailed screen display control unit 32 displays, in the operable area 104, the interface detail screens of the devices whose distance from the avatar 101 is smaller than the first threshold.
  • First, the detailed screen display control unit 32 acquires, in the same manner as in step S4 of FIG. 4, the identifier (TAG) of one of the devices whose interface detail screen 107 is to be displayed in the operable area 104 (step S41).
  • the parameter acquisition unit 36 requests the latest parameter value for the TAG from the plant simulation device 2 via the communication unit 50 (step S42).
  • the plant simulation apparatus 2 calculates the plant state quantity at each time and holds it for each TAG information.
  • the plant simulation apparatus 2 transmits the latest parameter value in response to the request from the parameter acquisition unit 36.
  • the parameter acquisition unit 36 acquires the parameter value of the device corresponding to the TAG requested from the plant simulation device 2 via the communication unit 50 (step S43).
  • the parameter acquisition unit 36 then outputs the acquired parameter value to the detailed screen display control unit 32.
  • the detailed screen display control unit 32 outputs the interface detailed screen 107 including the acquired parameter value to the operable area (step S44).
  • the detailed screen display control unit 32 determines whether or not the above processing has been performed for all devices to be displayed in the operable area 104 (step S45), and repeats the processing from steps S41 to S44 for all devices.
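  • The on-demand fetch of steps S41 to S45 amounts to the loop sketched below: parameter values are requested only for the devices about to be shown, immediately before their detail screens are drawn. The function names and the rendering callback are assumptions.

```python
def fetch_and_display(tags_to_display, request_latest_value, render_detail_screen):
    """Sketch of steps S41-S45, run just before the detail screens are displayed.

    request_latest_value(tag)    -> parameter value from plant simulation device 2.
    render_detail_screen(tag, v) -> draw the interface detail screen with the value.
    """
    for tag in tags_to_display:                    # steps S41 and S45
        value = request_latest_value(tag)          # steps S42 and S43
        render_detail_screen(tag, value)           # step S44

# Example with a canned simulator response; only nearby devices are queried,
# which is what keeps the communication and calculation load small.
fetch_and_display(["FI-102", "V-103"],
                  request_latest_value=lambda t: {"FI-102": 52.3, "V-103": 40.0}[t],
                  render_detail_screen=lambda t, v: print(t, "->", v))
```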
  • the plant operation training apparatus 1 described above has a computer inside.
  • Each process of the plant operation training apparatus 1 described above is stored in a computer-readable recording medium in the form of a program, and the above process is performed by the computer reading and executing this program.
  • the computer-readable recording medium means a magnetic disk, a magneto-optical disk, a CD-ROM, a DVD-ROM, a semiconductor memory, or the like.
  • the computer program may be distributed to the computer via a communication line, and the computer that has received the distribution may execute the program.
  • The program may realize only a part of the functions described above. Furthermore, the program may be one that realizes the functions described above in combination with a program already recorded in the computer system.
  • As described above, according to the present invention, operation training closer to actual operation can be performed while grasping the positional relationship with the equipment provided in the plant, so that a higher training effect can be achieved.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

 Human model data showing humans in a virtual space is acquired and the actions of avatars displayed in the virtual space are operated on the basis of the human model data; plant model data showing a plant in the virtual space is acquired, and the positional relationships between equipment provided to the plant displayed in the virtual space, and the avatars, is calculated on the basis of the plant model data; and interface detail screen model data, showing the interface in the virtual space of the interface detail screens of equipment fulfilling prescribed conditions related to the positional relationships, is acquired and displayed on the basis of the interface detail screen model data.

Description

Plant operation training apparatus, control method, program, and plant operation training system
The present invention relates to a plant operation training apparatus, a control method, a program, and a plant operation training system.
This application claims priority based on Japanese Patent Application No. 2013-213698 filed in Japan on October 11, 2013, the contents of which are incorporated herein by reference.
In recent years, there are operation training simulation apparatuses for plants and the like that use 3D data. One of the features of an operation training simulation apparatus using 3D data is that the operator can get a sense of actually operating the equipment in the plant, so that a high training effect can be obtained.
For example, Patent Literature 1 describes a simulation device that can learn a route to a destination by displaying an avatar in a 3D virtual space in a plant and moving the avatar by an operator's operation. According to this technology, it is possible to learn the route to the destination while displaying the scenery seen when a person actually walks while moving to the destination and feeling the sense of real distance and realism.
Japanese Patent No. 3214776
 However, the apparatus of Patent Literature 1 is for learning a route by moving an avatar. For example, although it allows the user to walk around the plant and learn where and what kind of equipment is installed, it does not allow the user to learn the actual operating procedures of the equipment in the plant.
 In addition, most plant operation training simulation apparatuses that have existed so far use a method in which a device provided in the plant is selected by a mouse operation and operation training is performed on that device. With this method, however, it is difficult for the user to grasp the positional relationship that an actual operator experiences, namely that display contents can be checked and operations can be performed only when the operator actually approaches an object, so even if a 3D image of the plant equipment is displayed, the sense of presence and reality is lacking.
 The present invention provides a plant operation training apparatus, a plant operation training method, and a program capable of solving the above-described problems.
 According to a first aspect of the present invention, a plant operation training apparatus includes: an operation means that acquires human model data representing a person in a virtual space and operates the motion of an avatar displayed in the virtual space on the basis of the human model data; a positional relationship calculation unit that acquires plant model data representing a plant in the virtual space and calculates the positional relationship between the avatar and equipment provided in the plant displayed in the virtual space on the basis of the plant model data; and a detail screen display control unit that acquires interface detail screen model data representing, in the virtual space, the interface detail screen of equipment satisfying a predetermined condition regarding the positional relationship, and displays the interface detail screen on the basis of the interface detail screen model data.
 According to a second aspect of the present invention, a control method for a plant operation training apparatus includes: acquiring human model data representing a person in a virtual space and operating the motion of an avatar displayed in the virtual space on the basis of the human model data; acquiring plant model data representing a plant in the virtual space and calculating the positional relationship between the avatar and equipment provided in the plant displayed in the virtual space on the basis of the plant model data; and acquiring interface detail screen model data representing, in the virtual space, the interface detail screen of equipment satisfying a predetermined condition regarding the positional relationship, and displaying the interface detail screen on the basis of the interface detail screen model data.
 According to a third aspect of the present invention, a program causes a computer of a plant operation training apparatus to function as: means for acquiring human model data representing a person in a virtual space and operating the motion of an avatar displayed in the virtual space on the basis of the human model data; means for acquiring plant model data representing a plant in the virtual space and calculating the positional relationship between the avatar and equipment provided in the plant displayed in the virtual space on the basis of the plant model data; and means for acquiring interface detail screen model data representing, in the virtual space, the interface detail screen of equipment satisfying a predetermined condition regarding the positional relationship, and displaying the interface detail screen on the basis of the interface detail screen model data.
 According to a fourth aspect of the present invention, a plant operation training system includes: a plant operation training apparatus including an operation means that acquires human model data representing a person in a virtual space and operates the motion of an avatar displayed in the virtual space on the basis of the human model data, a positional relationship calculation unit that acquires plant model data representing a plant in the virtual space and calculates the positional relationship between the avatar and equipment provided in the plant displayed in the virtual space on the basis of the plant model data, a detail screen display control unit that acquires interface detail screen model data representing, in the virtual space, the interface detail screen of equipment satisfying a predetermined condition regarding the positional relationship and displays the interface detail screen on the basis of the interface detail screen model data, and a parameter acquisition unit that requests values of parameters to be displayed on the interface when the detail screen display control unit starts displaying the interface detail screen; and a plant simulation apparatus that calculates parameter values indicating the states of the equipment provided in the plant and transmits the calculated parameter values in response to the request.
 According to the above-described aspects of the present invention, operation training closer to the actual operation can be performed in the 3D virtual space while grasping the positional relationship with the equipment provided in the plant, so a higher training effect can be obtained.
A functional block diagram of a plant operation training apparatus according to the first to fifth embodiments of the present invention.
A diagram showing the regions on the operation screen of the plant operation training apparatus according to the first to fifth embodiments of the present invention.
A diagram showing an operation screen of the plant operation training apparatus according to the first embodiment of the present invention.
A diagram showing a processing flow of the plant operation training apparatus according to the first embodiment of the present invention.
A first diagram used for explaining the processing flow of the plant operation training apparatus according to the first embodiment of the present invention.
A second diagram used for explaining the processing flow of the plant operation training apparatus according to the first embodiment of the present invention.
A diagram showing an operation screen of the plant operation training apparatus according to the second embodiment of the present invention.
A diagram showing a processing flow of the plant operation training apparatus according to the second embodiment of the present invention.
A diagram used for explaining the processing flow of the plant operation training apparatus according to the second embodiment of the present invention.
A diagram showing a setting screen of the plant operation training apparatus according to the third embodiment of the present invention.
A fourth diagram showing an operation screen of the plant operation training apparatus according to the fourth embodiment of the present invention.
A fourth diagram showing a processing flow of the plant operation training apparatus according to the fourth embodiment of the present invention.
A first diagram showing parameter values displayed on the interface detail screen of the plant operation training apparatus according to the fifth embodiment of the present invention.
A second diagram showing parameter values displayed on the interface detail screen of the plant operation training apparatus according to the fifth embodiment of the present invention.
A fifth diagram showing a processing flow of the plant operation training apparatus according to the fifth embodiment of the present invention.
<First embodiment>
 Hereinafter, a plant operation training apparatus according to an embodiment of the present invention will be described with reference to FIGS. 1 to 5.
 FIG. 1 is a schematic functional block diagram of a plant operation training apparatus according to the first to fifth embodiments.
 The configuration described with this block diagram is the same in the second to fifth embodiments described later.
 In this figure, reference numeral 1 denotes a plant operation training apparatus. The plant operation training apparatus 1 reproduces a plant and an avatar, which represents the user, in a 3D virtual space using their 3D model data, and provides the user with a function for performing simulated operation training on the equipment provided in the plant. The plant operation training apparatus 1 is, for example, a PC or a server apparatus capable of executing a program that realizes this function.
 Reference numeral 2 denotes a plant simulation apparatus. The plant simulation apparatus 2 is an apparatus that simulates the behavior of the plant by calculating state quantities of the plant, such as the output of a gas turbine and the pressure and temperature inside a compressor. The plant simulation apparatus is connected to the plant operation training apparatus 1 via a network. It receives, from the plant operation training apparatus 1, a signal indicating the operation that the avatar has performed on the equipment, and simulates the behavior of the plant based on that operation. The signal indicating the operation includes, for example, information indicating that a valve has been opened or closed, or that the power of a certain control device has been turned ON or OFF. The plant simulation apparatus 2 transmits the calculated state quantities and the like to the plant operation training apparatus 1. The user performs operations that simulate having the avatar of the plant operation training apparatus 1 check the indicators provided in the plant, and thereby learns what results follow from the operations performed through the avatar. In this way, the user can learn how to operate the equipment provided in the plant.
 The plant simulation apparatus 2 is also connected to a training simulation apparatus (not shown) that simulates a central control room. The plant simulation apparatus 2 receives operations from a user of the central control room simulation apparatus and simulates the behavior of the plant in the same way.
 The plant operation training apparatus 1 is a part of the plant operation training system configured as described above.
 As shown in FIG. 1, the plant operation training apparatus 1 includes a display unit 10, an operation means 20, a virtual space generation unit 30, a storage unit 40, and a communication unit 50.
 The display unit 10 is, for example, a PC monitor or a screen onto which a projector projects images. The user looks at the 3D images of the equipment in the plant output to the display unit 10 and performs the operations necessary for training.
 The operation means 20 is a mouse, a game controller, a joystick, or the like. The user controls the motion of the avatar using these operation means. In this embodiment, operation is possible not only with a mouse but also with a game controller or a joystick. This allows the user to move smoothly through the plant represented in the 3D virtual space and to receive training with a greater sense of presence. In particular, when training is performed with 3D images displayed on a large screen, the distance between the user's hands and the screen makes mouse operation difficult. In this embodiment, the 3D images provided by the plant operation training apparatus 1 are designed so that, rather than selecting a target with the operation means 20, the interface screen of an operation target is displayed when the avatar approaches the equipment in the plant, which makes operation with a game controller easy even when a large screen is used.
 The virtual space generation unit 30 reads the 3D plant model data and the 3D human model data from the storage unit 40, and reproduces the state of the plant and the avatar in the 3D virtual space. It also acquires operation signals from the operation means 20, moves the avatar according to the user's operations, generates a 3D image of the place to which the avatar has moved, and outputs it to the display unit 10. The virtual space generation unit 30 includes a positional relationship calculation unit 31, a detail screen display control unit 32, a corresponding part display control unit 33, a detail screen display selection unit 34, a detail screen selection setting unit 35, and a parameter acquisition unit 36.
 The positional relationship calculation unit 31 calculates the positional relationship between the avatar and the equipment provided in the plant. The equipment provided in the plant refers to large facilities such as turbines and boilers, and to the operation devices and indicators provided on those facilities. The positional relationship refers to the distance between the avatar and the equipment, whether the avatar is in a closed space (such as a room) in which the equipment is installed, or whether the avatar is facing the equipment.
 When the calculation result of the positional relationship calculation unit 31 satisfies a predetermined condition, the detail screen display control unit 32 displays, at a predetermined position, the interface detail screen of the operation device or indicator whose positional relationship satisfies that condition. The interface detail screen is an image in which the interface of the operation device or indicator is enlarged so that the user can read it. The values of the various parameters displayed on the interface are values acquired from the plant simulation apparatus 2. The model data of the interface detail screen is stored in the storage unit 40 in association with the position information of the location where the device having that interface is installed.
 The corresponding part display control unit 33 displays the part where the indicator or operation device having the interface shown on the interface detail screen displayed by the detail screen display control unit 32 is installed, in a manner different from other equipment. A manner different from other equipment means, for example, blinking only that part or displaying it in a conspicuous color. The corresponding part display control unit 33 may also display the large facility itself on which the indicator or operation device is installed in a different manner.
 The detail screen display selection unit 34 provides the user with a means for selecting which interface detail screens the detail screen display control unit 32 displays. Specifically, the detail screen display selection unit 34 displays the interface detail screens of a plurality of devices stacked at a predetermined position on the display unit 10, and prompts the user to choose, starting from the frontmost detail screen, whether or not to display each one. The detail screen display selection unit 34 provides this function so that interface detail screens of unnecessary devices are not displayed, for example when many instruments are installed at a certain location in the plant.
 The detail screen selection setting unit 35 provides a function for specifying which selectable devices' interface detail screens the detail screen display selection unit 34 displays, and for saving that setting. The detail screen selection setting unit 35 provides this function so that interface detail screens of devices not used by the user are not displayed. Devices not used by the user are, for example, devices that a training instructor judges to be unnecessary for the level of the user receiving plant operation training, or devices that must be used only in an emergency.
 The parameter acquisition unit 36 requests, from the plant simulation apparatus 2, the parameter values to be displayed on an interface when the detail screen display control unit 32 starts displaying the corresponding interface detail screen. The parameter acquisition unit 36 requests the parameter values via the communication unit 50 and acquires them from the plant simulation apparatus 2. The parameter values are state quantities, such as pressures and temperatures of the plant, calculated by the plant simulation apparatus 2. The plant simulation apparatus 2 calculates the pressure, temperature, and so on at the positions where instruments such as pressure gauges and thermometers are installed in the plant, and transmits these state quantities to the plant operation training apparatus 1 in response to the request from the parameter acquisition unit 36. In the plant operation training apparatus 1, the detail screen display control unit 32 then displays the interface detail screen of each device with the parameter values acquired by the parameter acquisition unit 36 set in it.
 In addition, when the user operates an operation device through the avatar, the parameter acquisition unit 36 transmits a signal indicating the operation content to the plant simulation apparatus 2 via the communication unit 50.
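The exchange between the parameter acquisition unit 36 and the plant simulation apparatus 2 can be pictured as a simple request/response protocol. The following is a minimal Python sketch of that idea; the class names, message fields, and transport layer are assumptions introduced for the example and are not specified in this document.

```python
from dataclasses import dataclass

@dataclass
class ParameterRequest:
    # Identifier (TAG) of the device whose interface detail screen was just opened.
    tag: str

@dataclass
class OperationSignal:
    # Operation performed through the avatar, e.g. a valve opening degree or power ON/OFF.
    tag: str
    action: str            # assumed action name, e.g. "set_opening" or "power"
    value: float           # e.g. 45.0 (% opening) or 1.0 (ON)

class SimulatorClient:
    """Hypothetical stand-in for communication unit 50 talking to plant simulation apparatus 2."""

    def __init__(self, transport):
        self.transport = transport  # assumed to provide send() and receive()

    def request_parameters(self, tag: str) -> dict:
        # Ask the simulator for the state quantities (pressure, temperature, flow, ...)
        # to be shown on the interface detail screen of device `tag`.
        self.transport.send(ParameterRequest(tag))
        return self.transport.receive()

    def send_operation(self, signal: OperationSignal) -> None:
        # Forward the user's operation so the simulator can update the plant behavior.
        self.transport.send(signal)
```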
 The storage unit 40 stores the 3D model data of the plant, the 3D model data of the person, the interface detail screen model data of the equipment, and the like. It also stores the settings made by the detail screen selection setting unit 35. Note that the model data of a device's interface detail screen may be 2D image data.
 The communication unit 50 communicates with other apparatuses. For example, the communication unit 50 transmits and receives parameter value information on the plant state quantities.
 The virtual space generation unit 30 is a function provided by a CPU (Central Processing Unit) of the plant operation training apparatus 1 executing a program.
 FIG. 2 is a diagram showing the display regions on the operation screen of the plant operation training apparatus 1 according to this embodiment. The regions described here are the same in the second to fifth embodiments.
 The positions where interface detail screens are displayed, and their meanings, will be described with reference to FIG. 2.
 Reference numeral 104 denotes the region in which the detail screen display control unit 32 displays the interface detail screens of the equipment. Hereinafter, this region is referred to as the operable area. The interface detail screens of devices displayed in the operable area 104 are the targets of the user's operations.
 Reference numeral 105 denotes the region in which the detail screen display selection unit 34 displays interface detail screens stacked on top of one another. Hereinafter, this region is referred to as the selection area. The detail screen display control unit 32 displays in the operable area only the interface images of the devices that the user has selected from the candidates displayed in the selection area 105 by the detail screen display selection unit 34.
 Reference numeral 106 denotes the region in which interface detail screens not selected by the user are displayed. Hereinafter, this region is referred to as the non-operable area.
 Reference numeral 107 denotes an interface detail screen. An interface detail screen shows the interface of an indicator or operation device. The user can learn the state of the plant by looking at the interface detail screen 107 of an indicator, and can operate a device by using the operation means 20 on the interface detail screen 107 of an operation device.
 FIG. 3 is a diagram showing an operation screen of the plant operation training apparatus 1 according to this embodiment.
 The operation in which the detail screen display control unit 32 displays the interface detail screen 107 of an operation device or indicator according to the positional relationship calculated by the positional relationship calculation unit 31 will be described with reference to FIG. 3.
 Reference numeral 100 denotes a 3D image simulating a certain place inside the plant. Reference numeral 101 denotes the avatar. It is assumed that the avatar 101 has reached the place 100 in the plant through the user's operation of the operation means 20. When the virtual space generation unit 30 acquires an operation signal from the operation means 20, it calculates the movement distance with the avatar 101's position before the operation as the reference. The movement distance may be calculated, for example, such that the avatar advances a predetermined distance each time a predetermined button is pressed. Each time the avatar 101 moves, the virtual space generation unit 30 calculates the avatar's position information in the plant and outputs it to memory. The virtual space generation unit 30 also reproduces the scenery of the plant around the destination based on the position information of the avatar 101's destination. The place 100 is a 3D image generated by the virtual space generation unit 30 in this way. The 3D model data of the plant stored in the storage unit 40 includes the position information of the equipment in the plant.
 Each time the avatar 101 moves, the positional relationship calculation unit 31 calculates the distance between the avatar 101 and the equipment provided in the plant using their respective position information in the plant.
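As a concrete illustration of this movement model, the sketch below advances the avatar by a fixed step per button press and records its plant coordinates after every move. The step size, the turn keys, and the AvatarState class are assumptions made for the example, not values given in this document.

```python
from dataclasses import dataclass
import math

STEP = 0.5  # assumed movement distance per button press, in metres

@dataclass
class AvatarState:
    x: float
    y: float
    z: float
    heading_rad: float  # direction the avatar is facing in the X-Y plane

def move_avatar(avatar: AvatarState, button: str) -> AvatarState:
    """Advance or turn the avatar by a fixed amount when a movement button is pressed."""
    if button == "forward":
        avatar.x += STEP * math.cos(avatar.heading_rad)
        avatar.y += STEP * math.sin(avatar.heading_rad)
    elif button == "turn_left":
        avatar.heading_rad += math.radians(15)
    elif button == "turn_right":
        avatar.heading_rad -= math.radians(15)
    # After every move the new position would be written to memory so that the
    # positional relationship calculation unit 31 can recompute the distances.
    return avatar
```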
 In FIG. 3, it is assumed that a high-pressure feed water flow meter 102 and a high-pressure feed water valve rear valve 103 are installed near the place 100, and that when the avatar 101 reaches the place 100, the distances from the avatar 101 to the high-pressure feed water flow meter 102 and to the high-pressure feed water valve rear valve 103 are within a predetermined distance (a first threshold).
 The detail screen display control unit 32 then acquires the model data of the interface detail screens of the high-pressure feed water valve rear valve and the high-pressure feed water flow meter from the storage unit 40 and displays their interface detail screens 107 in the operable area 104. The interface detail screen 107 of the high-pressure feed water flow meter 102 and the interface detail screen 107 of the high-pressure feed water valve rear valve 103 are displayed in the operable area 104, superimposed in front of the plant 3D image.
 From the interface detail screen 107, the user can see what value the high-pressure feed water flow meter 102 is indicating. By operating the interface detail screen 107 with the operation means 20, the user can adjust the opening of the high-pressure feed water valve rear valve 103 and change the flow rate.
 In conventional 3D simulation operation training apparatuses, to open or close a valve, for example, the user first moves the mouse to the vicinity of the 3D image of the valve, selects the valve with a click operation, gives a valve open/close instruction with further click operations, and, when the operation is complete, clicks the X mark at the upper right corner of the valve image window to close the window.
 When such operations are required, it is difficult for the user to grasp the positional relationship between the operator and the equipment that an actual operator in a real plant would experience, so the user finds it hard to feel a sense of presence, and the training effect is also considered to suffer.
 According to this embodiment, the user only has to move the avatar 101 and approach the equipment that is the target of the operation training, and the detail screen display control unit 32 displays the interface detail screen of that equipment, so the user does not need to perform complicated operations unrelated to the training. The user can therefore grasp the positional relationship with the equipment and engage in operation training with a greater sense of presence.
 In this embodiment, not only a mouse but also a game controller can be used. While many 3D role-playing games and the like with a strong sense of reality are available, using a game controller, which has been developed as an operation means better suited to them, makes movement in the 3D virtual space of the plant by the plant operation training apparatus 1 smooth and brings the operation training closer to real training.
 Also, when a game controller is used, it becomes difficult to perform an operation such as selecting the device to be operated. However, as described above, the interface detail screen 107 of a device is displayed simply when the avatar 101 approaches the device, so no selection operation is necessary. To move the avatar 101 with the game controller, for example, a joystick is used.
 FIG. 3 was used to describe the operation when the avatar 101 approaches a device. The operation in which the detail screen display control unit 32 hides the interface detail screen 107 when the avatar 101 moves away from a device is described below. When the avatar 101 moves away from a device, the positional relationship calculation unit 31 again calculates the distance between them. When the distance between the avatar 101 and the device becomes equal to or greater than a second threshold, the detail screen display control unit 32 hides the interface detail screen 107 displayed in the operable area 104.
 The user can thus hide an interface detail screen 107 that is not being used simply by moving the avatar 101 away from the device, without performing any complicated operation.
 Next, operation of an operation device displayed on an interface detail screen 107 will be described briefly. In FIG. 3, assume that the interface detail screen 107 of the high-pressure feed water flow meter 102 is currently active as a result of a predetermined operation by the user. Being active means being in a state of accepting the user's operations. To operate the high-pressure feed water valve rear valve 103 from this state, the user switches the active interface detail screen 107 by, for example, pressing the right key of the game controller's directional pad. The interface detail screen 107 of the high-pressure feed water valve rear valve 103 then becomes active. The user then opens or closes the high-pressure feed water valve rear valve 103 using, for example, the up and down buttons of the directional pad.
 According to this embodiment, moving the avatar, showing/hiding the device interfaces, and operating the operation devices can all be done with simple operations.
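A minimal sketch of this controller handling is shown below, assuming a simple list of currently displayed interface detail screens. The class names, key identifiers, and the 5% opening step are illustrative assumptions, not values specified in this document.

```python
from dataclasses import dataclass, field

@dataclass
class InterfaceScreen:
    tag: str               # device identifier, e.g. a valve or flow meter TAG
    is_valve: bool
    opening_pct: float = 0.0

@dataclass
class OperableArea:
    screens: list = field(default_factory=list)  # InterfaceScreen instances on display
    active_index: int = 0

    def handle_key(self, key: str) -> None:
        if not self.screens:
            return
        if key == "dpad_right":
            # Switch which interface detail screen is active (accepts operations).
            self.active_index = (self.active_index + 1) % len(self.screens)
        elif key in ("dpad_up", "dpad_down"):
            screen = self.screens[self.active_index]
            if screen.is_valve:
                # Adjust the valve opening on the active screen.
                step = 5.0 if key == "dpad_up" else -5.0
                screen.opening_pct = min(100.0, max(0.0, screen.opening_pct + step))
                # Here the operation content would also be sent to the plant
                # simulation apparatus via the parameter acquisition unit 36.
```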
 In the example above, whether to show or hide the interface detail screen 107 is determined by the distance between the avatar 101 and the device, but the orientation of the avatar 101 may also be added to the determination. For example, even a device located near the avatar 101 is not displayed if it is behind the avatar, and the interface detail screen 107 of a device already displayed in the operable area 104 may be hidden when the avatar 101 turns to face another direction. This allows the user to learn the positions where the equipment is installed more accurately.
 Also, when equipment is installed in a room, the position information of the avatar 101 may be compared with the position information of the room, and the interface detail screens 107 of the equipment in that room may be displayed in the operable area 104 when the avatar 101 enters the room and hidden when the avatar 101 leaves the room.
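These two additional conditions, facing direction and room containment, could be checked as in the following sketch. The axis-aligned room bounds, the 90-degree field of view, and the helper names are assumptions chosen for illustration only.

```python
import math

def is_facing(avatar_xy, heading_rad, device_xy, fov_deg=90.0):
    """True if the device lies within an assumed field of view in front of the avatar."""
    dx = device_xy[0] - avatar_xy[0]
    dy = device_xy[1] - avatar_xy[1]
    angle_to_device = math.atan2(dy, dx)
    # Normalize the angular difference to the range [-pi, pi).
    diff = (angle_to_device - heading_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= math.radians(fov_deg / 2)

def is_in_room(avatar_xyz, room_min_xyz, room_max_xyz):
    """True if the avatar's position lies inside an axis-aligned room volume."""
    return all(lo <= p <= hi for p, lo, hi in zip(avatar_xyz, room_min_xyz, room_max_xyz))
```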
 FIG. 4 is a first diagram showing the processing flow of the display apparatus according to this embodiment.
 FIG. 5A is a first diagram used for explaining the processing flow of the plant operation training apparatus according to the first embodiment of the present invention. FIG. 5B is a second diagram used for explaining the processing flow of the plant operation training apparatus according to the first embodiment of the present invention.
 The processing described with FIG. 3 will now be described in detail using the processing flow of FIG. 4.
 First, the positional relationship calculation unit 31 acquires from the storage unit 40 the combinations of the equipment in the plant (each operation device and indicator) and their position information (step S1). Here, position information means three-dimensional coordinate information. At this time, the positional relationship calculation unit 31 may acquire the position information of the equipment in the entire plant, or may successively acquire only the position information of equipment located within a predetermined distance based on the route along which the avatar 101 moves. The positional relationship calculation unit 31 counts the total number of acquired devices and stores it in memory.
 FIG. 5A shows an example of the table that the positional relationship calculation unit 31 reads in step S1. This table is held in the storage unit 40. From the table shown in FIG. 5A, the positional relationship calculation unit 31 acquires the values of the "TAG" column, which stores the identifier of each device, and of the "X coordinate", "Y coordinate", and "Z coordinate" columns, which indicate the position of each device in the plant, and stores them in memory.
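The table of FIG. 5A can be represented, for example, as a list of simple records keyed by the device TAG. The field names below mirror the columns described in the text, while the concrete values and the loader function are purely illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class DeviceRecord:
    tag: str              # device identifier ("TAG" column)
    x: float              # "X coordinate" in the plant
    y: float              # "Y coordinate"
    z: float              # "Z coordinate"
    interface_image: str  # "interface image data" used to build the detail screen

def load_device_table() -> list:
    # Illustrative contents only; in the apparatus this table is read from storage unit 40.
    return [
        DeviceRecord("A-101", 12.0, 3.5, 1.2, "flow_meter_panel.png"),
        DeviceRecord("B-101", 12.5, 3.5, 1.0, "valve_panel.png"),
    ]
```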
 Next, the virtual space generation unit 30 acquires the user's operation information from the operation means 20 and moves the avatar 101 to the location corresponding to the operation information. The virtual space generation unit 30 reads the 3D model data of the plant from the storage unit 40 and generates a 3D image showing the scenery that enters the field of view from the position to which the avatar 101 has moved. It then outputs the newly generated 3D image to the display unit 10 and outputs the position information of the location where the avatar 101 is to memory. The positional relationship calculation unit 31 acquires the position information of the avatar 101 in the plant from memory (step S2).
 Next, using the position information acquired in steps S1 and S2, the positional relationship calculation unit 31 calculates, for one of the devices acquired in step S1 (device i, an operation device or indicator), the distance i between the avatar 101 and device i (step S3). If the position information of the location where the avatar 101 is present is (X1, Y1, Z1) and the position information of device i is (X2, Y2, Z2), the distance i can be obtained by calculating the square root of {(X2-X1)² + (Y2-Y1)² + (Z2-Z1)²}.
 Next, the positional relationship calculation unit 31 compares the distance i calculated in step S3 with a predetermined first threshold (step S4). When the distance i is smaller than the first threshold (step S4 = Yes), the positional relationship calculation unit 31 outputs the identifier of device i to the detail screen display control unit 32. The first threshold is the value used to determine whether or not the interface detail screen 107 of device i should be displayed for the avatar 101.
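In code, this distance check reduces to a one-line Euclidean distance computation. The function below is a direct sketch of the formula above; the threshold value itself is an assumed placeholder, since the document does not specify it.

```python
import math

FIRST_THRESHOLD = 3.0  # assumed value; the document does not specify the threshold

def distance(avatar_pos, device_pos):
    """Euclidean distance between avatar (X1, Y1, Z1) and device (X2, Y2, Z2)."""
    return math.sqrt(sum((d - a) ** 2 for a, d in zip(avatar_pos, device_pos)))

# Example: show the detail screen when the avatar is closer than the first threshold.
show_detail_screen = distance((0.0, 0.0, 0.0), (1.0, 2.0, 1.0)) < FIRST_THRESHOLD
```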
 The detail screen display control unit 32 determines whether the interface detail screen 107 of device i indicated by the acquired identifier is already displayed (step S5).
 An example of the determination method of step S5 will be described with reference to FIG. 5B. FIG. 5B shows an example of the table that the detail screen display control unit 32 refers to when determining whether the interface detail screen 107 of device i is already displayed. In this figure, if the value of the "displayed flag" column is "0", the interface detail screen 107 of the device whose identifier is in the "TAG" column of that record is not displayed in the operable area; if the value is "1", the interface image of that device is already displayed in the operable area.
 Returning to the processing flow of FIG. 4: when the interface detail screen 107 of device i is already displayed (step S5 = Yes), the detail screen display control unit 32 does not display it again. The interface detail screen 107 of device i may already be displayed, for example, when the avatar 101 has approached device i and is moving back and forth in its vicinity (within a range not exceeding the second threshold described later).
 When the interface image of device i has not yet been displayed (step S5 = No), the detail screen display control unit 32 reads from the storage unit 40 the image data representing the interface detail screen 107 of the operation device or indicator corresponding to the identifier of device i, and displays it in the operable area of the display unit 10 (step S6). For example, the detail screen display control unit 32 looks up the table of FIG. 5A using the identifier ("TAG"), acquires the value of the "interface image data" column of the corresponding record, identifies the image data indicated by this value, and generates the interface detail screen 107 of device i based on the identified image data. When the detail screen display control unit 32 completes these processes, it outputs a signal indicating that processing is complete to the positional relationship calculation unit 31.
 Next, the positional relationship calculation unit 31 determines whether the calculation of the positional relationship (distance) has been completed for all the operation devices and indicators acquired in step S1 (step S10). If the calculation has been completed for all devices (step S10 = Yes), this processing flow ends. Otherwise (step S10 = No), the processing from step S3 is repeated for the next device.
 The case in which the distance between the position of the avatar 101 and the position of device i is smaller than the first threshold in step S4 has been described above. The case in which, for the next device i+1 to be processed in this flow, the distance between the avatar 101 and device i+1 is equal to or greater than the first threshold (step S4 = No) is described below. In this case, the detail screen display control unit 32 does not display the interface detail screen 107 of device i+1.
 The positional relationship calculation unit 31 compares the distance i+1, which is the distance between the avatar 101 and device i+1, with the second threshold (step S7). The second threshold is the value used to determine whether or not to hide the interface detail screen 107 when the avatar 101 moves away from a device.
 When the distance i+1 is smaller than the second threshold (step S7 = No), the processing for device i+1 is complete (nothing in particular is done for device i+1), and the determination of step S10 is performed.
 When the distance i+1 is equal to or greater than the second threshold (step S7 = Yes), the positional relationship calculation unit 31 outputs information indicating the identifier of device i+1 to the detail screen display control unit 32.
 Next, the detail screen display control unit 32 determines whether the interface detail screen 107 of device i+1 indicated by the acquired identifier is already displayed in the operable area (step S8). The determination method may be the same as in step S5. When the interface image of device i+1 is not displayed (step S8 = No), the detail screen display control unit 32 notifies the positional relationship calculation unit 31 of the completion of the processing. When the interface image of device i+1 is displayed (step S8 = Yes), the detail screen display control unit 32 performs processing to hide the interface detail screen 107 of the operation device or indicator corresponding to the identifier of device i+1 displayed in the operable area (step S9). When the detail screen display control unit 32 completes these processes, it outputs a signal indicating that processing is complete to the positional relationship calculation unit 31.
 The positional relationship calculation unit 31 then performs the determination of step S10. When the control of displaying/hiding the interface images has been completed for all the devices as described above, this processing flow ends.
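Putting steps S1 to S10 together, the show/hide loop over all devices can be sketched as below. The threshold values, the `displayed` flag dictionary, and the `show`/`hide` callbacks are assumptions introduced for the example; only the two-threshold logic itself follows the flow described above.

```python
import math

FIRST_THRESHOLD = 3.0    # assumed: show the detail screen when closer than this
SECOND_THRESHOLD = 5.0   # assumed: hide the detail screen when at least this far away

def distance(a, b):
    return math.sqrt(sum((q - p) ** 2 for p, q in zip(a, b)))

def update_detail_screens(avatar_pos, devices, displayed, show, hide):
    """One pass of steps S3-S10 over every device acquired in step S1.

    devices:   dict mapping TAG -> (x, y, z) position in the plant
    displayed: dict mapping TAG -> bool, i.e. the "displayed flag" of FIG. 5B
    show/hide: callbacks that display or hide the interface detail screen of a TAG
    """
    for tag, pos in devices.items():
        d = distance(avatar_pos, pos)                 # step S3
        if d < FIRST_THRESHOLD:                       # step S4 = Yes
            if not displayed.get(tag, False):         # step S5
                show(tag)                             # step S6
                displayed[tag] = True
        elif d >= SECOND_THRESHOLD:                   # step S7 = Yes
            if displayed.get(tag, False):             # step S8
                hide(tag)                             # step S9
                displayed[tag] = False
        # FIRST_THRESHOLD <= d < SECOND_THRESHOLD: leave the current state unchanged
```

The gap between the two thresholds gives the show/hide behavior a hysteresis, so a screen that is already open does not flicker when the avatar moves back and forth near the boundary.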
<Second Embodiment>
 The operation of a plant operation training apparatus according to the second embodiment of the present invention will be described below with reference to FIGS. 6 to 8.
 The second embodiment differs from the first embodiment in that, before the interface detail screens 107 of devices are displayed in the operable area 104, the user selects which devices to display in the operable area 104.
 FIG. 6 is a diagram showing an operation screen of the plant operation training apparatus 1 according to this embodiment.
 The method by which the detail screen display selection unit 34 provides the user with a means for selecting interface detail screens 107 will be described with reference to FIG. 6.
 When the virtual space generation unit 30 moves the avatar to the place 100 based on the signal from the operation means 20, the detail screen display selection unit 34 displays in the selection area 105 the interface detail screens 107 of the devices located within the first threshold distance from the avatar 101. FIG. 6 is a 3D image at the time when the avatar 101 is at the place 100 in the plant. In FIG. 6, it is assumed that the high-pressure feed water valve rear valve 103 and the high-pressure feed water flow meter 102 are located within a distance smaller than the first threshold from the avatar 101. The detail screen display selection unit 34 then displays the interface detail screen 107 of the high-pressure feed water flow meter 102 and the interface detail screen 107 of the high-pressure feed water valve rear valve 103 stacked in the selection area 105 of the display unit 10. The detail screen display selection unit 34 also displays, for example, a red frame 111 around the outer frame of the frontmost interface detail screen 107 to indicate that it is the target of the user's operation.
 When the user performs an operation to select a device in this state, the detail screen display control unit 32 moves that device's interface detail screen 107 to the operable area and displays it there. When the user performs an operation not to select the device, the detail screen display selection unit 34 removes that interface detail screen 107 from the selection area and brings the other interface detail screen 107 that was stacked beneath it to the front. Here, the predetermined operation may be, for example, pressing a predetermined button on the game controller or clicking with the mouse.
 Likewise, when the user performs another predetermined operation indicating that the device is not to be selected, the detail screen display selection unit 34 removes that interface detail screen 107 from the selection area 105 and displays the next interface detail screen 107 at the front of the selection area 105.
 The result of the user's selection is stored in the storage unit 40 by the detail screen display selection unit 34.
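A minimal sketch of this selection flow, treating the selection area as a stack of candidate screens, is shown below. The class and method names are assumptions chosen for the example.

```python
from dataclasses import dataclass, field

@dataclass
class SelectionArea:
    # Candidate interface detail screens stacked front-to-back (index 0 is frontmost).
    candidates: list = field(default_factory=list)  # device TAGs awaiting a choice
    operable: list = field(default_factory=list)    # TAGs moved to the operable area
    dismissed: list = field(default_factory=list)   # TAGs the user chose not to display

    def choose_front(self, accept: bool) -> None:
        """Accept or dismiss the frontmost candidate, then reveal the next one."""
        if not self.candidates:
            return
        tag = self.candidates.pop(0)
        if accept:
            self.operable.append(tag)    # detail screen moves to the operable area
        else:
            self.dismissed.append(tag)   # screen is removed (or sent to the non-operable area)
        # The selection result would also be recorded in storage unit 40 so that
        # the same choice is reused when the avatar returns to this place.
```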
 By not displaying devices that the user does not need in the operable area 104 in this way, the number of interface detail screens 107 displayed in the operable area 104 can be reduced. This reduces the effort the user needs to select, from among the interface detail screens 107 displayed in the operable area 104, the device to be operated.
 Also, by storing the selections once made in the storage unit 40, only the interface detail screens 107 of the necessary devices can be displayed, without having to select them again, when the avatar 101 moves to the same place again.
 In addition, since the detail screen display selection unit 34 displays the interface detail screens 107 stacked one by one in the selection area 105 and displays the image of the next device at the same position each time a selection is completed, the user does not need to pick out the device to be selected with a mouse or the like.
 Note that the apparatus may operate so that, when the user wants to make the selection again, pressing a predetermined operation key causes the detail screen display selection unit 34 to display the interface detail screens 107 in the selection area 105.
 In the description above, when the user performs an operation not to select a device, its interface detail screen 107 is hidden. Instead of hiding it, however, the interface detail screen 107 may be moved to the non-operable area 106 and displayed stacked there. In this case, by referring to the interface detail screens 107 displayed in the non-operable area 106, the user can see what other devices are available. Furthermore, the apparatus may allow selection operations in the non-operable area 106, so that by selecting again a device once judged unnecessary, the user can have its interface detail screen 107 displayed in the operable area 104.
 FIG. 7 is a diagram showing the processing flow of the display apparatus according to this embodiment.
 The processing described with FIG. 6 will be described in detail using the processing flow of FIG. 7.
 As a premise, it is assumed that the avatar 101 has moved according to the user's instructions and is at the place 100 in the plant.
 Then, as described in the flow of FIG. 4, the positional relationship calculation unit 31 acquires the position information of the avatar 101 and the position information of the equipment from memory, calculates the distances between them, and identifies the devices whose distance from the avatar 101 is smaller than the first threshold. The positional relationship calculation unit 31 then outputs information indicating the identifiers of these devices to the detail screen display selection unit 34. Next, the detail screen display selection unit 34 reads the interface detail screens from the storage unit 40 using the acquired device identifiers and displays them stacked in the selection area 105 (step S11). The stacking order may be, for example, the order of the identifiers, or devices closer to the avatar 101 may be displayed nearer the front.
Step S11 will now be described in detail. In step S11, the detailed screen display selection unit 34 uses, for example, the table shown in FIG. 8. The "operable flag" column of this table holds, for the device having each identifier, a value indicating whether "display" or "do not display" has been set for the operable area, or that neither has yet been set. A value of "0" in the "operable flag" column indicates that nothing has been set for the device. A value of "1" indicates that "display in the operable area" has been set for the device. A value of "2" indicates that "do not display in the operable area" has been set for the device.
In step S11, the detailed screen display selection unit 34 reads this table, selects the devices whose "operable flag" value is "0", and displays interface images in the selection area only for those devices.
For devices whose flag value is "1", the detailed screen display selection unit 34 outputs their identifier information to the detailed screen display control unit 32, and the detailed screen display control unit 32 displays them in the operable area. Devices whose flag value is "2" are not displayed.
In the example of FIG. 8, the interface image of the device indicated by TAG "A-101" is displayed in the selection area by the detailed screen display selection unit 34. The interface image of the device indicated by TAG "B-101" is displayed in the operable area by the detailed screen display control unit 32. The interface image of the device indicated by TAG "C-102" is not displayed anywhere, because it has already been set not to be displayed.
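As a hedged illustration of the flag semantics described for FIG. 8, the following sketch routes each TAG according to its operable flag. The dictionary layout and the function name are assumptions introduced for the example; the description above only specifies the meaning of the values 0, 1, and 2.

```python
# Assumed in-memory form of the FIG. 8 table: TAG -> operable flag (0, 1, or 2).
operable_flags = {"A-101": 0, "B-101": 1, "C-102": 2}

def route_interface_screens(flags):
    """Split device TAGs by flag: 0 -> selection area, 1 -> operable area, 2 -> hidden."""
    to_selection_area, to_operable_area = [], []
    for tag, flag in flags.items():
        if flag == 0:          # not yet chosen by the user
            to_selection_area.append(tag)
        elif flag == 1:        # user chose "display in the operable area"
            to_operable_area.append(tag)
        # flag == 2: "do not display" -> shown nowhere
    return to_selection_area, to_operable_area

selection, operable = route_interface_screens(operable_flags)
print(selection)  # ['A-101']
print(operable)   # ['B-101']
```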
Next, the detailed screen display selection unit 34 performs control so that the interface detail screen 107 displayed at the front becomes the target of the user's selection operation. Specifically, the detailed screen display selection unit 34 places the focus on that interface image to make it active (step S12). The detailed screen display selection unit 34 may also display a red frame around the outer periphery of the interface detail screen 107 so that the user can recognize it easily.
Next, the detailed screen display selection unit 34 detects a signal from the operation means 20 indicating the user's selection operation and determines whether the user has chosen to display the interface image of device i in the operable area (step S13).
When the detailed screen display selection unit 34 acquires a signal indicating that device i is selected, it outputs the identifier of device i to the detailed screen display control unit 32. Based on the acquired identifier, the detailed screen display control unit 32 then moves the interface detail screen 107 of device i from the selection area 105 to the operable area 104 and displays it there (step S14). The detailed screen display selection unit 34 then records in the storage unit 40 that "display" has been selected for device i in the operable area 104. In the example of FIG. 8, for instance, "1" is set in the "operable flag" column of the record having the TAG value of device i.
When the detailed screen display selection unit 34 acquires a signal indicating that device i is not selected, it hides the interface image of device i (step S15). The detailed screen display selection unit 34 then records in the storage unit 40 that "do not display" has been selected for device i in the operable area. In the example of FIG. 8, for instance, "2" is set in the "operable flag" column of the record having the TAG value of device i.
Next, the detailed screen display selection unit 34 determines whether the user's display / do-not-display selection has been completed for all the devices displayed in the selection area in step S11 (step S16). If the selection has been completed for all the devices (step S16 = Yes), this processing flow ends. Otherwise (step S16 = No), the processing from step S12 is repeated for the next interface detail screen 107.
When the processing has been completed for all the devices, this processing flow ends.
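The selection loop of steps S12 to S16 can be summarized, under the same assumptions, by the following sketch; `ask_user` is a hypothetical stand-in for the selection signal received from the operation means 20, and the flag table is the one illustrated above.

```python
def run_selection_loop(tags_in_selection_area, ask_user, flags):
    """Walk through the stacked interface detail screens one by one (steps S12-S16).

    ask_user(tag) returns True when the user chooses to show the screen
    in the operable area, False otherwise.
    """
    for tag in tags_in_selection_area:
        # Step S12: the front-most screen becomes the active selection target
        # (represented here simply by the iteration order).
        if ask_user(tag):      # Steps S13/S14: move to the operable area, record "display"
            flags[tag] = 1
        else:                  # Step S15: hide the image, record "do not display"
            flags[tag] = 2
    # Step S16: the loop ends once every screen has been answered.
    return flags
```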
<Third embodiment>
The operation of the plant operation training apparatus according to the third embodiment of the present invention will be described below with reference to FIG. 9.
The third embodiment differs from the second embodiment in that the user sets in advance the types of devices to be displayed in the selection area 105.
FIG. 9 is a diagram showing a setting screen of the plant operation training apparatus 1 according to the present embodiment.
FIG. 9 shows an example of a setting screen on which a user acting as the instructor for plant operation training can set the devices whose interface detail screens 107 are displayed in the selection area 105.
Reference numeral 110 indicates the setting screen. Reference numeral 112 indicates a group of check boxes with which display or non-display in the selection area 105 can be set for each device. The device "operator 1" corresponding to the check box 112a checked by the user is a device whose screen the detailed screen display selection unit 34 displays in the selection area. The device "display 3" corresponding to the check box 112b not checked by the user is a device that the detailed screen display selection unit 34 does not display in the selection area.
Note that even if "operator 1" is checked, the detailed screen display selection unit 34 does not display the interface detail screen 107 of "operator 1" in the selection area 105 once the user has already set whether or not to display "operator 1" in the operable area 104.
Reference numeral 108 indicates an OK button. When the user presses the OK button 108 after checking the check boxes 112, the detailed screen selection setting unit 35 stores the contents set by the user on this screen in the storage unit 40, for example in the table shown in FIG. 8, and then closes the setting screen 110.
An example of the operation in which the detailed screen selection setting unit 35 stores the setting values will be described using the table of FIG. 8.
For a device newly checked by the user, values may be set as follows. If the "operable flag" value for the device was originally "0" or "1", that value is kept as it is. If the value was "2", the detailed screen selection setting unit 35 sets it to "0". An original value of "2" in the "operable flag" column means that the device had been set not to be displayed in the operable area; however, when the user checks the "display" check box again for such a device on the setting screen 110, it is considered appropriate to give the user another opportunity to choose whether or not to display it in the operable area.
When the user removes the check from a device that had been checked, the detailed screen selection setting unit 35 sets "2" in the "operable flag" column. With "2" set, the interface image of that device is displayed neither in the selection area nor in the operable area.
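For illustration, the update rules applied when the OK button 108 is pressed might look like the following sketch. `checked_tags`, `all_tags`, and the function name are assumptions; only the flag transitions described above are taken from the text.

```python
def apply_setting_screen(checked_tags, all_tags, flags):
    """Reflect the FIG. 9 check boxes in the operable-flag table when OK is pressed."""
    for tag in all_tags:
        if tag in checked_tags:
            # Checked device: keep an existing 0 or 1, but reset a previous "2"
            # to 0 so the user gets another chance to choose in the selection area.
            if flags.get(tag, 0) == 2:
                flags[tag] = 0
        else:
            # Unchecked device: never show its interface image anywhere.
            flags[tag] = 2
    return flags
```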
Reference numeral 109 indicates a cancel button. When the user presses the cancel button 109, the detailed screen selection setting unit 35 closes the setting screen 110 without updating the table in the storage unit 40.
Providing such setting means reduces the number of interface images displayed in the selection area and spares the user undergoing operation training the trouble of selecting the necessary devices.
This function can be used, for example, by an operation training instructor to choose the devices to be displayed according to the training content of the user being trained. For example, when the user being trained is a beginner, it is desirable to display only the interface images of the minimum necessary devices, and the settings of this embodiment make that possible. The display of devices that are rarely used in normal operation can also be restricted with this function.
Note that this embodiment can also be used in combination with the first embodiment. In that case, control is performed, for example, as follows. The detailed screen display control unit 32 refers to the table in which the detailed screen selection setting unit 35 has recorded whether or not each device is to be displayed, and when the distance from the avatar becomes smaller than the first threshold, it displays only the images of the devices for which "display" is set in this table. Combined with the first embodiment, the interface images of the necessary devices can be shown to a user who lacks the knowledge to select them from the selection area.
As described above, the present embodiment not only spares the user undergoing operation training the trouble of selecting the necessary devices but also helps provide plant operation training opportunities to users of various levels.
<Fourth embodiment>
The operation of the plant operation training apparatus according to the fourth embodiment of the present invention will be described below with reference to FIGS. 10 and 11.
The fourth embodiment differs from the first to third embodiments in that the part where the device being operated by the user is provided is displayed in a distinctive manner, for example by blinking.
FIG. 10 is a diagram showing an operation screen of the plant operation training apparatus 1 according to the present embodiment.
The operation in which the corresponding part display control unit 33 displays the part provided with the device corresponding to the interface image selected by the user in a manner different from the other devices will be described with reference to FIG. 10.
FIG. 10 is a 3D image showing that the interface detail screen 107 of the high-pressure feedwater valve rear valve 103 is displayed in the operable area 104 when the avatar 101 has moved to the location 100 in the plant by the user's operation. The high-pressure feedwater valve rear valve 103 is provided within a distance from the avatar 101 smaller than the first threshold, and reference numeral 113 indicates the part where the high-pressure feedwater valve rear valve 103 is provided.
Here, when the user selects the interface detail screen 107 in the operable area 104 with the operation means 20, the corresponding part display control unit 33 displays a red frame around the outer periphery of the interface detail screen 107. The corresponding part display control unit 33 then makes the part 113 where the high-pressure feedwater valve rear valve 103 is provided easy for the user to notice, for example by making it blink.
Also, when the avatar approaches within a predetermined distance of a large piece of equipment having many instruments, such as a turbine, the corresponding part display control unit 33 may first display that large equipment conspicuously, for example in a color different from before. Then, when the user selects a device in the operable area 104, the corresponding part display control unit 33 may display the part where that device is provided by making it blink.
According to this embodiment, the user can know where the device being operated is provided.
With this operation, the user can learn the operating procedure while grasping where the devices are installed, so a training effect as high as that of on-site training using an actual plant can be expected.
FIG. 11 is a diagram showing a processing flow of the display device according to the present embodiment.
The processing described with reference to FIG. 10 will now be described in detail using the processing flow of FIG. 11.
As a premise, it is assumed that the avatar 101 has moved in accordance with the user's instruction and is present at the location 100 in the plant. It is also assumed that the interface detail screens 107 of the devices within a distance from the avatar 101 smaller than the first threshold are displayed in the operable area 104.
First, the user operates the operation means 20 to select one of the interface detail screens 107 displayed in the operable area 104. The corresponding part display control unit 33 then activates that interface detail screen 107 and performs control to display a red frame around its outer periphery (step S31).
Next, the corresponding part display control unit 33 makes the 3D image of the device corresponding to the selected interface detail screen 107 blink. Specifically, the corresponding part display control unit 33 uses the identifier of the device corresponding to the selected interface detail screen 107 to identify the program that generates the 3D image of that device. The corresponding part display control unit 33 then instructs the identified program to change the color of the part or make it blink (step S32). For example, an RGB value indicating the new color may be passed to the program that generates the 3D image so that the device blinks or changes color. A conspicuous color is preferably used for the change or blinking so that it easily catches the user's eye.
Next, the corresponding part display control unit 33 detects whether another interface detail screen displayed in the operable area has been selected by the user's selection operation (step S33). When another interface screen is selected, the processing from step S31 is repeated. For the device that is no longer selected, the program that generates its 3D image is instructed to return it to its original display mode.
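A minimal sketch of the highlighting logic of steps S31 to S33 follows. The `renderers` mapping and its `set_blink` and `reset_appearance` methods are hypothetical stand-ins for the instructions sent to the program that generates each device's 3D image.

```python
def highlight_selected_device(selected_tag, previously_selected_tag, renderers):
    """Steps S31-S33: emphasize the selected device's 3D image and restore the previous one."""
    if previously_selected_tag and previously_selected_tag != selected_tag:
        # The previously highlighted device returns to its original display mode.
        renderers[previously_selected_tag].reset_appearance()
    # The newly selected device blinks in a conspicuous color (red as an example).
    renderers[selected_tag].set_blink(color=(255, 0, 0))
    return selected_tag  # remembered so it can be reverted on the next selection
```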
<Fifth embodiment>
The operation of the plant operation training apparatus according to the fifth embodiment of the present invention will be described below with reference to FIGS. 12A to 13.
FIG. 12A is a first diagram showing parameter values displayed on the interface detail screens of the plant operation training apparatus 1 according to the present embodiment. FIG. 12B is a second diagram showing parameter values displayed on the interface detail screens of the plant operation training apparatus 1 according to the present embodiment. FIGS. 12A and 12B show, for the case where the avatar 101 is at the location 100, the list of devices provided at a distance from the avatar 101 smaller than the first threshold and the parameter values that the storage unit 40 holds for display on the interface detail screen of each device.
FIG. 12A shows the parameter values of each device before its interface detail screen 107 is displayed in the operable area. As the figure shows, at this point the storage unit 40 does not yet hold the parameter values to be displayed on the interface of each device.
FIG. 12B shows the parameter values of each device when its interface detail screen 107 is displayed in the operable area. As the figure shows, the storage unit 40 now holds the parameter values acquired from the plant simulation apparatus 2.
If the plant operation training apparatus 1 constantly communicated to acquire all parameter values, the processing capacity of its CPU would be consumed by the communication processing and the computational load would increase. As a result, the generation of 3D images for operation training would be affected and the 3D images could no longer be displayed smoothly. According to the present embodiment, only the necessary parameter values are extracted and communicated, which lowers the communication load and reduces the computational load of the plant operation training apparatus 1 that displays the 3D virtual plant image. The 3D images can thereby be displayed smoothly.
The plant operation training system consisting of the plant operation training apparatus 1 and the plant simulation apparatus 2 (and the training simulator apparatus of the central control room) is designed on the assumption that it is used by multiple users at the same time. For example, user A trains in the operating procedure of device A using plant operation training apparatus 1A while, at the same time, another user B trains in the operating procedure of device B using plant operation training apparatus 1B. The plant operation training apparatuses 1A and 1B are both connected to the plant simulation apparatus 2 via a network. When user A performs operation A on device A while, in parallel, user B performs operation B on device B, the plant simulation apparatus 2 calculates the influence of operations A and B on the state of the plant and transmits the results to the plant operation training apparatuses 1A and 1B. In this way, the system provides an operation training function in which a single plant is operated at multiple locations at the same time.
If, in such a scene where multiple users train, the parameter values of all the devices in the plant were communicated between each plant operation training apparatus 1 and the plant simulation apparatus 2 every time the plant state changed, this alone would consume the network bandwidth of the system. According to the present embodiment, the network load of the entire system is reduced by limiting the communication to the minimum necessary parameter values.
FIG. 13 is a first diagram showing a processing flow of the display device according to the present embodiment.
The processing described with reference to FIGS. 12A and 12B will now be described using the processing flow of FIG. 13.
As a premise, it is assumed that the avatar 101 has moved in accordance with the user's instruction and is present at the location 100 in the plant. This processing flow is executed immediately before the detailed screen display control unit 32 displays, in the operable area 104, the interface detail screens of the devices within a distance from the avatar 101 smaller than the first threshold.
First, the detailed screen display control unit 32 acquires the identifier (TAG) of one of the devices whose interface detail screens 107 are to be displayed in the operable area 104, in the same manner as in step S4 of FIG. 4, and outputs the TAG value to the parameter acquisition unit 36 (step S41). Next, the parameter acquisition unit 36 requests the latest parameter value for that TAG from the plant simulation apparatus 2 via the communication unit 50 (step S42).
Here, the plant simulation apparatus 2 is assumed to calculate the plant state quantities at each moment and hold them for each piece of TAG information. The plant simulation apparatus 2 transmits the latest parameter value in response to the request from the parameter acquisition unit 36.
The parameter acquisition unit 36 acquires, via the communication unit 50, the parameter value of the device corresponding to the requested TAG from the plant simulation apparatus 2 (step S43). The parameter acquisition unit 36 then outputs the acquired parameter value to the detailed screen display control unit 32. Next, the detailed screen display control unit 32 outputs the interface detail screen 107 including the acquired parameter value to the operable area (step S44). The detailed screen display control unit 32 determines whether the above processing has been performed for all the devices to be displayed in the operable area 104 (step S45), and repeats the processing of steps S41 to S44 until it has been performed for all of them.
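The per-device parameter refresh of steps S41 to S45 can be sketched as a simple loop. `request_latest_value` and `show_screen` are hypothetical stand-ins for the exchange with the plant simulation apparatus 2 via the communication unit 50 and for drawing the screen in the operable area 104; they are not part of the described apparatus.

```python
def refresh_parameters(tags_to_display, request_latest_value, show_screen):
    """Steps S41-S45: fetch only the parameters of the devices about to be shown."""
    for tag in tags_to_display:             # S41: take one target device (TAG) at a time
        value = request_latest_value(tag)   # S42/S43: request and receive its latest value
        show_screen(tag, value)             # S44: display the interface detail screen with the value
    # S45: the loop ends once every device to be displayed has been handled
```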
The plant operation training apparatus 1 described above has a computer inside. Each process of the plant operation training apparatus 1 described above is stored in the form of a program on a computer-readable recording medium, and the above processing is performed by the computer reading and executing this program. Here, the computer-readable recording medium means a magnetic disk, a magneto-optical disk, a CD-ROM, a DVD-ROM, a semiconductor memory, or the like. The computer program may also be distributed to the computer via a communication line, and the computer receiving the distribution may execute the program.
Further, the program may be one for realizing part of the functions described above. Furthermore, it may be a so-called difference file (difference program) that realizes the functions described above in combination with a program already recorded in the computer system.
In addition, the constituent elements in the above-described embodiments may be replaced with well-known constituent elements as appropriate without departing from the spirit of the present invention. The technical scope of the present invention is not limited to the above embodiments, and various modifications may be made without departing from the spirit of the present invention. For example, a 2D virtual space may be used instead of the 3D virtual space.
According to the plant operation training apparatus, control method, program, and plant operation training system described above, operation training closer to actual operation can be performed in the 3D virtual space while grasping the positional relationship with the devices provided in the plant, so a higher training effect can be obtained.
DESCRIPTION OF SYMBOLS
1   Plant operation training apparatus
2   Plant simulation apparatus
10  Display unit
20  Imaging unit
30  Virtual space generation unit
31  Positional relationship calculation unit
32  Detailed screen display control unit
33  Corresponding part display control unit
34  Detailed screen display selection unit
35  Detailed screen selection setting unit
36  Parameter acquisition unit
40  Storage unit
50  Communication unit
100 Location
101 Avatar
102 High-pressure feedwater flow meter
103 High-pressure feedwater valve rear valve
104 Operable area
105 Selection area
106 Non-operable area
107 Interface detail screen
108 OK button
109 Cancel button
110 Setting screen
111 Red frame
112 Check box
113 Part

Claims (12)

1.  A plant operation training apparatus comprising:
     an operation means for acquiring human model data indicating a person in a virtual space and operating the motion of an avatar displayed in the virtual space on the basis of the human model data;
     a positional relationship calculation unit for acquiring plant model data indicating a plant in the virtual space and calculating the positional relationship between the avatar and devices included in the plant displayed in the virtual space on the basis of the plant model data; and
     a detailed screen display control unit for acquiring interface detail screen model data indicating an interface detail screen in the virtual space and displaying, on the basis of the interface detail screen model data, the interface detail screens of the devices that satisfy a predetermined condition with respect to the positional relationship.
2.  The plant operation training apparatus according to claim 1, wherein the predetermined condition is that the distance between the position where a device is provided and the position where the avatar is present is smaller than a first threshold.
3.  The plant operation training apparatus according to claim 1 or 2, wherein the detailed screen display control unit hides the interface detail screen when the distance between the position where the device is provided and the position where the avatar is present becomes equal to or larger than a second threshold.
4.  The plant operation training apparatus according to any one of claims 1 to 3, wherein the predetermined condition is that the position where the avatar is present is within the space in which the devices are provided.
5.  The plant operation training apparatus according to any one of claims 1 to 4, wherein the detailed screen display control unit hides the interface detail screen when the position where the avatar is present moves outside the space in which the devices are provided.
6.  The plant operation training apparatus according to any one of claims 1 to 5, further comprising a detailed screen display selection unit that provides a means for selecting, from among the devices satisfying the predetermined condition, a device whose interface detail screen is to be displayed,
     wherein the detailed screen display control unit displays the interface detail screen on the basis of the user's selection operation on the detailed screen display selection unit.
7.  The plant operation training apparatus according to claim 6, further comprising a detailed screen selection setting unit that sets the selectable device options provided by the detailed screen display selection unit,
     wherein the detailed screen display selection unit provides the means for selecting a device on the basis of the settings.
8.  The plant operation training apparatus according to any one of claims 1 to 7, further comprising a corresponding part display control unit that displays the part of the plant in which a device whose interface detail screen is displayed is provided, in a manner different from the other devices.
9.  The plant operation training apparatus according to any one of claims 1 to 8, further comprising a parameter acquisition unit that requests the values of the parameters to be displayed on the interface detail screen when the detailed screen display control unit starts displaying the interface detail screen.
10.  A method for controlling a plant operation training apparatus, the method comprising:
     acquiring human model data indicating a person in a virtual space and operating the motion of an avatar displayed in the virtual space on the basis of the human model data;
     acquiring plant model data indicating a plant in the virtual space and calculating the positional relationship between the avatar and devices included in the plant displayed in the virtual space on the basis of the plant model data; and
     acquiring interface detail screen model data indicating an interface in the virtual space and displaying, on the basis of the interface detail screen model data, the interface detail screens of the devices that satisfy a predetermined condition with respect to the positional relationship.
11.  A program for causing a computer of a plant operation training apparatus to function as:
     a means for acquiring human model data indicating a person in a virtual space and operating the motion of an avatar displayed in the virtual space on the basis of the human model data;
     a means for acquiring plant model data indicating a plant in the virtual space and calculating the positional relationship between the avatar and devices included in the plant displayed in the virtual space on the basis of the plant model data; and
     a means for acquiring interface detail screen model data indicating an interface in the virtual space and displaying, on the basis of the interface detail screen model data, the interface detail screens of the devices that satisfy a predetermined condition with respect to the positional relationship.
12.  A plant operation training system comprising:
     a plant operation training apparatus including:
       an operation means for acquiring human model data indicating a person in a virtual space and operating the motion of an avatar displayed in the virtual space on the basis of the human model data;
       a positional relationship calculation unit for acquiring plant model data indicating a plant in the virtual space and calculating the positional relationship between the avatar and devices included in the plant displayed in the virtual space on the basis of the plant model data;
       a detailed screen display control unit for acquiring interface detail screen model data indicating an interface in the virtual space and displaying, on the basis of the interface detail screen model data, the interface detail screens of the devices that satisfy a predetermined condition with respect to the positional relationship; and
       a parameter acquisition unit that requests the values of the parameters to be displayed on the interface when the detailed screen display control unit starts displaying the interface detail screen; and
     a plant simulation apparatus that calculates parameter values indicating the states of the devices provided in the plant and transmits the calculated parameter values in response to the request.
PCT/JP2014/076814 2013-10-11 2014-10-07 Plant operation training apparatus, control method, program, and plant operation training system WO2015053266A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013213698A JP6159217B2 (en) 2013-10-11 2013-10-11 Plant operation training apparatus, control method, program, and plant operation training system
JP2013-213698 2013-10-11

Publications (1)

Publication Number Publication Date
WO2015053266A1 true WO2015053266A1 (en) 2015-04-16

Family

ID=52813082

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/076814 WO2015053266A1 (en) 2013-10-11 2014-10-07 Plant operation training apparatus, control method, program, and plant operation training system

Country Status (2)

Country Link
JP (1) JP6159217B2 (en)
WO (1) WO2015053266A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180132132A (en) 2016-05-23 2018-12-11 미츠비시 히타치 파워 시스템즈 가부시키가이샤 A three-dimensional data display device, a three-dimensional data display method, and a computer readable recording medium recording a program
US10600446B2 (en) 2016-02-29 2020-03-24 Mitsubishi Hitachi Power Systems, Ltd. Video reproducing device, video reproducing method, and program
US10748443B2 (en) 2017-06-08 2020-08-18 Honeywell International Inc. Apparatus and method for visual-assisted training, collaboration, and monitoring in augmented/virtual reality in industrial automation systems and other systems
CN112002172A (en) * 2020-09-11 2020-11-27 北京金控数据技术股份有限公司 Three-dimensional simulation system of sewage treatment plant
CN112400198A (en) * 2018-08-29 2021-02-23 松下知识产权经营株式会社 Display system, server, display method and device

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7014698B2 (en) * 2018-11-20 2022-02-01 株式会社日立システムズ Training material presentation system and training material presentation method
JP7012632B2 (en) * 2018-11-20 2022-01-28 株式会社日立システムズ Training material presentation system and training material presentation method
JP7317322B2 (en) * 2021-03-29 2023-07-31 グリー株式会社 Information processing system, information processing method and computer program
JP7016438B1 (en) 2021-03-29 2022-02-18 グリー株式会社 Information processing systems, information processing methods and computer programs
KR102581805B1 (en) * 2021-09-17 2023-09-25 이준석 Method and system for training plant operator by metaverse server

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000122520A (en) * 1998-10-14 2000-04-28 Mitsubishi Heavy Ind Ltd Virtual reality simulator and simulation method therefor
JP2002304112A (en) * 2001-04-06 2002-10-18 Mitsubishi Heavy Ind Ltd Method and device for plant design, and program
JP2005070161A (en) * 2003-08-20 2005-03-17 Mitsubishi Chemicals Corp Simulation system for training
JP2006072193A (en) * 2004-09-06 2006-03-16 Central Res Inst Of Electric Power Ind Training system and training method
JP2014077896A (en) * 2012-10-11 2014-05-01 Mitsubishi Electric Corp Operation training simulator of nuclear power plant

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HIROTAKE ISHII ET AL.: "Development of a VR- based Experienceable Education System for Operating Nuclear Power Plants", JOURNAL OF HUMAN INTERFACE SOCIETY, vol. 2, no. 4, 31 December 2000 (2000-12-31), pages 331 - 340, Retrieved from the Internet <URL:https://hydro.energy.kyoto-u.ac.jp/Lab/staff/hirotake/paper/papers/journal/HI00.pdf> [retrieved on 20141212] *
MICHIYA YAMAMOTO ET AL.: "A Simulation Method of Distributed Virtual Environment for Training of Maintenance Work in Large-scale Plants", TRANSACTIONS OF THE VIRTUAL REALITY SOCIETY OF JAPAN, vol. 5, no. 4, 31 December 2000 (2000-12-31), pages 1103 - 1112 *
MIWAKO DOI ET AL.: "Virtual Environment For Plant Engineering", IEICE TECHNICAL REPORT, vol. 95, no. 441, 15 December 1995 (1995-12-15), pages 29 - 35 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10600446B2 (en) 2016-02-29 2020-03-24 Mitsubishi Hitachi Power Systems, Ltd. Video reproducing device, video reproducing method, and program
KR20180132132A (en) 2016-05-23 2018-12-11 미츠비시 히타치 파워 시스템즈 가부시키가이샤 A three-dimensional data display device, a three-dimensional data display method, and a computer readable recording medium recording a program
DE112017002621T5 (en) 2016-05-23 2019-03-28 Mitsubishi Hitachi Power Systems, Ltd. Three-dimensional data display device, three-dimensional data display method and program
US10643387B2 (en) 2016-05-23 2020-05-05 Mitsubishi Hitachi Power Systems, Ltd. Three-dimensional data display device, three-dimensional data display method, and program
US10748443B2 (en) 2017-06-08 2020-08-18 Honeywell International Inc. Apparatus and method for visual-assisted training, collaboration, and monitoring in augmented/virtual reality in industrial automation systems and other systems
CN112400198A (en) * 2018-08-29 2021-02-23 松下知识产权经营株式会社 Display system, server, display method and device
CN112002172A (en) * 2020-09-11 2020-11-27 北京金控数据技术股份有限公司 Three-dimensional simulation system of sewage treatment plant
CN112002172B (en) * 2020-09-11 2022-11-29 北京金控数据技术股份有限公司 Three-dimensional simulation system of sewage treatment plant

Also Published As

Publication number Publication date
JP6159217B2 (en) 2017-07-05
JP2015075732A (en) 2015-04-20

Similar Documents

Publication Publication Date Title
JP6159217B2 (en) Plant operation training apparatus, control method, program, and plant operation training system
US11948239B2 (en) Time-dependent client inactivity indicia in a multi-user animation environment
Hilfert et al. Low-cost virtual reality environment for engineering and construction
EP3998596B1 (en) Augmented reality simulator for professional and educational training
JP4777182B2 (en) Mixed reality presentation apparatus, control method therefor, and program
US20120122062A1 (en) Reconfigurable platform management apparatus for virtual reality-based training simulator
CN107510506A (en) Utilize the surgical robot system and its control method of augmented reality
WO2009003169A2 (en) Display-based interactive simulation with dynamic panorama
KR20160020136A (en) Training system for treating disaster using virtual reality and role playing game
Bergroth et al. Use of immersive 3-D virtual reality environments in control room validations
CN108196669A (en) Modification method, device, processor and the head-mounted display apparatus of avatar model
de Farias Paiva et al. A collaborative and immersive VR simulator for education and assessment of surgical teams
CA2923191A1 (en) Patient simulation system adapted for interacting with a medical apparatus
JP7530754B2 (en) Educational support system, method and program
JP2020013035A (en) Training device, training system, training method and program
CN115485737A (en) Information processing apparatus, information processing method, and program
US11797093B2 (en) Integrating tactile nonvirtual controls in a virtual reality (VR) training simulator
CN102132294A (en) Method, system and computer program product for providing simulation with advance notification of events
Chen et al. WheelUp! Developing an Interactive Electric-power Wheelchair Virtual Training Environment
WO2023127403A1 (en) System for improving realistic sensations and program for improving realistic sensations in vr
US20240220006A1 (en) Virtual reality control system
US20240346736A1 (en) Information processing device, information processing method, and program
KR102515788B1 (en) A method for providing train program about industrial facility based on skill level
TW201113738A (en) Virtual pet developing system and method thereof
Mota et al. Mobile simulator for risk analysis

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14851627

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14851627

Country of ref document: EP

Kind code of ref document: A1