CN111679959A - Computer performance data determination method and device, computer equipment and storage medium - Google Patents
- Publication number
- CN111679959A (application CN202010328656.1A)
- Authority
- CN
- China
- Prior art keywords
- performance data
- computer
- sequence
- historical performance
- historical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3447—Performance evaluation by modeling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Bioinformatics & Computational Biology (AREA)
- Mathematical Physics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Evolutionary Biology (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Probability & Statistics with Applications (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Debugging And Monitoring (AREA)
Abstract
The invention relates to blockchain technology and provides a computer performance data determination method and related equipment. The method acquires a plurality of historical performance data sequences of a computer; pre-trains a long short-term memory (LSTM) network with the plurality of historical performance data sequences; classifies the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets in one-to-one correspondence with the N sequence types; trains a performance prediction model, composed of the pre-trained LSTM network and a fully connected layer behind it, with each historical performance data sequence subset to obtain N trained performance prediction models in one-to-one correspondence with the N sequence types; and predicts the performance data of the computer with the trained performance prediction model corresponding to the sequence type of the computer's performance data sequence to be predicted. Further, the present application relates to blockchain techniques, and the performance data may be stored in a blockchain.
Description
Technical Field
The invention relates to blockchain technology, and in particular to a computer performance data determination method and apparatus, computer equipment, and a computer-readable storage medium.
Background
For cloud computing platform service vendors, performance (e.g., capacity) prediction is an important AIOps (artificial intelligence for IT operations) application scenario. Conventional performance prediction methods have low accuracy for cloud computing platform service vendors and poor long-term prediction performance. When large-scale performance indexes need to be predicted, using one model per performance index means that the model parameters must be tuned separately for each index, and since the parameters differ greatly across indexes, the required labor cost is high. On the other hand, when the training data for a certain performance index is scarce, prediction for that index is prone to overfitting.
Disclosure of Invention
In view of the foregoing, there is a need for a computer performance data determination method, apparatus, computer device, and computer-readable storage medium that can predict computer performance from historical performance data of a computer.
A first aspect of the present application provides a computer performance data determination method, the method comprising:
acquiring a plurality of historical performance data sequences of a computer;
pre-training a long short-term memory (LSTM) network with the plurality of historical performance data sequences to obtain a pre-trained LSTM network;
classifying the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets in one-to-one correspondence with the N sequence types;
training a performance prediction model, composed of the pre-trained LSTM network and a fully connected layer located behind the pre-trained LSTM network, with each historical performance data sequence subset to obtain N trained performance prediction models in one-to-one correspondence with the N sequence types;
and predicting the performance data of the computer with the trained performance prediction model corresponding to the sequence type of the computer's performance data sequence to be predicted.
A second aspect of the present application provides a computer performance data determination apparatus, the apparatus comprising:
an acquisition module for acquiring a plurality of historical performance data sequences of a computer;
a first training module for pre-training a long short-term memory (LSTM) network with the plurality of historical performance data sequences to obtain a pre-trained LSTM network;
a classification module for classifying the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets in one-to-one correspondence with the N sequence types;
a second training module for training a performance prediction model, composed of the pre-trained LSTM network and a fully connected layer located behind the pre-trained LSTM network, with each historical performance data sequence subset to obtain N trained performance prediction models in one-to-one correspondence with the N sequence types;
and a prediction module for predicting the performance data of the computer with the trained performance prediction model corresponding to the sequence type of the computer's performance data sequence to be predicted.
A third aspect of the application provides a computer device comprising a processor for implementing the computer performance data determination method when executing a computer program stored in a memory.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the computer performance data determination method.
The method acquires a plurality of historical performance data sequences of a computer; pre-trains a long short-term memory (LSTM) network with the plurality of historical performance data sequences to obtain a pre-trained LSTM network; classifies the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets in one-to-one correspondence with the N sequence types; trains a performance prediction model, composed of the pre-trained LSTM network and a fully connected layer located behind it, with each historical performance data sequence subset to obtain N trained performance prediction models in one-to-one correspondence with the N sequence types; and predicts the performance data of the computer with the trained performance prediction model corresponding to the sequence type of the computer's performance data sequence to be predicted. The invention thus predicts computer performance from the historical performance data of the computer.
Drawings
FIG. 1 is a flow chart of a method for determining computer performance data according to an embodiment of the present invention.
Fig. 2 is a block diagram of a computer performance data determining apparatus according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a computer device provided by an embodiment of the present invention.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a detailed description of the present invention will be given below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention, and the described embodiments are merely a subset of the embodiments of the present invention, rather than a complete embodiment. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Preferably, the computer performance data determination method of the present invention is applied in one or more computer devices. A computer device is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), an embedded device, and the like.
The computer device can be a desktop computer, a notebook, a palm computer, a cloud server and other computing devices. The computer equipment can carry out man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch panel or voice control equipment and the like.
Example one
Fig. 1 is a flowchart of a method for determining computer performance data according to an embodiment of the present invention. The computer performance data determination method is applied to a computer device and is used for predicting computer performance data from historical performance data of a computer; the computer performance data may include computer CPU data, computer GPU data, computer memory data, or computer storage data.
As shown in fig. 1, the computer performance data determining method includes:
101, acquiring a plurality of historical performance data sequences of a computer.
In a specific embodiment, acquiring the plurality of historical performance data sequences of the computer includes:
acquiring a plurality of historical performance data master sequences of the computer, wherein each element in a historical performance data master sequence is a state vector of computer performance;
and cutting a plurality of historical performance data sequences out of each historical performance data master sequence with a sliding window of preset length h and a step size of 1.
The element that follows a historical performance data sequence in the historical performance data master sequence from which it was cut can be used as the label of that sequence. For example, the historical performance data master sequence of a computer is {X1, X2, …, Xi, …, Xn}, where Xi is the state vector of the CPU at time point i. With a sliding window of preset length h and a step size of 1, a plurality of historical performance data sequences are cut from each historical performance data master sequence, namely {X1, X2, …, Xh}, {X2, X3, …, Xh+1}, {X3, X4, …, Xh+2}, and so on; the labels of the historical performance data sequences {X1, X2, …, Xh}, {X2, X3, …, Xh+1}, and {X3, X4, …, Xh+2} are Xh+1, Xh+2, and Xh+3, respectively.
Each historical performance data sequence is a time series.
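The sliding-window slicing described above can be sketched in a few lines of Python (a minimal illustration, not code from the patent itself): every window of length h becomes one training sequence, and the element immediately after the window becomes its label.

```python
# Hypothetical sketch of the window/label construction described above.
def slice_master_sequence(master, h):
    """Cut (sequence, label) pairs from a master sequence with window length h, stride 1."""
    pairs = []
    for start in range(len(master) - h):
        window = master[start:start + h]   # the historical performance data sequence
        label = master[start + h]          # the next element serves as the label
        pairs.append((window, label))
    return pairs

master = [43, 80, 55, 60, 70, 90]  # toy one-dimensional "CPU state" master sequence
pairs = slice_master_sequence(master, h=3)
# first pair: ([43, 80, 55], 60)
```

In practice each element would be a state vector rather than a scalar, but the indexing is the same.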
102, pre-training the long short-term memory (LSTM) network with the plurality of historical performance data sequences to obtain the pre-trained LSTM network.
Specifically, the parameters of the LSTM network may be initialized; each historical performance data sequence may be input into the LSTM network; the distance between the output the LSTM network computes from the input and the label of the historical performance data sequence may be calculated; and the parameters of the LSTM network may be optimized according to that distance.
In another embodiment, before pre-training the LSTM network with the plurality of historical performance data sequences, the method further comprises:
preprocessing the plurality of historical performance data sequences.
Missing-value preprocessing, abnormal-data preprocessing, and the like can be performed on the plurality of historical performance data sequences. Missing-value preprocessing ensures that no missing values remain in the sequences, reducing their influence on training of the LSTM network. Abnormal-data preprocessing increases the convergence speed of LSTM training and reduces training time. When a historical performance data sequence has missing values (caused by data-source errors, software design errors, and the like), interpolation or replacement can be used for missing-value preprocessing. When abnormal data exist in a historical performance data sequence (that is, the sequence contains a value outside a preset range), abnormal data larger than the maximum of the preset range are set to that maximum, and abnormal data smaller than the minimum of the preset range are set to that minimum.
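The two preprocessing steps above — fill missing values by interpolation or replacement, then clamp out-of-range values to the preset range — can be sketched as follows. This is an illustrative assumption of how they might be implemented; the patent does not fix a concrete formula, and missing values are represented here as None.

```python
# Hedged sketch: missing-value handling followed by range clamping.
def preprocess(seq, lo, hi):
    # 1) missing-value preprocessing: interpolate between the nearest
    #    non-missing neighbours, or replace with the one available neighbour
    filled = list(seq)
    for i, v in enumerate(filled):
        if v is None:
            left = next((filled[j] for j in range(i - 1, -1, -1) if filled[j] is not None), None)
            right = next((filled[j] for j in range(i + 1, len(filled)) if filled[j] is not None), None)
            if left is not None and right is not None:
                filled[i] = (left + right) / 2   # interpolation
            else:
                filled[i] = left if left is not None else right  # replacement
    # 2) abnormal-data preprocessing: clamp to the preset range [lo, hi]
    return [min(max(v, lo), hi) for v in filled]

print(preprocess([10, None, 30, 120, -5], lo=0, hi=100))  # [10, 20.0, 30, 100, 0]
```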
103, classifying the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets in one-to-one correspondence with the N sequence types.
In a specific embodiment, classifying the plurality of historical performance data sequences includes:
clustering the plurality of historical performance data sequences and modifying the clustering result according to a received modification instruction; or
classifying the plurality of historical performance data sequences with a trained preset convolutional neural network model.
The plurality of historical performance data sequences in the historical performance data sequence set can be clustered with the DBSCAN clustering algorithm, where the distance function is based on one-dimensional normalized cross-correlation (NCC); when the clustering result is abnormal, historical performance data sequences whose distance from the cluster center exceeds a preset range are deleted from the clustering result according to a received deletion instruction.
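One plausible form of the NCC-based distance mentioned above is sketched below (an assumption for illustration — the patent does not give the exact formula): normalize both sequences, take the maximum cross-correlation over all lags, and use distance = 1 − max NCC, so identical sequences get distance near 0. This distance could then be passed as the metric of a DBSCAN implementation.

```python
import numpy as np

# Hypothetical one-dimensional NCC distance for sequence clustering.
def ncc_distance(a, b):
    a = (a - a.mean()) / (a.std() + 1e-12)   # zero-mean, unit-variance
    b = (b - b.mean()) / (b.std() + 1e-12)
    ncc = np.correlate(a, b, mode="full") / len(a)  # correlation over all lags
    return 1.0 - ncc.max()                   # 0 when perfectly correlated

x = np.array([1.0, 2.0, 3.0, 4.0])
print(round(ncc_distance(x, x), 6))  # ~0 for identical sequences
```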
Specifically, clustering the plurality of historical performance data sequences may include:
selecting N center-point sequences from the plurality of historical performance data sequences;
calculating the distance between each historical performance data sequence and the N center-point sequences to find the center-point sequence closest to that historical performance data sequence;
and assigning the historical performance data sequence to the cluster of its closest center-point sequence.
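The center-point assignment steps above can be sketched as follows (an illustrative assumption: the patent only says "distance", so Euclidean distance is used here):

```python
# Hypothetical sketch of assigning each sequence to its nearest center.
def assign_to_centers(sequences, centers):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5  # Euclidean
    clusters = {i: [] for i in range(len(centers))}
    for seq in sequences:
        nearest = min(range(len(centers)), key=lambda i: dist(seq, centers[i]))
        clusters[nearest].append(seq)  # join the cluster of the closest center
    return clusters

clusters = assign_to_centers([[1, 1], [9, 9], [2, 2]], centers=[[0, 0], [10, 10]])
# clusters[0] == [[1, 1], [2, 2]], clusters[1] == [[9, 9]]
```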
Alternatively, clustering the plurality of historical performance data sequences may include:
taking each historical performance data sequence as the center of its own cluster to obtain W clusters;
if W is larger than N, calculating the distance between every two clusters and merging the two clusters with the minimum distance to obtain W-1 clusters;
if W-1 is larger than N, calculating the distance between every two clusters and merging the two clusters with the minimum distance to obtain W-2 clusters; and so on, until W-R equals N and N clusters are obtained.
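The bottom-up merging above can be sketched as follows (illustrative only; single-link distance between clusters is an assumption, since the patent does not specify the inter-cluster distance):

```python
# Hypothetical sketch: start with one cluster per sequence, then repeatedly
# merge the two closest clusters until N clusters remain.
def agglomerate(sequences, n):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    clusters = [[s] for s in sequences]          # W clusters, one per sequence
    while len(clusters) > n:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single-link: minimum pairwise distance between members
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i].extend(clusters.pop(j))      # merge the two closest clusters
    return clusters

print(len(agglomerate([[0], [1], [10], [11]], n=2)))  # 2
```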
Before classifying the plurality of historical performance data sequences with the trained preset convolutional neural network model, the preset convolutional neural network model is trained:
inputting historical performance data sequences into the preset convolutional neural network model, where each historical performance data sequence carries a type label;
and back-propagating through the preset convolutional neural network model based on its output and the type label of the historical performance data sequence, thereby optimizing the parameters of the preset convolutional neural network model.
104, training a performance prediction model, composed of the pre-trained LSTM network and a fully connected layer located behind the pre-trained LSTM network, with each historical performance data sequence subset to obtain N trained performance prediction models in one-to-one correspondence with the N sequence types.
In a specific embodiment, training the performance prediction model composed of the pre-trained LSTM network and a fully connected layer located behind it with each historical performance data sequence subset includes:
judging whether the number of historical performance data sequences in the subset is greater than a preset threshold;
if the number of historical performance data sequences is larger than the preset threshold, optimizing the parameters of the whole performance prediction model corresponding to the subset with the subset according to a loss function;
and if the number of historical performance data sequences is not larger than the preset threshold, optimizing only the parameters of the fully connected layer of the performance prediction model corresponding to the subset with the subset according to a loss function.
When the number of historical performance data sequences is not larger than the preset threshold, optimizing only the parameters of the fully connected layer of the corresponding performance prediction model reduces overfitting, and prediction accuracy does not suffer because the performance prediction model has already been pre-trained.
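The threshold rule above amounts to choosing which parameter group to optimize. A toy sketch (names and structure are hypothetical, standing in for the real LSTM and fully connected parameters):

```python
# Hedged sketch: with enough data, fine-tune everything; otherwise update
# only the fully connected head and keep the pre-trained LSTM frozen.
def trainable_params(model, num_sequences, threshold):
    if num_sequences > threshold:
        return model["lstm"] + model["fc"]   # optimise the whole model
    return model["fc"]                        # freeze the pre-trained LSTM

# Toy stand-in for a model with two named parameter groups.
model = {"lstm": ["W_i", "W_f", "W_o", "W_c"], "fc": ["W_out", "b_out"]}
print(trainable_params(model, num_sequences=50, threshold=100))  # ['W_out', 'b_out']
```

In a deep-learning framework the same idea would be expressed by passing only the chosen parameter group to the optimizer.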
In another embodiment, training the performance prediction model composed of the pre-trained LSTM network and a fully connected layer located behind it with each historical performance data sequence subset includes:
optimizing the parameters of the performance prediction model corresponding to the historical performance data sequence subset with that subset according to a loss function.
105, predicting the performance data of the computer with the trained performance prediction model corresponding to the sequence type of the computer's performance data sequence to be predicted.
Before predicting the performance data of the computer with the trained performance prediction model corresponding to the sequence type of the computer's performance data sequence to be predicted, the method further comprises:
(1) acquiring the performance data sequence to be predicted of the computer.
For example, the computer CPU performance sequence is {X1, X2, …, X15}, where X1 = (2019, 06, 12, 13, 15, 43): the first 5 dimensions are the acquisition time and the 6th dimension is the CPU usage of the computer at that time; X2 = (2019, 06, 12, 13, 16, 80); …; X15 = (2019, 06, 12, 13, 30, 90). Similarly, the performance data sequence to be predicted may instead describe computer GPU performance, computer memory performance, or computer storage performance.
(2) determining the sequence type of the performance data sequence to be predicted according to the N historical performance data sequence subsets.
In a specific embodiment, determining the sequence type of the performance data sequence to be predicted according to the N historical performance data sequence subsets includes:
calculating the N center sequences of the N historical performance data sequence subsets and taking the sequence type corresponding to the center sequence closest to the performance data sequence to be predicted as the target sequence type; or
training a preset neural network with the N historical performance data sequence subsets, inputting the performance data sequence to be predicted into the trained preset neural network, and determining the sequence type of the performance data sequence to be predicted from the network's output, where the label of each historical performance data sequence in each subset is the sequence type of that sequence.
When the preset neural network is trained, each historical performance data sequence can be used as the input of the preset neural network, with its sequence type as its label; the parameters of the preset neural network are optimized by a back-propagation algorithm according to the network's output and the label of the historical performance data sequence.
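The first option above (nearest center sequence) can be sketched as follows. The element-wise mean as the "center sequence" and Euclidean distance are illustrative assumptions; the patent does not pin down either choice.

```python
# Hypothetical sketch: assign a query sequence to the type whose subset
# center (element-wise mean) is closest.
def sequence_type(subsets, query):
    def center(subset):
        return [sum(col) / len(subset) for col in zip(*subset)]
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    centers = {t: center(s) for t, s in subsets.items()}
    return min(centers, key=lambda t: dist(centers[t], query))

subsets = {"flat": [[1, 1, 1], [2, 2, 2]], "rising": [[1, 5, 9], [2, 6, 10]]}
print(sequence_type(subsets, [1, 4, 8]))  # rising
```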
The performance data of the computer are then predicted with the trained performance prediction model corresponding to the sequence type of the computer's performance data sequence to be predicted. For example, the computer CPU performance sequence is {X1, X2, …, X15}, where X1 = (2019, 06, 12, 13, 15, 43): the first 5 dimensions are the acquisition time and the 6th dimension is the CPU usage at that time; X2 = (2019, 06, 12, 13, 16, 80); …; X15 = (2019, 06, 12, 13, 30, 90). This CPU performance sequence corresponds to performance prediction model A, which can extract the features of the sequence more accurately because model A was trained on that sequence type. Inputting the CPU performance sequence into performance prediction model A and predicting the CPU performance yields X16 = (2019, 06, 12, 13, 31, 95).
The computer performance data determination method of the first embodiment acquires a historical performance data sequence set of a computer, the set comprising a plurality of historical performance data sequences and a label for each; pre-trains the LSTM network with the set to obtain the pre-trained LSTM network; classifies the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets in one-to-one correspondence with the N sequence types; trains a performance prediction model, composed of the pre-trained LSTM network and a fully connected layer located behind it, with each subset to obtain N performance prediction models in one-to-one correspondence with the N sequence types; acquires the performance data sequence to be predicted of the computer; determines its sequence type; and predicts the performance data of the computer with the performance prediction model corresponding to that sequence type. The embodiment thus predicts computer performance from the historical performance data of the computer.
It should be emphasized that the performance data may also be stored in a node of a blockchain in order to further ensure the privacy and security of the performance data.
In another embodiment, the method further comprises:
and if the performance data prediction result of the computer is not in the preset normal state range, returning a prompt for abnormal performance of the computer. When an exception occurs, stopping running a new task in the computer.
For example, the performance data prediction result of the computer is X16 = (2019, 06, 12, 13, 31, 95), where 95 is the predicted CPU usage; the preset normal range is 0-90, so a computer CPU abnormality alert is returned to the user and new tasks are no longer run on the computer.
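The alerting rule above can be sketched as a simple range check (the return shape and alert text are hypothetical illustrations, not part of the patent):

```python
# Hedged sketch: flag a prediction outside the preset normal range [lo, hi]
# and signal that no new tasks should be scheduled.
def check_prediction(predicted_usage, lo=0, hi=90):
    if not (lo <= predicted_usage <= hi):
        return {"alert": "computer CPU abnormal", "accept_new_tasks": False}
    return {"alert": None, "accept_new_tasks": True}

print(check_prediction(95))  # alert raised, new tasks stopped
```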
In another embodiment, the method further comprises:
(1) a plurality of historical CPU usage sequences for a computer are obtained.
For example, the historical CPU usage master sequence of a computer is {X1, X2, …, Xi, …, Xn}, where Xi is the state vector of the CPU at time i. With a sliding window of preset length h and a step size of 1, a plurality of historical CPU usage sequences are cut from each historical CPU usage master sequence, namely {X1, X2, …, Xh}, {X2, X3, …, Xh+1}, {X3, X4, …, Xh+2}, and so on; the labels of the historical CPU usage sequences {X1, X2, …, Xh}, {X2, X3, …, Xh+1}, and {X3, X4, …, Xh+2} are Xh+1, Xh+2, and Xh+3, respectively.
Each historical CPU usage sequence is a time series.
(2) pre-training the LSTM network with the plurality of historical CPU usage sequences to obtain the pre-trained LSTM network.
Specifically, the parameters of the LSTM network may be initialized; each historical CPU usage sequence may be input into the LSTM network; the distance between the output the LSTM network computes from the input and the label of the historical CPU usage sequence may be calculated; and the parameters of the LSTM network may be optimized according to that distance.
(3) classifying the plurality of historical CPU usage sequences to obtain N sequence types and N historical CPU usage sequence subsets in one-to-one correspondence with the N sequence types.
In a specific embodiment, classifying the plurality of historical CPU usage sequences includes:
clustering the plurality of historical CPU usage sequences and modifying the clustering result according to a received modification instruction; or
classifying the plurality of historical CPU usage sequences with a trained preset convolutional neural network model.
(4) training a CPU usage prediction model, composed of the pre-trained LSTM network and a fully connected layer located behind the pre-trained LSTM network, with each historical CPU usage sequence subset to obtain N trained CPU usage prediction models in one-to-one correspondence with the N sequence types.
(5) predicting the CPU performance of the computer with the trained CPU usage prediction model corresponding to the sequence type of the computer's CPU usage sequence to be predicted.
First, the CPU usage sequence to be predicted of the computer is acquired.
For example, the computer CPU performance sequence is {X1, X2, …, X15}, where X1 = (2019, 06, 12, 13, 15, 43): the first 5 dimensions are the acquisition time and the 6th dimension is the CPU usage at that time; X2 = (2019, 06, 12, 13, 16, 80); …; X15 = (2019, 06, 12, 13, 30, 90).
The sequence type of the CPU usage sequence to be predicted is then determined according to the N historical CPU usage sequence subsets.
The computer CPU performance sequence corresponds to CPU usage prediction model A. Inputting the computer CPU performance sequence into CPU usage prediction model A and predicting the CPU performance yields X16 = (2019, 06, 12, 13, 31, 95).
Example two
Fig. 2 is a block diagram of a computer performance data determining apparatus according to a second embodiment of the present invention. The computer performance data determining apparatus 20 is applied to a computer device and is configured to prejudge computer performance data based on historical performance data of the computer; the performance data may include computer CPU data, computer GPU data, computer memory data, or computer storage data.
As shown in fig. 2, the computer performance data determining apparatus 20 may include an obtaining module 201, a first training module 202, a classifying module 203, a second training module 204, and a prejudging module 205.
The obtaining module 201 is configured to obtain a plurality of historical performance data sequences of the computer.
In a specific embodiment, the obtaining the plurality of historical performance data sequences of the computer includes:
acquiring historical performance data main sequences of a plurality of computers, wherein each element in the historical performance data main sequences is a state vector of computer performance;
and intercepting a plurality of historical performance data sequences from each historical performance data main sequence by taking a preset length h as the length of a sliding window and 1 as a step length.
The element that follows each historical performance data sequence in its historical performance data main sequence can be taken as the label of that sequence. For example, suppose the historical performance data main sequence of a computer is {X1, X2, …, Xi, …, Xn}, where Xi is the state vector of the CPU at time point i. Sliding a window of length h with step 1 over the main sequence yields the historical performance data sequences {X1, X2, …, Xh}, {X2, X3, …, Xh+1}, {X3, X4, …, Xh+2}, and so on; their labels are Xh+1, Xh+2, and Xh+3, respectively.
Each historical performance data sequence is a time sequence.
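As a minimal sketch, the sliding-window interception and labeling described above might look as follows; the function name and toy data are illustrative, not from the patent:

```python
import numpy as np

def make_windows(master, h):
    """Slide a window of length h with step 1 over a master sequence.

    Returns (sequences, labels): each sequence is master[i:i+h], and its
    label is the next element master[i+h], as described above.
    """
    sequences, labels = [], []
    for i in range(len(master) - h):
        sequences.append(master[i:i + h])
        labels.append(master[i + h])
    return np.array(sequences), np.array(labels)

master = np.arange(10)              # toy main sequence standing in for X1..Xn
seqs, labs = make_windows(master, h=3)
# seqs[0] is [0 1 2] and its label labs[0] is 3
```

With state vectors instead of scalars, `master` would simply be a 2-D array and each window a small matrix; the windowing logic is unchanged.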
The first training module 202 is configured to pre-train the long-short term memory network using the plurality of historical performance data sequences to obtain a pre-trained long-short term memory network.
Specifically, the parameters in the long-short term memory network may be initialized, each historical performance data sequence may be input into the long-short term memory network, the distance between the output value calculated by the long-short term memory network according to the input value and the tag of the historical performance data sequence may be calculated, and the parameters in the long-short term memory network may be optimized according to the distance.
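The pre-training loop just described can be sketched as follows. For brevity this sketch substitutes a plain linear model for the long-short term memory network; only the loop structure (forward pass, distance between output and label, parameter update) mirrors the description, and all names and data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
h = 3
# Toy windows of a linear ramp; each label is the next value, as above.
X = np.array([[i, i + 1.0, i + 2.0] for i in range(20)])
y = np.array([i + 3.0 for i in range(20)])

w, b = rng.normal(size=h), 0.0      # initialized parameters (stand-in for LSTM weights)
lr = 1e-3
initial_mse = np.mean((X @ w + b - y) ** 2)
for _ in range(2000):
    pred = X @ w + b                # output value computed from the input
    grad = pred - y                 # gradient of 0.5 * squared distance to the label
    w -= lr * X.T @ grad / len(X)   # optimize parameters according to the distance
    b -= lr * grad.mean()
final_mse = np.mean((X @ w + b - y) ** 2)
```

An actual implementation would replace the linear forward pass with an LSTM cell unrolled over the h time steps, but the optimization loop is the same.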
In another embodiment, prior to pre-training the long-short term memory network using the plurality of historical performance data sequences, the method further comprises:
and preprocessing the plurality of historical performance data sequences.
Missing-value preprocessing and abnormal-data preprocessing, among others, can be performed on the plurality of historical performance data sequences. Missing-value preprocessing ensures that no missing values remain in the sequences, reducing their impact on training the long-short term memory network; abnormal-data preprocessing increases the training convergence speed of the network and so reduces training time. When a historical performance data sequence has missing values (caused, for example, by data source errors or software design errors), interpolation or replacement can be used to fill them. When a historical performance data sequence contains abnormal data (that is, values outside a preset range), abnormal data larger than the maximum of the preset range are set to that maximum, and abnormal data smaller than the minimum of the preset range are set to that minimum.
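A minimal sketch of these two preprocessing steps, assuming missing values are encoded as NaN and the preset range is [0, 100] (both assumptions are illustrative):

```python
import numpy as np

def preprocess(seq, lo=0.0, hi=100.0):
    """Fill missing values by linear interpolation, then clamp
    out-of-range values to the preset [lo, hi] range, as described above."""
    seq = np.array(seq, dtype=float)
    mask = np.isnan(seq)
    if mask.any():
        idx = np.arange(len(seq))
        # interpolate missing entries from their non-missing neighbors
        seq[mask] = np.interp(idx[mask], idx[~mask], seq[~mask])
    return np.clip(seq, lo, hi)

cleaned = preprocess([43.0, np.nan, 80.0, 120.0, -5.0], lo=0, hi=100)
# -> [43. 61.5 80. 100. 0.]
```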
The classification module 203 is configured to classify the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets corresponding to the N sequence types one to one.
In a specific embodiment, the classifying the plurality of historical performance data sequences includes:
clustering the plurality of historical performance data sequences, and modifying clustering results according to received modification instructions; or
classifying the plurality of historical performance data sequences with a trained preset convolutional neural network model.
For example, the plurality of historical performance data sequences can be clustered with the DBSCAN clustering algorithm, using a one-dimensional NCC (normalized cross-correlation) algorithm as the distance function; when the clustering result is abnormal, historical performance data sequences whose distance from the cluster center exceeds a preset range are deleted from the clustering result according to a received deletion instruction.
Specifically, the clustering the plurality of historical performance data sequences includes:
selecting N sequences of center points from the plurality of sequences of historical performance data;
calculating the distance between each historical performance data sequence and the N central point sequences to obtain a central point sequence closest to the historical performance data sequence;
and dividing the historical performance data sequence into clusters to which the central point sequence closest to the historical performance data sequence belongs.
Specifically, the clustering the plurality of historical performance data sequences includes:
determining a cluster by taking each historical performance data sequence as a center to obtain W clusters;
if W is larger than N, calculating the distance between every two clusters, and combining the two clusters with the minimum distance to obtain W-1 clusters;
if W-1 is larger than N, calculating the distance between every two clusters and merging the two clusters with the minimum distance to obtain W-2 clusters; and so on, until W-R equals N, at which point N clusters are obtained.
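The merge-until-N procedure above can be sketched as follows; Euclidean distance between cluster means is used as a stand-in linkage, since the description does not fix one:

```python
import numpy as np

def agglomerate(seqs, n_clusters):
    """Start with one cluster per sequence (W clusters), then repeatedly
    merge the two closest clusters until only N remain, as described above."""
    clusters = [[i] for i in range(len(seqs))]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                ca = seqs[clusters[a]].mean(axis=0)   # cluster centers
                cb = seqs[clusters[b]].mean(axis=0)
                d = np.linalg.norm(ca - cb)
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)                # merge the closest pair
    return clusters

seqs = np.array([[0.0], [0.1], [5.0], [5.1]])
result = agglomerate(seqs, 2)   # -> two clusters: {0, 1} and {2, 3}
```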
Before the classification of the plurality of historical performance data sequences by using the trained preset convolutional neural network model, training the preset convolutional neural network model:
inputting a historical performance data sequence into a preset convolutional neural network model, wherein each historical performance data sequence corresponds to a type tag;
and performing back propagation on the preset convolutional neural network model based on the output of the preset convolutional neural network model and the type label of the historical performance data sequence, and optimizing the parameters of the preset convolutional neural network model.
A second training module 204, configured to train, with each historical performance data sequence subset, a performance pre-judging model formed by the pre-trained long-short term memory network and a full connection layer located behind the pre-trained long-short term memory network, so as to obtain N trained performance pre-judging models corresponding to the N sequence types one to one.
In a specific embodiment, the training of a performance prediction model composed of the pre-trained long-short term memory network and a fully-connected layer located behind the pre-trained long-short term memory network with each historical performance data sequence subset includes:
judging whether the number of the historical performance data sequences in the subset is greater than a preset threshold value;
if the number of the historical performance data sequences is larger than a preset threshold value, optimizing parameters of a performance pre-judging model corresponding to the historical performance data sequence subset by using the historical performance data sequence subset according to a loss function;
and if the number of the historical performance data sequences is not larger than a preset threshold value, optimizing parameters of a full connection layer of a performance pre-judging model corresponding to the historical performance data sequence subset by using the historical performance data sequence subset according to a loss function.
When the number of the historical performance data sequences is not larger than the preset threshold value, optimizing only the parameters of the full connection layer of the corresponding performance pre-judging model reduces overfitting; and because the model has been pre-trained, prediction accuracy does not suffer.
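This size-dependent choice of trainable parameters can be sketched as a small selector; the parameter-group names are purely illustrative:

```python
def select_trainable(model_params, n_sequences, threshold):
    """Pick which parameter groups to optimize: the whole pre-judging model
    when the subset is large enough, only the full connection layer
    otherwise (to limit overfitting on small subsets)."""
    if n_sequences > threshold:
        return model_params["lstm"] + model_params["fc"]
    return model_params["fc"]

# Hypothetical parameter groups of one pre-judging model.
params = {"lstm": ["W_i", "W_f", "W_o", "W_c"], "fc": ["W_fc", "b_fc"]}
small = select_trainable(params, n_sequences=50, threshold=100)
large = select_trainable(params, n_sequences=500, threshold=100)
# small -> only the full connection layer; large -> all parameters
```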
In another embodiment, the training of a performance prediction model composed of the pre-trained long-short term memory network and a fully-connected layer located behind the pre-trained long-short term memory network with each historical performance data sequence subset includes:
and optimizing the parameters of the performance pre-judging model corresponding to the historical performance data sequence subset by using the historical performance data sequence subset according to the loss function.
The pre-judging module 205 is configured to pre-judge the performance data of the computer through a trained performance pre-judging model corresponding to a sequence type of a performance data sequence to be pre-judged of the computer.
Before prejudging the performance data of the computer through the trained performance prejudging model corresponding to the sequence type of the performance data sequence to be prejudged of the computer, the method further comprises:
(1) Acquiring the to-be-prejudged performance data sequence of the computer.
For example, the computer CPU performance sequence is {X1, X2, …, X15}, where X1 = (2019, 06, 12, 13, 15, 43): the first 5 dimensions are the acquisition time and the 6th dimension is the CPU usage of the computer at that time; X2 = (2019, 06, 12, 13, 16, 80); …; X15 = (2019, 06, 12, 13, 30, 90). Similarly, the to-be-prejudged performance data sequence of the computer may instead describe computer GPU performance, computer memory performance, or computer storage performance.
(2) Determining the sequence type of the to-be-prejudged performance data sequence according to the N historical performance data sequence subsets.
In a specific embodiment, the determining the sequence type of the to-be-predicted performance data sequence according to the N historical performance data sequence subsets includes:
calculating the N central sequences of the N historical performance data sequence subsets, and determining the sequence type corresponding to the central sequence closest to the to-be-prejudged performance data sequence as its sequence type; or
training a preset neural network with the N historical performance data sequence subsets, where the label of each historical performance data sequence in each subset is its sequence type; inputting the to-be-prejudged performance data sequence into the trained preset neural network; and determining the sequence type of the to-be-prejudged performance data sequence according to the output of the trained preset neural network.
When the preset neural network is trained, the historical performance data sequence can be used as the input of the preset neural network, and the sequence type of the historical performance data sequence is used as the label of the historical performance data sequence; and optimizing the parameters of the preset neural network through a back propagation algorithm according to the output of the preset neural network and the label of the historical performance data sequence.
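A minimal sketch of the first option (nearest central sequence), with illustrative toy subsets and Euclidean distance assumed as the metric:

```python
import numpy as np

def sequence_type(seq, subsets):
    """Compute each subset's central sequence (element-wise mean) and
    return the index of the center closest to `seq`, as described above."""
    centers = [np.mean(sub, axis=0) for sub in subsets]
    dists = [np.linalg.norm(seq - c) for c in centers]
    return int(np.argmin(dists))

subsets = [np.array([[1.0, 1.0], [1.2, 0.8]]),   # sequence type 0
           np.array([[9.0, 9.0], [8.8, 9.2]])]   # sequence type 1
t = sequence_type(np.array([1.1, 0.9]), subsets)  # -> 0
```

The sequence type returned here selects which trained performance prejudging model receives the sequence in the next step.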
The performance data of the computer is then prejudged through the trained performance prejudging model corresponding to the sequence type of its to-be-prejudged performance data sequence. For example, the computer CPU performance sequence is {X1, X2, …, X15}, where X1 = (2019, 06, 12, 13, 15, 43), the first 5 dimensions being the acquisition time and the 6th dimension the CPU utilization of the computer at that time; X2 = (2019, 06, 12, 13, 16, 80); …; X15 = (2019, 06, 12, 13, 30, 90). Suppose this sequence corresponds to performance prejudging model A; because model A was trained on sequences of this type, it can extract the features of the sequence more accurately. Inputting the computer CPU performance sequence into model A and prejudging the computer CPU performance yields X16 = (2019, 06, 12, 13, 31, 95).
The computer performance data determination apparatus 20 of the second embodiment pre-judges the computer performance based on the historical performance data of the computer.
In another embodiment, the computer performance data determining apparatus 20 further includes a returning module, configured to return a computer performance exception prompt if the performance data prediction result of the computer is not within the preset normal state range.
When an abnormality occurs, the computer stops running new tasks.
For example, if the performance data prediction result of the computer is X16 = (2019, 06, 12, 13, 31, 95), where 95 is the predicted CPU usage rate, and the preset normal state range is 0-90, a computer CPU abnormality alert is returned to the user and the running of new tasks in the computer is stopped.
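A sketch of this range check, assuming (as in the example above) that the last element of the prediction vector is the predicted usage and that the normal range is 0-90:

```python
def check_prediction(prediction, lo=0, hi=90):
    """Return an alert message when the predicted usage falls outside the
    preset normal state range, otherwise None."""
    usage = prediction[-1]   # last dimension: predicted CPU usage rate
    if not lo <= usage <= hi:
        return "computer CPU abnormal: stop scheduling new tasks"
    return None

alert = check_prediction((2019, 6, 12, 13, 31, 95))   # triggers the alert
ok = check_prediction((2019, 6, 12, 13, 31, 50))      # within range -> None
```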
Example three
The present embodiment provides a computer-readable storage medium having a computer program stored thereon; when executed by a processor, the computer program implements the steps in the above computer performance data determination method, such as steps 101 to 105 shown in Fig. 1.
Alternatively, when executed by the processor, the computer program implements the functions of the modules in the above apparatus embodiment, such as modules 201 to 205 in Fig. 2.
Example four
Fig. 3 is a schematic diagram of a computer device according to a fourth embodiment of the present invention. The computer device 30 comprises a memory 301, a processor 302, and a computer program 303, such as a computer performance data determining program, stored in the memory 301 and executable on the processor 302. When executing the computer program 303, the processor 302 implements the steps in the above computer performance data determination method, such as steps 101 to 105 shown in Fig. 1.
Alternatively, when executed by the processor, the computer program implements the functions of the modules in the above apparatus embodiment, such as modules 201 to 205 in Fig. 2.
Illustratively, the computer program 303 may be partitioned into one or more modules that are stored in the memory 301 and executed by the processor 302 to perform the present method. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 303 in the computer device 30. For example, the computer program 303 may be divided into the obtaining module 201, the first training module 202, the classifying module 203, the second training module 204, and the pre-judging module 205 in fig. 2, and the specific functions of each module are described in embodiment two.
Those skilled in the art will appreciate that Fig. 3 is merely an example of the computer device 30 and does not constitute a limitation of it; the device may include more or fewer components than shown, combine certain components, or use different components. For example, the computer device 30 may also include input and output devices, network access devices, buses, and the like.
The Processor 302 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor 302 may be any conventional processor or the like, the processor 302 being the control center for the computer device 30 and connecting the various parts of the overall computer device 30 using various interfaces and lines.
The memory 301 may be used to store the computer program 303, and the processor 302 may implement various functions of the computer device 30 by running or executing the computer program or module stored in the memory 301 and calling data stored in the memory 301. The memory 301 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to the use of the computer device 30, and the like. Further, the memory 301 may include a non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), at least one magnetic disk storage device, a Flash memory device, or other non-volatile solid state storage device.
The modules integrated by the computer device 30 may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as separate products. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying said computer program code, recording medium, U-disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM).
Further, the computer usable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
The blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a series of data blocks associated using cryptographic methods, each containing information of a batch of network transactions, used to verify the validity (anti-counterfeiting) of the information and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware form, and can also be realized in a form of hardware and a software functional module.
The integrated module implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present invention.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned. Furthermore, it is to be understood that the word "comprising" does not exclude other modules or steps, and the singular does not exclude the plural. A plurality of modules or means recited in the system claims may also be implemented by one module or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.
Claims (10)
1. A method of computer performance data determination, the method comprising:
acquiring a plurality of historical performance data sequences of a computer;
pre-training the long-short term memory network by using the plurality of historical performance data sequences to obtain a pre-trained long-short term memory network;
classifying the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets which are in one-to-one correspondence with the N sequence types;
training a performance pre-judging model consisting of the pre-trained long-short term memory network and a full connection layer positioned behind the pre-trained long-short term memory network by using each historical performance data sequence subset to obtain N trained performance pre-judging models which correspond to the N sequence types one by one;
and prejudging the performance data of the computer through a trained performance prejudging model corresponding to the sequence type of the performance data sequence to be prejudged of the computer.
2. The method of claim 1, wherein said obtaining a plurality of historical performance data sequences for a computer comprises:
acquiring historical performance data main sequences of a plurality of computers, wherein each element in the historical performance data main sequences is a state vector of computer performance;
and intercepting a plurality of historical performance data sequences from each historical performance data main sequence by taking a preset length h as the length of a sliding window and 1 as a step length.
3. The method of claim 1, wherein said classifying the plurality of sequences of historical performance data comprises:
clustering the plurality of historical performance data sequences, and modifying clustering results according to received modification instructions; or
classifying the plurality of historical performance data sequences with a trained preset convolutional neural network model.
4. The method of claim 1, wherein training a performance prediction model consisting of the pre-trained long-short term memory network and a fully-connected layer behind the pre-trained long-short term memory network with each historical performance data sequence subset comprises:
judging whether the number of the historical performance data sequences is greater than a preset threshold value;
if the number of the historical performance data sequences is larger than a preset threshold value, optimizing parameters of a performance pre-judging model corresponding to the historical performance data sequence subset by using the historical performance data sequence subset according to a loss function;
and if the number of the historical performance data sequences is not larger than a preset threshold value, optimizing parameters of a full connection layer of a performance pre-judging model corresponding to the historical performance data sequence subset by using the historical performance data sequence subset according to a loss function.
5. The method of claim 1, wherein before said prejudging the performance data of the computer by a trained performance prejudging model corresponding to a sequence type of a performance data sequence to be prejudged of the computer, the method further comprises:
and determining the sequence type of the performance data sequence to be pre-judged according to the N historical performance data sequence subsets.
6. The method of claim 5, wherein the determining the sequence type of the performance data sequence to be predicted according to the N subsets of the historical performance data sequences comprises:
calculating N central sequences of N historical performance data sequence subsets, and determining a sequence type corresponding to the central sequence closest to the performance data sequence to be pre-judged as the target sequence type; or
training a preset neural network with the N historical performance data sequence subsets, where the label of each historical performance data sequence in each subset is its sequence type; inputting the to-be-prejudged performance data sequence into the trained preset neural network; and determining the sequence type of the to-be-prejudged performance data sequence according to the output of the trained preset neural network.
7. The method of any one of claims 1-5, further comprising:
and if the performance data prediction result of the computer is not in the preset normal state range, returning a prompt for abnormal performance of the computer.
8. An apparatus for determining computer performance data, the apparatus comprising:
the acquisition module is used for acquiring a plurality of historical performance data sequences of the computer;
the first training module is used for pre-training the long-short term memory network by using the plurality of historical performance data sequences to obtain the pre-trained long-short term memory network;
the classification module is used for classifying the plurality of historical performance data sequences to obtain N sequence types and N historical performance data sequence subsets which are in one-to-one correspondence with the N sequence types;
the second training module is used for training a performance pre-judging model consisting of the pre-trained long-short term memory network and a full connection layer positioned behind the pre-trained long-short term memory network by using each historical performance data sequence subset to obtain N trained performance pre-judging models which correspond to the N sequence types one by one;
and the prejudging module is used for prejudging the performance data of the computer through a trained performance prejudging model corresponding to the sequence type of the performance data sequence to be prejudged of the computer.
9. A computer device, characterized in that the computer device comprises a processor for executing a computer program stored in a memory for implementing the computer performance data determination method as claimed in any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a computer performance data determination method according to any one of claims 1 to 7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010328656.1A CN111679959A (en) | 2020-04-23 | 2020-04-23 | Computer performance data determination method and device, computer equipment and storage medium |
PCT/CN2020/118939 WO2021212753A1 (en) | 2020-04-23 | 2020-09-29 | Computer performance data determining method and apparatus, computer device, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010328656.1A CN111679959A (en) | 2020-04-23 | 2020-04-23 | Computer performance data determination method and device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111679959A true CN111679959A (en) | 2020-09-18 |
Family
ID=72433824
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010328656.1A Pending CN111679959A (en) | 2020-04-23 | 2020-04-23 | Computer performance data determination method and device, computer equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111679959A (en) |
WO (1) | WO2021212753A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021212753A1 (en) * | 2020-04-23 | 2021-10-28 | 平安科技(深圳)有限公司 | Computer performance data determining method and apparatus, computer device, and storage medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111950928B (en) * | 2020-08-24 | 2024-02-06 | 国网冀北电力有限公司 | Loss reduction method and device for power distribution network, storage medium and computing equipment |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017185347A1 (en) * | 2016-04-29 | 2017-11-02 | 北京中科寒武纪科技有限公司 | Apparatus and method for executing recurrent neural network and lstm computations |
CN106570164A (en) * | 2016-11-07 | 2017-04-19 | 中国农业大学 | Integrated foodstuff safety text classification method based on deep learning |
CN109347668B (en) * | 2018-10-17 | 2020-11-06 | 网宿科技股份有限公司 | Training method and device for service quality assessment model |
CN109471941A (en) * | 2018-11-07 | 2019-03-15 | 中国电子科技集团公司第二十八研究所 | Charge classification method for handling class imbalance |
CN109873779B (en) * | 2019-01-30 | 2021-05-11 | 浙江工业大学 | LSTM-based hierarchical wireless signal modulation type identification method |
CN111679959A (en) * | 2020-04-23 | 2020-09-18 | 平安科技(深圳)有限公司 | Computer performance data determination method and device, computer equipment and storage medium |
- 2020
- 2020-04-23 CN CN202010328656.1A patent/CN111679959A/en active Pending
- 2020-09-29 WO PCT/CN2020/118939 patent/WO2021212753A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2021212753A1 (en) | 2021-10-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111679949B (en) | | Abnormality detection method based on equipment index data and related equipment |
CN111695675B (en) | | Federal learning model training method and related equipment |
CN112801718B (en) | | User behavior prediction method, device, equipment and medium |
CN112528025A (en) | | Text clustering method, device and equipment based on density and storage medium |
CN111461168A (en) | | Training sample expansion method and device, electronic equipment and storage medium |
CN110599335A (en) | | User financial risk assessment method and device based on multiple models |
CN111460893B (en) | | Face feature vector dynamic adjustment method and related equipment |
CN115237802A (en) | | Artificial intelligence based simulation test method and related equipment |
CN112800178A (en) | | Answer generation method and device, electronic equipment and readable storage medium |
CN112988840A (en) | | Time series prediction method, device, equipment and storage medium |
CN111679959A (en) | | Computer performance data determination method and device, computer equipment and storage medium |
CN112269875A (en) | | Text classification method and device, electronic equipment and storage medium |
CN111475541A (en) | | Data decision method and device, electronic equipment and storage medium |
US11227231B2 (en) | | Computational efficiency in symbolic sequence analytics using random sequence embeddings |
CN115222443A (en) | | Client group division method, device, equipment and storage medium |
CN113591881A (en) | | Intention recognition method and device based on model fusion, electronic equipment and medium |
CN112036439B (en) | | Dependency relationship classification method and related equipment |
CN112766995B (en) | | Article recommendation method, device, terminal equipment and storage medium |
CN112053058A (en) | | Index model generation method and device |
CN111400440A (en) | | Intention identification method and device |
CN113780675B (en) | | Consumption prediction method and device, storage medium and electronic equipment |
CN116796140A (en) | | Abnormal analysis method, device, equipment and storage medium based on artificial intelligence |
CN116245630A (en) | | Anti-fraud detection method and device, electronic equipment and medium |
CN114897099A (en) | | User classification method and device based on passenger group deviation smooth optimization and electronic equipment |
CN114385878A (en) | | Visual display method and device for government affair data and terminal equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
REG | Reference to a national code | | Ref country code: HK; Ref legal event code: DE; Ref document number: 40031278; Country of ref document: HK |
SE01 | Entry into force of request for substantive examination | ||