CN110851577A - Knowledge graph expansion method and device in electric power field
Knowledge graph expansion method and device in electric power field
- Publication number
- CN110851577A CN110851577A CN201911044753.1A CN201911044753A CN110851577A CN 110851577 A CN110851577 A CN 110851577A CN 201911044753 A CN201911044753 A CN 201911044753A CN 110851577 A CN110851577 A CN 110851577A
- Authority
- CN
- China
- Prior art keywords
- isa
- entities
- entity
- knowledge graph
- data set
- Prior art date
- 2019-10-30
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/332—Query formulation
- G06F16/3329—Natural language query formulation or dialogue systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/36—Creation of semantic tools, e.g. ontology or thesauri
- G06F16/367—Ontology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/06—Energy or water supply
Abstract
The invention discloses a knowledge graph expansion method and device for the electric power field. The method supplements the knowledge graph with additional, scene-specific knowledge, increases the depth of the power marketing knowledge graph, and reduces the cost of maintaining the graph manually.
Description
Technical Field
The invention belongs to the technical field of power system customer service, and particularly relates to a knowledge graph expansion method and device in the power field.
Background
In existing customer service question-answering systems in the power industry, the accuracy of reasoning-based question answering is low, and improving it is key to improving the intelligent question-answering performance of power customer service. Knowledge that is hidden or implicit in the knowledge graph therefore needs to be discovered to enrich the graph, so that it can meet the knowledge requirements of answering reasoning questions. This knowledge mainly comprises implicit knowledge under relations such as hypernym-hyponym, part-whole, and equivalence between domain entities (for example, "one way of paying the electricity fee is Alipay" implicitly expresses that one way of paying the electricity fee is an electronic payment channel), as well as regular association knowledge hidden among relations or entities. Expanding the hidden and implicit knowledge in the power customer service knowledge graph to handle reasoning questions is therefore one of the key points and difficulties of intelligent question-answering research for power customer service. At present, the constructed graph lacks knowledge usable for reasoning, such as hypernym-hyponym, part-whole, and equivalence relations.
Disclosure of Invention
The invention aims to provide a knowledge graph expansion method and device for the power field that discover knowledge hidden or implicit in the knowledge graph to enrich it, so as to meet the knowledge requirements of answering reasoning questions.
In order to achieve this purpose, the technical solution adopted by the invention is as follows:
The embodiment of the invention provides a knowledge graph expansion method in the field of electric power, which comprises the following steps:
acquiring candidate entities;
querying the word vectors corresponding to the candidate entities;
calculating the difference between the word vectors of every two candidate entities to obtain a vector deviation;
classifying the obtained vector deviation with a trained TEXTCNN classifier to determine whether the two entities have an isA relation;
and extracting the entity pairs classified as isA relations and expanding the knowledge graph.
Further, the acquiring of candidate entities includes:
obtaining the candidate entities by performing entity recognition on the text to be extracted with a power-domain named entity recognition module,
or selecting entities from the knowledge graph as candidate entities.
Further, the querying of the word vectors corresponding to the candidate entities includes:
performing vocabulary distribution representation learning on the entities in the knowledge graph to obtain a word vector for each entity;
and querying the word vectors corresponding to the candidate entities from the vocabulary distribution representation learning result.
Further, the vocabulary distribution representation learning employs a Skip-gram model.
Further, the TEXTCNN classifier training process is as follows:
extracting the isA and notisA relations in the knowledge graph as a data set, where each record in the data set consists of two entities and whether the two entities have an isA relation;
dividing the data set into a training data set D and a test data set T, where the training data set D is used to train the TEXTCNN classifier and the test data set T is used to evaluate it;
and taking the vector deviation of the two entities in each record of the training data set D as the input feature of the TEXTCNN classifier, whose output is whether the two entities have an isA relation, where Y indicates that the isA relation holds and N indicates that it does not.
Furthermore, after the isA relation and notisA relation in the knowledge graph are extracted, the data set is labeled and corrected.
Further, the extracting of the entity pairs classified as isA relations and expanding the knowledge graph includes:
deducing more isA relations in the knowledge graph according to the ontology axiom "isA(X, Y), isA(Y, Z) → isA(X, Z)", where isA(X, Y) indicates that entity X and entity Y form an isA relation.
The embodiment of the invention also provides a knowledge graph expansion device in the electric power field, which comprises:
an acquisition module, used for acquiring candidate entities;
a query module, used for querying the word vectors corresponding to the candidate entities;
a calculation module, used for calculating the difference between the word vectors of every two candidate entities to obtain a vector deviation;
a classification module, used for classifying the obtained vector deviation with a TEXTCNN classifier to determine whether the two entities have an isA relation;
and an expansion module, used for extracting the entity pairs classified as isA relations and expanding the knowledge graph.
Further, the acquisition module is specifically used for
obtaining the candidate entities by performing entity recognition on the text to be extracted with a power-domain named entity recognition module,
or selecting entities from the knowledge graph as candidate entities.
Further, the classification module is specifically used for
extracting the isA and notisA relations in the knowledge graph as a data set, where each record in the data set consists of two entities and whether the two entities have an isA relation;
dividing the data set into a training data set D and a test data set T, where the training data set D is used to train the TEXTCNN classifier and the test data set T is used to evaluate it;
and taking the vector deviation of the two entities in each record of the training data set D as the input feature of the TEXTCNN classifier, whose output is whether the two entities have an isA relation, where Y indicates that the isA relation holds and N indicates that it does not.
Further, the expansion module is specifically configured to deduce more isA relations in the knowledge graph according to the ontology axiom "isA(X, Y), isA(Y, Z) → isA(X, Z)", where isA(X, Y) indicates that entity X and entity Y form an isA relation.
According to the method, knowledge scenes in the power field are predefined; after manual review and labeling, the Skip-gram model is used for representation learning, the classification of intention scenes is then trained, and the expansion of the power-field knowledge graph is finally achieved.
Drawings
FIG. 1 is a general flowchart of the knowledge-graph augmentation method of the present invention.
Detailed Description
The invention is further described below. The following examples are only for illustrating the technical solutions of the present invention more clearly, and the protection scope of the present invention is not limited thereby.
The invention provides a knowledge graph expansion method in the field of electric power, which specifically comprises the following steps:
step 1: and constructing a training data set D and a prediction data set T by utilizing the existing power knowledge graph.
And (3) predefining some intention scenes (such as membership, existence, possessing and other specific relation classes) for the power knowledge scene to prepare data, labeling the data, and segmenting the training data set and the prediction data set.
The isA relation is one of the most core relations in the knowledge graph; it states that a certain concept B is a kind of concept A, for example: a value-added tax invoice is an (isA) invoicing service. The notisA relation is the opposite of the isA relation; it states that a certain concept B is not a kind of concept A, for example: an electronic invoice is not a (notisA) meter.
In this step, the isA and notisA relations in the knowledge graph are extracted as a data set, and then the data set is manually labeled and corrected. Each record in the annotated data set consists of two entities and whether the two entities have an isA relation, for example: (general value-added tax invoice, invoicing service, Y), where Y indicates that the isA relation holds.
The data set is then divided into two parts: a training data set D and a test data set T. The training data set D is used to train the isA relation classifier, while the test data set T is used to evaluate its performance. A minimal sketch of this step is given below.
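The following sketch shows one way this data set could be assembled and split; the triple format, the 80/20 split ratio, and the function name are assumptions made for illustration only.

```python
# A hedged sketch of step 1: build labeled (entity, entity, label) records from
# the isA / notisA relations already in the graph and split them into D and T.
# `graph_triples` is an assumed list of (head, relation, tail) triples.
import random

def build_dataset(graph_triples, test_ratio=0.2, seed=0):
    records = []
    for head, relation, tail in graph_triples:
        if relation == "isA":
            records.append((head, tail, "Y"))
        elif relation == "notisA":
            records.append((head, tail, "N"))
    random.Random(seed).shuffle(records)
    split = int(len(records) * (1 - test_ratio))
    return records[:split], records[split:]   # training set D, test set T

triples = [("value-added tax invoice", "isA", "invoicing service"),
           ("electronic invoice", "notisA", "meter")]
D, T = build_dataset(triples)
```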
Step 2: train the isA relation classifier.
In this step, the input of the neural network model to be trained is a number of entity pairs, and the output is whether each pair of entities forms an isA relation. The model consists of two parts: vocabulary distribution representation learning and a TEXTCNN classifier.
The TEXTCNN classifier is trained using the training data set D from step 1 and evaluated using the test data set T.
Vocabulary distribution representation learning: the entities in the knowledge graph are mapped into vectors. Vocabulary distribution representation learning uses a Skip-gram model; its input is a corpus from the electric power field, and its output is numeric word vectors. The content of the power-domain corpus is text, and the output is a word vector for every word in the text; for example, if there are three words in the text, three word vectors are output. A word vector is a set of decimal values, for example, the word vector of "electricity fee" might be (0.1, 0.2, 0.33, 0.5, 0.6); representing a word by such a set of numbers is what is referred to as a word vector.
For the power-field text in the invention, the context window and the parameter α are set through experiments to change the distribution of the word vectors, adjusting them to a reasonable position and avoiding overfitting. A sketch of this representation-learning step is given below.
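A minimal sketch of this step, assuming gensim (version 4.x or later) as the Skip-gram implementation; the toy corpus and the window and α values shown are placeholders, not the values actually used in the experiments.

```python
# A minimal Skip-gram sketch with gensim; sg=1 selects the Skip-gram model.
# The corpus, window and alpha values here are illustrative placeholders.
from gensim.models import Word2Vec

corpus = [
    ["paying", "the", "electricity", "fee", "through", "Alipay"],
    ["the", "single-phase", "electric", "energy", "meter", "measures", "energy"],
]
model = Word2Vec(
    sentences=corpus,
    sg=1,               # Skip-gram
    vector_size=200,    # matches the 200-dimensional word embedding of Table 1
    window=5,           # context window, tuned experimentally
    alpha=0.025,        # learning-rate parameter α, likewise tuned
    min_count=1,
    epochs=10,
)
wv = model.wv                       # keyed word vectors
print(wv["electricity"].shape)      # (200,)
```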
TEXTCNN classifier: TEXTCNN is used as the classifier over the deviation vectors; the vector deviation is the input feature, and the output is whether the two entities form an isA relation. The hyperparameters are listed in Table 1, and a sketch of such a classifier follows the table.
TABLE 1 TEXTCNN parameter list

| Parameter name | Parameter value |
| --- | --- |
| Batch size | 64 |
| Word embedding size | 200 |
| Kernel size | 128 |
| Filter window | 2, 3, 4, 5 |
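A minimal PyTorch sketch of such a classifier under the Table 1 settings. The text does not spell out how the deviation vector is fed to the convolutions, so treating it as a length-200, single-channel sequence is an assumption, as are the class name and the dropout rate.

```python
# A TEXTCNN-style classifier over the vector deviation, using the Table 1
# hyperparameters (128 kernels, filter windows 2/3/4/5, 200-dimensional
# embeddings).  Treating the deviation as a length-200 single-channel
# sequence is an assumption; the dropout rate is likewise illustrative.
import torch
import torch.nn as nn

class DeviationTextCNN(nn.Module):
    def __init__(self, embed_size=200, num_filters=128, windows=(2, 3, 4, 5)):
        super().__init__()
        # One Conv1d per filter window, each producing num_filters feature maps.
        self.convs = nn.ModuleList(
            [nn.Conv1d(in_channels=1, out_channels=num_filters, kernel_size=w)
             for w in windows]
        )
        self.dropout = nn.Dropout(0.5)
        self.fc = nn.Linear(num_filters * len(windows), 2)   # classes: N and Y

    def forward(self, deviation):               # deviation: (batch, embed_size)
        x = deviation.unsqueeze(1)               # (batch, 1, embed_size)
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = torch.cat(pooled, dim=1)      # (batch, num_filters * len(windows))
        return self.fc(self.dropout(features))

model = DeviationTextCNN()
batch = torch.randn(64, 200)                     # a batch of 64 deviation vectors
logits = model(batch)                            # shape (64, 2)
```

In practice the model would be trained with a cross-entropy loss on the labeled vector deviations of data set D (in batches of 64, per Table 1) and evaluated on data set T.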
Step 3: extract isA relations.
Candidate entities are obtained by performing entity recognition on the text to be extracted with a power-field named entity recognition module, or by selecting entities from the knowledge graph; the isA relation classifier trained in step 2 is then used to judge whether the candidate entities form an isA relation. The specific implementation steps are as follows (a sketch is given after the list):
querying the word vectors corresponding to the entities from the vocabulary distribution representation learning result;
calculating the word vector difference between every two entities to obtain a vector deviation;
using the TEXTCNN classifier to perform binary classification into isA and notisA according to the vector deviation;
and taking the entity pairs whose vector deviation is classified as isA as the extraction targets.
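A hedged sketch of this extraction loop, reusing the word vectors (`wv`) and the classifier (`model`) from the earlier sketches; the function name and the convention that class index 1 means isA are assumptions.

```python
# A sketch of the extraction loop: compute the deviation for every ordered
# candidate pair, classify it, and keep the pairs labeled isA.
import itertools
import torch

def extract_isa_pairs(candidates, wv, model):
    """Return the ordered candidate entity pairs classified as isA."""
    extracted = []
    model.eval()
    with torch.no_grad():
        for e1, e2 in itertools.permutations(candidates, 2):
            if e1 not in wv or e2 not in wv:
                continue                            # no word vector available
            deviation = torch.tensor(wv[e1] - wv[e2]).unsqueeze(0)
            if model(deviation).argmax(dim=1).item() == 1:   # assumed isA (Y) index
                extracted.append((e1, e2))
    return extracted
```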
Step 4: expand the knowledge graph according to the extracted isA relations.
That is, the ontology axiom "isA(X, Y), isA(Y, Z) → isA(X, Z)" is applied to deduce more isA relations in the knowledge graph. For example, if the isA relation extracted in step 3 is that a single-phase electric energy meter is a kind of alternating-current electric energy meter, and it is already known that an alternating-current electric energy meter is a kind of electric energy meter, then it can be inferred that a single-phase electric energy meter is a kind of electric energy meter; the inferred relation can be stored in the knowledge graph, thereby enriching its content. A sketch of this inference is given below.
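A small sketch of this transitive inference under the stated axiom; representing the isA relations as a plain set of (X, Y) pairs is an assumption made for illustration.

```python
# Transitive inference under the axiom isA(X, Y), isA(Y, Z) -> isA(X, Z):
# repeatedly derive new pairs until a fixed point is reached.
def expand_isa(known_isa):
    """Return the transitive closure of a set of (X, Y) isA pairs."""
    closure = set(known_isa)
    while True:
        derived = {(x, z)
                   for (x, y) in closure
                   for (y2, z) in closure
                   if y == y2 and (x, z) not in closure}
        if not derived:
            return closure
        closure |= derived

known = {("single-phase electric energy meter", "alternating-current electric energy meter"),
         ("alternating-current electric energy meter", "electric energy meter")}
print(sorted(expand_isa(known)))   # now also contains the inferred pair
```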
The embodiment of the invention also provides a knowledge graph expansion device in the electric power field, which comprises:
an acquisition module, used for acquiring candidate entities;
a query module, used for querying the word vectors corresponding to the candidate entities;
a calculation module, used for calculating the difference between the word vectors of every two candidate entities to obtain a vector deviation;
a classification module, used for classifying the obtained vector deviation with a TEXTCNN classifier to determine whether the two entities have an isA relation;
and an expansion module, used for extracting the entity pairs classified as isA relations and expanding the knowledge graph.
Further, the acquisition module is specifically used for
obtaining the candidate entities by performing entity recognition on the text to be extracted with a power-domain named entity recognition module,
or selecting entities from the knowledge graph as candidate entities.
Further, the classification module is specifically used for
extracting the isA and notisA relations in the knowledge graph as a data set, where each record in the data set consists of two entities and whether the two entities have an isA relation;
dividing the data set into a training data set D and a test data set T, where the training data set D is used to train the TEXTCNN classifier and the test data set T is used to evaluate it;
and taking the vector deviation of the two entities in each record of the training data set D as the input feature of the TEXTCNN classifier, whose output is whether the two entities have an isA relation, where Y indicates that the isA relation holds and N indicates that it does not.
Further, the expansion module is specifically configured to deduce more isA relations in the knowledge graph according to the ontology axiom "isA(X, Y), isA(Y, Z) → isA(X, Z)", where isA(X, Y) indicates that entity X and entity Y form an isA relation. An illustrative sketch of how these modules could be composed is given below.
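A hedged sketch of how the five modules might be composed into a single device object; the class name, method names, and wiring are illustrative assumptions, with the classification and expansion steps passed in as callables (for example the extract_isa_pairs and expand_isa functions sketched earlier).

```python
# An illustrative composition of the acquisition, query, calculation,
# classification and expansion modules; names and wiring are assumptions.
class KnowledgeGraphExpander:
    def __init__(self, word_vectors, classify_pairs, infer_closure, known_isa):
        self.word_vectors = word_vectors      # query module: entity -> word vector
        self.classify_pairs = classify_pairs  # calculation + classification modules
        self.infer_closure = infer_closure    # expansion module (axiom application)
        self.known_isa = set(known_isa)       # isA relations already in the graph

    def acquire_candidates(self, recognized_entities, graph_entities):
        # Acquisition module: entities from the NER module and/or from the graph,
        # de-duplicated while preserving order.
        return list(dict.fromkeys(list(recognized_entities) + list(graph_entities)))

    def expand(self, recognized_entities, graph_entities):
        candidates = self.acquire_candidates(recognized_entities, graph_entities)
        extracted = self.classify_pairs(candidates, self.word_vectors)
        return self.infer_closure(self.known_isa | set(extracted))
```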
It is to be noted that the apparatus embodiment corresponds to the method embodiment, and the implementation manners of the method embodiment are all applicable to the apparatus embodiment and can achieve the same or similar technical effects, so that the details are not described herein.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting the same, and although the present invention is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.
Claims (11)
1. A knowledge graph expansion method in the power field is characterized by comprising the following steps:
acquiring candidate entities;
querying the word vectors corresponding to the candidate entities;
calculating the difference between the word vectors of every two candidate entities to obtain a vector deviation;
classifying the obtained vector deviation with a trained TEXTCNN classifier to determine whether the two entities have an isA relation;
and extracting the entity pairs classified as isA relations and expanding the knowledge graph.
2. The method for knowledge-graph augmentation in the power domain according to claim 1, wherein the obtaining of the candidate entities comprises:
the candidate entities are obtained by carrying out entity recognition on the text to be extracted through a power domain named entity recognition module,
or selecting an entity from the knowledge graph as a candidate entity.
3. The method for expanding knowledge graph of electric power field according to claim 1, wherein the querying the word vector corresponding to the candidate entity comprises:
performing vocabulary distribution representation learning on the entities in the knowledge graph to obtain a word vector corresponding to each entity;
and inquiring word vectors corresponding to the candidate entities from the vocabulary distribution representation learning result.
4. The method of claim 3, wherein the vocabulary distribution representation learning adopts a Skip-gram model.
5. The method of claim 1, wherein the TEXTCNN classifier training process comprises:
extracting the isA relation and notisA relation in the knowledge graph as a data set; each record in the data set consists of two entities and whether the two entities have an isA relationship;
dividing the data set into a training data set D and a testing data set T; the training dataset D is used to train a TEXTCNN classifier, and the test dataset T is used to evaluate the TEXTCNN classifier;
and taking the vector deviation of the two entities in each record of the training data set D as the input feature of the TEXTCNN classifier, whose output is whether the two entities have an isA relation, wherein Y indicates that the isA relation holds and N indicates that it does not.
6. The method for expanding the knowledge graph of the electric power field according to claim 5, wherein after the isA relation and the notisA relation in the knowledge graph are extracted, the data set is labeled and corrected.
7. The knowledge graph expansion method for the power field according to claim 1, wherein the extracting of the entity pairs classified as isA relations and expanding the knowledge graph comprises:
deducing more isA relations in the knowledge graph according to the ontology axiom "isA(X, Y), isA(Y, Z) → isA(X, Z)", wherein isA(X, Y) indicates that entity X and entity Y form an isA relation.
8. A knowledge graph expansion apparatus in the power domain, comprising:
an acquisition module, used for acquiring candidate entities;
a query module, used for querying the word vectors corresponding to the candidate entities;
a calculation module, used for calculating the difference between the word vectors of every two candidate entities to obtain a vector deviation;
a classification module, used for classifying the obtained vector deviation with a TEXTCNN classifier to determine whether the two entities have an isA relation;
and an expansion module, used for extracting the entity pairs classified as isA relations and expanding the knowledge graph.
9. The knowledge graph expansion apparatus in the electric power field of claim 8, wherein the acquisition module is specifically configured to
obtain the candidate entities by performing entity recognition on the text to be extracted with a power-domain named entity recognition module,
or select entities from the knowledge graph as candidate entities.
10. The knowledge graph expansion apparatus of claim 8, wherein the classification module is specifically configured to,
extracting the isA relation and notisA relation in the knowledge graph as a data set; each record in the data set consists of two entities and whether the two entities have an isA relationship;
dividing the data set into a training data set D and a testing data set T; the training dataset D is used to train a TEXTCNN classifier, and the test dataset T is used to evaluate the TEXTCNN classifier;
and taking the vector deviation of the two entities in each record of the training data set D as the input feature of the TEXTCNN classifier, whose output is whether the two entities have an isA relation, wherein Y indicates that the isA relation holds and N indicates that it does not.
11. The knowledge-graph expansion apparatus of electric power domain according to claim 8, wherein the expansion module is specifically configured to derive more isA relationships in the knowledge-graph according to ontology axiom "isA (X, Y), isA (Y, Z) → isA (X, Z)", where isA (X, Y) represents that entity X and entity Y constitute isA relationships.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN201911044753.1A (published as CN110851577A) | 2019-10-30 | 2019-10-30 | Knowledge graph expansion method and device in electric power field |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN110851577A (en) | 2020-02-28 |
Family

- ID=69599083

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN201911044753.1A (Pending) | Knowledge graph expansion method and device in electric power field | 2019-10-30 | 2019-10-30 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN (1) | CN110851577A (en) |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20200228 |