CN118133010B - Graph model-based manufacturing cloud service recommendation model training method and recommendation method - Google Patents
- Publication number
- CN118133010B (application CN202410200406.8A)
- Authority
- CN
- China
- Prior art keywords
- recommendation
- model
- positive
- inverse
- training
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9535—Search customisation based on user profiles and personalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
- G06N3/0455—Auto-encoder networks; Encoder-decoder networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The application provides a graph model-based manufacturing cloud service recommendation model training method and a recommendation method, and relates to the technical field of data processing. The training method comprises the following steps: acquiring sample data, wherein the sample data comprises positive recommendation sample data and inverse recommendation sample data; acquiring an inverse relation conjugate to the positive relation in the manufacturing cloud service recommendation system, and generating a mapping relation between the positive relation and the inverse relation; performing data mapping processing on the inverse recommendation sample data according to the mapping relation, and generating inverse recommendation model training data corresponding to the inverse recommendation sample data according to the mapped sample data and the recommendation model to be trained; generating positive recommendation model training data corresponding to the positive recommendation sample data according to the recommendation model and the positive recommendation sample data; and training the recommendation model based on the positive recommendation model training data and the inverse recommendation model training data to obtain the trained manufacturing cloud service recommendation system. The method improves the training efficiency of a recommendation model that handles both the positive and the inverse relation.
Description
Technical Field
The application relates to the technical field of data processing, in particular to a graph model-based manufacturing cloud service recommendation model training method and a recommendation method.
Background
Cloud manufacturing is a new networked manufacturing mode that organizes networked manufacturing resources through a network and a cloud manufacturing service platform and provides various manufacturing services according to user demands. A manufacturing cloud service (cloud service for short) is the basic element of a manufacturing cloud platform; it is a manufacturing resource or manufacturing capability that has been virtualized and servitized on the platform. Manufacturing cloud services encapsulate scattered manufacturing resources and connect them to the cloud platform through technologies such as virtualization and the Internet of Things, forming a manufacturing cloud. Through the manufacturing cloud, a user can obtain the required personalized services on demand and respond quickly and at low cost to changing market demands, which makes enterprises more agile and intelligent, thereby promoting specialized division of labor and flexible business cooperation and promoting the sharing of manufacturing resources over a wide area.
A recommendation system for manufacturing cloud services is generally a graph model: data such as users' historical behaviors, various demands, and the specific information of solutions are used as nodes (sometimes also called entities) in the graph, and the interrelationships among the nodes are used as edges, so that a directed graph is constructed for convenient storage. In general, a recommendation system starts from a demand and provides services or solutions to clients, so the recommendation process from a demand to a specific service or solution may be called positive recommendation. The positive relationship is the relationship actually recorded in the graph model, and its direction is the direction of the relationship itself, i.e., the potential tail entity is predicted from the head entity. However, when users need to be recommended for a certain service, that is, when the relationship between the service and each demand needs to be studied, the direction of the relationship is exactly opposite to the positive relationship defined above; such a relationship may therefore be called an inverse relationship, and inverse-relationship recommendation from a specific service or solution to a demand is performed on this basis. In a recommendation system for manufacturing cloud services, the inverse relationship and the positive relationship exist as conjugates: they essentially represent the same edge in the graph, but inverse-relationship recommendation works in the reverse direction, predicting the potential head entity from the tail entity.
Currently, various manufacturing cloud service recommendation systems are generally designed around positive relationships and perform recommendation for positive relationships. For inverse-relationship recommendation, three processing methods are generally adopted in the prior art. The first method considers the conjugate of the inverse relationship, that is, the corresponding positive relationship: it traverses all other nodes and calculates whether each of them can establish a positive-relationship link with the target node; if so, the inverse relationship is established at the same time. However, this method needs one traversal for every node in the graph, which greatly reduces efficiency. The second method imitates positive-relationship reasoning and additionally builds a new model dedicated to inverse-relationship reasoning, but this requires training the new model separately and occupying extra space to store the new relationships, and it ignores the connection between the positive and inverse relationships. The third method regards the inverse relationship as a brand-new positive relationship and processes it with the same model used for positive relationships; this saves time and labor, but it also ignores the connection between the positive and inverse relationships.
Disclosure of Invention
The application provides a graph model-based manufacturing cloud service recommendation model training method and recommendation method, which are used for solving the technical problems in the prior art that additional storage space is needed to store the positive and inverse relationships separately and that training efficiency is low because a separate recommendation model has to be trained for each of the two relationships.
In a first aspect, the present application provides a graph model-based manufacturing cloud service recommendation model training method, including:
acquiring sample data; the sample data comprise positive recommendation sample data and inverse recommendation sample data for training the manufacturing cloud service recommendation system, so that the trained manufacturing cloud service recommendation system simultaneously supports positive relationship recommendation and inverse relationship recommendation;
acquiring an inverse relation conjugated with a positive relation in a manufacturing cloud service recommendation system, and generating a mapping relation between the positive relation and the inverse relation;
performing data mapping processing on the inverse recommendation sample data according to the mapping relation, and generating inverse recommendation model training data corresponding to the inverse recommendation sample data according to the mapped sample data and a recommendation model to be trained;
Generating positive recommendation model training data corresponding to the positive recommendation sample data according to the recommendation model and the positive recommendation sample data;
and training the recommendation model based on the positive recommendation model training data and the inverse recommendation model training data to obtain a trained manufacturing cloud service recommendation system.
Optionally, the acquiring sample data includes:
acquiring a pre-constructed graph model, and generating multiple groups of triplet data based on two adjacent nodes in the graph model and the edge between the two adjacent nodes;
and for any group of triplet data, acquiring the head entity and the positive relation in the triplet data as the positive recommendation sample data, and acquiring the tail entity and the positive relation in the triplet data as the inverse recommendation sample data.
Optionally, the mapping relationship includes:
r⁻¹ = d(r);
wherein r⁻¹ represents the inverse relationship; r represents the positive relationship; d(r) represents a mapping function between the positive relationship and the inverse relationship.
Optionally, the mapping mode includes linear model mapping; correspondingly, the mapping function includes:
d(r)=Wr+b;
Wherein d (r) represents an inverse relationship; r represents a positive relationship; w represents the mapping weight; b represents the bias of the linear layer;
The mapping mode comprises multi-layer perceptron mapping; correspondingly, the mapping function includes:
d(r)=MLP(r);
Wherein d (r) represents an inverse relationship; r represents a positive relationship; MLP () represents a model structure;
The mapping mode comprises convolution model mapping; correspondingly, the mapping function includes:
d(r)=Conv(r);
wherein d (r) represents an inverse relationship; r represents a positive relationship; conv () represents the model structure;
the mapping mode comprises attention model mapping; correspondingly, the mapping function includes:
d(r)=α(r)⊙r;
wherein d(r) represents the inverse relationship; r represents the positive relationship; α(r) represents the attention weight; ⊙ represents the Hadamard product.
Optionally, the inverse recommendation model training data includes:
h=f(t,d(r));
wherein f() represents the model structure of the recommendation model; h represents the head entity, namely the training label in the inverse recommendation training process; t represents the tail entity, namely the input entity in the inverse recommendation training process; d(r) represents the inverse relationship, namely the input relationship value in the inverse recommendation training process;
the positive recommendation model training data includes:
t=f(h,r);
wherein f() represents the model structure of the recommendation model; t represents the tail entity, namely the training label in the positive recommendation training process; h represents the head entity, namely the input entity in the positive recommendation training process; r represents the positive relationship, i.e., the input relationship value in the positive recommendation training process.
In a second aspect, the present application provides a manufacturing cloud service recommendation method based on a graph model, including:
Acquiring information to be processed; the information to be processed comprises user requirements or services to be recommended;
Inputting the user demand or the service to be recommended into a pre-trained manufacturing cloud service recommendation system, so that the recommendation system outputs a target recommendation service corresponding to the user demand when the information to be processed is the user demand; or outputting a target user corresponding to the service to be recommended when the information to be processed is the service to be recommended; the manufacturing cloud service recommendation system is obtained by training the manufacturing cloud service recommendation model training method based on the graph model according to any one of the first aspect.
In a third aspect, the present application provides a manufacturing cloud service recommendation model training apparatus based on a graph model, the apparatus comprising:
the sample data acquisition module is used for acquiring sample data; the sample data comprise positive recommendation sample data and inverse recommendation sample data for training the manufacturing cloud service recommendation system, so that the trained manufacturing cloud service recommendation system simultaneously supports positive relationship recommendation and inverse relationship recommendation;
the mapping relation generation module is used for acquiring the inverse relation conjugated with the positive relation in the manufacturing cloud service recommendation system and generating a mapping relation between the positive relation and the inverse relation;
The first training data generation module is used for carrying out data mapping processing on the inverse recommendation sample data according to the mapping relation, and generating inverse recommendation model training data corresponding to the inverse recommendation sample data according to the sample data subjected to the mapping processing and a recommendation model to be trained;
The second training data generation module is used for generating positive recommendation model training data corresponding to the positive recommendation sample data according to the recommendation model and the positive recommendation sample data;
And the training module is used for training the recommendation model based on the positive recommendation model training data and the inverse recommendation model training data to obtain a manufacturing cloud service recommendation system after training.
In a fourth aspect, the present application provides a manufacturing cloud service recommendation device based on a graph model, the device comprising:
the information acquisition module is used for acquiring information to be processed; the information to be processed comprises user requirements or services to be recommended;
The recommendation processing module is used for inputting the user demands or the to-be-recommended services into a pre-trained manufacturing cloud service recommendation system, so that the recommendation system outputs target recommendation services corresponding to the user demands when the to-be-processed information is the user demands; or outputting a target user corresponding to the service to be recommended when the information to be processed is the service to be recommended; the manufacturing cloud service recommendation system is obtained by training the manufacturing cloud service recommendation model training method based on the graph model according to any one of the first aspect.
In a fifth aspect, the present application provides an electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
The processor executes computer-executable instructions stored by the memory to implement the methods as described in the first and second aspects.
In a sixth aspect, the present application provides a computer readable storage medium having stored therein computer executable instructions which when executed by a processor are adapted to carry out the methods of the first and second aspects.
In a seventh aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the methods of the first and second aspects.
According to the technical scheme of the application, when the recommendation model is trained, positive recommendation sample data and inverse recommendation sample data are obtained respectively, a mapping relation between the positive relation in the manufacturing cloud service recommendation system and the inverse relation conjugate to it is constructed, and the inverse recommendation sample data are then processed according to the mapping relation, so that the processed data and the positive recommendation sample data can be used together to train the recommendation model. In this way a single recommendation model is trained with the sample data, excessive extra parameters are not required, the storage space occupied by the model is reduced, and the training efficiency of the recommendation model is improved. Furthermore, model training data corresponding to the positive relation and the inverse relation are formed respectively based on the pre-constructed model framework and the sample data, so that the trained manufacturing cloud service recommendation system supports positive relationship recommendation and inverse relationship recommendation at the same time; inverse-relationship recommendation is thus handled on the basis of the good processing effect that a single recommendation model framework already achieves for positive-relationship recommendation, which improves the accuracy and universality of the model recommendation.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is an application scenario diagram provided by the present application;
fig. 2 is a schematic flow chart of a method for training a manufacturing cloud service recommendation model based on a graph model according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of generating training data for a recommendation model according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a graph model-based manufacturing cloud service recommendation method according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a manufacturing cloud service recommendation model training device based on a graph model according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a manufacturing cloud service recommendation device based on a graph model according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 8 is a block diagram of an electronic device according to an embodiment of the present application.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
A recommendation system for manufacturing cloud services is generally a graph model: data such as users' historical behaviors, various demands, and the specific information of solutions are used as nodes (sometimes also called entities) in the graph, and the interrelationships among the nodes are used as edges, so that a directed graph is constructed for convenient storage. As for the edges in the graph, namely the interrelationships between nodes, which may also be called links between nodes, some edges can be established directly from the inherent relationships between the contents corresponding to the nodes, such as the superior-subordinate relationship between services and solutions; most of the other edges are labeled according to existing historical success cases, for example the service content corresponding to a specific demand at a certain node.
In practical applications, according to the link conditions between nodes, a possible service or solution to be provided can be determined starting from the user's demand, and a final solution meeting the user's demand can be obtained through subsequent manual screening and design, so that a suitable service is provided. However, these historical cases cannot cover the relationships between all nodes, and the demands of clients and the services provided by providers may change over time, producing new demands or new services; the recommendation system therefore needs to establish new links through an algorithm and provide clients with additional recommendations beyond the historical cases, that is, provide all candidate service solutions for each demand node, which is also called link prediction. To accomplish the link prediction task, all nodes and edges in the graph are typically converted into vector form, and the relationships between nodes are then calculated according to a certain model. The relationship between entities (or nodes) is directional: the entity at the starting position of the relationship is generally called the 'head entity' and the entity at the ending position is called the 'tail entity', and the link prediction task is in fact to find, for a certain head entity and relationship, all the corresponding potential tail entities.
In the process of link prediction task, a recommendation system can start from a requirement and provide services or solutions for clients, so that the recommendation process from the requirement to a specific service or solution can be called positive relationship recommendation. The positive relationship is the relationship actually recorded in the graph model, and the direction of the positive relationship is also the direction of the relationship itself, i.e. the potential tail entity is predicted by the head entity. However, when a user needs to be recommended for a certain service, that is, a relationship between the service and each requirement needs to be studied, the direction of the relationship is exactly opposite to the positive relationship defined above, so that the relationship can be called as an inverse relationship, and further, the inverse relationship recommendation of a specific service or scheme to the requirement is performed based on the inverse relationship. In a recommendation system for manufacturing cloud services, the inverse and positive relationships exist in conjugation, which essentially represent the same edge in the graph, but the inverse relationship recommendation is reversed to predict potential head entities from tail entities.
Currently, various manufacturing cloud service recommendation systems are generally designed around positive relationships and perform recommendation for positive relationships. For inverse-relationship recommendation, three processing methods are generally adopted in the prior art. The first method considers the conjugate of the inverse relationship, that is, its corresponding positive relationship: it traverses all other nodes and calculates whether each of them can establish a positive-relationship link with the target node; if so, this is equivalent to establishing the inverse relationship at the same time.
Specifically, it is assumed that in the graph model, each edge and its two connected nodes may be represented by a triplet (h, r, t), where h is the starting node of the relationship, i.e., the head entity, t is the ending node of the relationship, i.e., the tail entity, and r represents the relationship itself. In positive-relationship recommendation, the tail entity t needs to be found from the head entity h and the specified relationship r (there may be more than one correct answer, that is, more than one piece of content may be recommended), which may be represented as (h, r, ?). Inverse-relationship recommendation is exactly the opposite: it finds the head entity h from the tail entity t and the relationship r, which may be represented as (?, r, t). A complete triplet (h, r, t) with one entity removed is typically used as training data.
In order to solve the positive-relationship recommendation problem, an existing solution may be as follows: a recommendation model t = f(h, r) is established, a learnable vector is allocated to each entity and relationship, and the existing triplets in the graph model are used as training data to train the prediction model and the entity and relationship vectors. After the head entity and the relationship are input, the trained model can quickly calculate the vector corresponding to the tail entity, which is then compared with all the entities in the graph model to obtain the several entities with the highest matching degree as candidate tail entities. This step corresponds to recommending several cloud manufacturing services according to the user's needs.
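For illustration only, the following minimal sketch (not the application's specific model) shows such an embedding-based positive-relationship recommendation, assuming a TransE-style predictor f(h, r) = h + r; the entity and relation indices and the embedding dimension are hypothetical.

```python
import torch
import torch.nn as nn

class PositiveRecommender(nn.Module):
    """Minimal embedding-based link predictor: t ≈ f(h, r) = h + r (TransE-style, assumed)."""
    def __init__(self, num_entities: int, num_relations: int, dim: int = 64):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)   # learnable entity vectors
        self.rel = nn.Embedding(num_relations, dim)  # learnable relation vectors

    def predict_tail(self, h_idx: torch.Tensor, r_idx: torch.Tensor) -> torch.Tensor:
        # Predicted tail-entity vector for each (head, relation) pair.
        return self.ent(h_idx) + self.rel(r_idx)

    def top_k_tails(self, h_idx: torch.Tensor, r_idx: torch.Tensor, k: int = 5) -> torch.Tensor:
        # Compare the predicted vector with every entity; smaller distance = better match.
        pred = self.predict_tail(h_idx, r_idx)             # (batch, dim)
        dist = torch.cdist(pred, self.ent.weight)          # (batch, num_entities)
        return torch.topk(dist, k=k, dim=1, largest=False).indices

# Usage: recommend k candidate services (tail entities) for a demand (head entity).
model = PositiveRecommender(num_entities=1000, num_relations=20)
demand = torch.tensor([42])    # hypothetical demand node index
relation = torch.tensor([3])   # hypothetical demand-to-service relation index
candidates = model.top_k_tails(demand, relation, k=5)
```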
For the inverse-relationship recommendation problem, the first existing method proceeds as follows: all entities in the graph except the target entity are traversed, and links to other entities are constructed for each entity using the prediction model t = f(h, r), including links to the target entity. After all entities have been traversed, the inverse-relationship links from the target entity to the other entities can be obtained through statistics, which completes the inverse-relationship recommendation process.
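A brute-force sketch of the traversal just described, for illustration only; the TransE-style predictor, the random vectors, and the distance threshold are assumptions. Every entity in the graph must be checked once, which is the source of the high time complexity criticized below.

```python
import torch

num_entities, dim = 1000, 64
ent = torch.randn(num_entities, dim)   # hypothetical entity vectors
rel = torch.randn(dim)                 # hypothetical positive-relation vector
target = 7                             # the given tail entity, e.g. an existing service

inverse_candidates = []
for h in range(num_entities):          # one prediction per entity in the graph
    if h == target:
        continue
    pred_t = ent[h] + rel              # t = f(h, r), assumed TransE-style
    if torch.norm(pred_t - ent[target]) < 5.0:   # does h link to the target via r?
        inverse_candidates.append(h)   # then the inverse link target -> h holds as well
```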
However, this existing method has the following technical problems in its implementation: 1. The time complexity is high; specifically, the method needs to traverse all entities in the graph and predict links for each of them, which consumes a great deal of computation time and greatly reduces efficiency. 2. The inverse-relationship data are not used in training; specifically, the method only uses positive-relationship recommendation problems (h, r, ?) as training data, while the inverse-relationship problems (?, r, t) are not used, so the accuracy of the trained model in subsequent applications is reduced.
The second method for processing inverse-relationship recommendation imitates the positive-relationship reasoning method and additionally builds a new model dedicated to inverse-relationship reasoning.
Specifically, for positive-relationship recommendation a recommendation model t = f(h, r) is established, and a conjugate inverse-relationship recommendation model h = g(t, r) is additionally constructed with the same structure, which is used for inverse-relationship recommendation, namely obtaining the head entity by means of the tail entity and the relationship.
During the training process, the positive-relationship recommendation model is trained with data of the (h, r, ?) problems, and the (?, r, t) problems are used as training data for the inverse-relationship model, so that a separate model is trained with each of the positive and inverse relationship data.
However, the second existing method has the following technical problems in its implementation: 1. Additional modeling is required; specifically, an extra model h = g(t, r) is constructed, which occupies additional storage space and needs additional training, reducing the overall training efficiency of the recommendation model. 2. The connection between the positive-relationship reasoning model and the inverse-relationship reasoning model is missing; specifically, the two models should in essence have a certain dual relationship, but in practice there is no direct connection between them apart from having the same model structure, so the accuracy of the trained models in subsequent applications is reduced.
For the inverse-relationship recommendation problem, the third existing method regards the inverse relationship as a brand-new positive relationship and processes it in the same way as a positive relationship.
In particular, assuming that the positive relationship conjugate to the inverse relationship is expressed as r, the inverse relationship can be expressed as r⁻¹, and the third existing method regards r⁻¹ as a completely new positive relationship. The direction of the original inverse relationship points from the tail entity to the head entity, while the new positive relationship r⁻¹ regards the original tail entity as the head entity and the original head entity as the tail entity; that is, new triplets (t, r⁻¹, h) are constructed, which converts the inverse-relationship recommendation problem into a positive-relationship recommendation problem.
During training, all the newly constructed relationships and the corresponding triplets can be added to the training data. When inverse-relationship reasoning is performed, h = f(t, r⁻¹) can be calculated by means of the original model and matched against each entity in the graph model, so that the several entities with the highest matching degree are obtained as candidates. This step corresponds to locating potential customers from an existing cloud manufacturing service.
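For illustration only, a minimal sketch of this third prior-art approach: every inverse relation is materialized as a brand-new relation id and every triplet is duplicated in reversed form, which is why the relation vocabulary and the stored triplets both double. The ids and data below are hypothetical.

```python
# Hypothetical triplets (head, relation, tail); ids are arbitrary.
triplets = [(0, 1, 5), (2, 1, 7), (3, 4, 9)]
num_relations = 5

# Prior-art method three: give each inverse relation a new id (r + num_relations)
# and add a reversed triplet (t, r_inv, h) for every original triplet.
inverse_triplets = [(t, r + num_relations, h) for (h, r, t) in triplets]
training_triplets = triplets + inverse_triplets

print(len(training_triplets))   # 6: the stored triplets have doubled
print(2 * num_relations)        # 10: the relation vocabulary has doubled as well
```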
However, the third existing method has the following technical problems in its implementation: 1. The connection between the positive relationship and the inverse relationship is missing; specifically, the inverse relationship is essentially the same edge in the graph as its originally conjugate positive relationship, but the new positive relationship constructed from the inverse relationship has no direct connection to the original conjugate positive relationship, so some information may be ignored. 2. Additional space is required to store the inverse relationships; specifically, a new positive relationship is constructed for every inverse relationship, which doubles the total number of relationships, doubles the number of edges to be stored in the graph model, and doubles the number of triplets in the training data, so extra storage space is required and more memory is needed during training.
Aiming at the problem that the inverse relationship is difficult to handle when recommending potential users for a specific cloud manufacturing service, the application provides a graph model-based manufacturing cloud service recommendation model training method. Specifically, the recommendation model obtained by the training method provided by the application links the positive and inverse relationships together and expresses the inverse-relationship vector as a function of the positive-relationship vector, which reduces the storage space required by the model, makes effective use of both positive and inverse relationship training data during training, and improves the recommendation effect for the inverse relationship.
Fig. 1 is an application scenario diagram provided by the present application. As shown in fig. 1, the technical solution provided by the embodiment of the present application may involve the following stages:
1. Model training stage: in the cloud manufacturing service recommendation system the inverse relation and the positive relation are conjugate and are essentially the same edge in the graph model, so the connection between the two must be fully considered when training the recommendation model. The positive-relation model training data and the inverse-relation model training data are constructed based on this connection and the sample data, and the constructed recommendation model is then trained, which improves model training efficiency and gives the trained recommendation model both a positive-relation recommendation function and an inverse-relation recommendation function.
2. Positive-relation recommendation stage: the information to be processed that is input into the recommendation model is a user demand; positive-relation recommendation processing is performed on the user demand to obtain the target service or solution output by the recommendation model.
3. Inverse-relation recommendation stage: the information to be processed that is input into the recommendation model is a service to be recommended; inverse-relation recommendation processing is performed on the service to be recommended to obtain the target user output by the recommendation model.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 2 is a schematic flow chart of a method for training a manufacturing cloud service recommendation model based on a graph model according to an embodiment of the present application. The method may be performed by a graph model-based manufacturing cloud service recommendation model training apparatus, which may be a server or an electronic device, and is described below by taking the electronic device as an example, where the method in the embodiment may be implemented by software, hardware, or a combination of software and hardware, as shown in fig. 2, and the method includes the following steps.
S201, acquiring sample data; the sample data includes positive recommendation sample data and inverse recommendation sample data for training the manufacturing cloud service recommendation system, so that the trained manufacturing cloud service recommendation system supports both positive relationship recommendation and inverse relationship recommendation.
In the embodiment of the application, data such as the historical behaviors of a plurality of users and the specific information of various demands and solutions are obtained and constructed into a directed graph, so that sample data for training the manufacturing cloud service recommendation system are generated.
The manufacturing cloud service recommendation system after training is completed supports positive relationship recommendation and inverse relationship recommendation, so that positive recommendation sample data and inverse recommendation sample data are required to be respectively constructed during training, and model training is carried out according to the positive recommendation sample data and the inverse recommendation sample data.
Optionally, the specific process of generating the positive recommendation sample data and the inverse recommendation sample data in the present application may include: acquiring a pre-constructed graph model, and generating multiple groups of triplet data based on two adjacent nodes in the graph model and the edge between the two adjacent nodes; and for any group of triplet data, acquiring the head entity and the positive relation in the triplet data as the positive recommendation sample data, and acquiring the tail entity and the positive relation in the triplet data as the inverse recommendation sample data.
The pre-built graph model may include, but is not limited to, the pre-built directed graph described above. Specifically, in the directed graph, data such as historical behaviors, various demands, and the specific information of solutions are taken as nodes in the graph, and the correlations among the nodes are taken as edges. Each edge and its two connected nodes may be represented by a triplet (h, r, t), which stands for (head entity, relationship, tail entity). Further, for any triplet, one of the entities may be removed from the complete triplet (h, r, t) and the remainder used as sample data.
Fig. 3 is a schematic flow chart of generating training data for a recommendation model according to an embodiment of the present application. Referring to fig. 3 for an example, the tail entity in a complete triplet may be removed, and the remaining head entity and positive relation (h, r, ?) are used as the positive recommendation sample data; the head entity in the complete triplet may also be removed, and the remaining tail entity and positive relation (?, r, t) are used as the inverse recommendation sample data, so that the inverse-relationship recommendation function of the recommendation model can subsequently be trained.
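As a minimal illustration of the sample construction described above (not the application's exact data pipeline; the triplet ids are hypothetical), positive samples keep (h, r) with t as the label, and inverse samples keep (t, r) with h as the label:

```python
from typing import List, Tuple

Triplet = Tuple[int, int, int]  # (head entity id, positive relation id, tail entity id)

def build_samples(triplets: List[Triplet]):
    """Split each complete triplet into a positive sample and an inverse sample."""
    positive_samples = []  # ((h, r), label=t): predict the tail from head + positive relation
    inverse_samples = []   # ((t, r), label=h): predict the head from tail + (to-be-mapped) relation
    for h, r, t in triplets:
        positive_samples.append(((h, r), t))
        inverse_samples.append(((t, r), h))
    return positive_samples, inverse_samples

# Hypothetical graph triplets, e.g. (demand id, relation id, service id).
triplets = [(0, 1, 5), (2, 1, 7), (3, 4, 9)]
pos, inv = build_samples(triplets)
print(pos[0])  # ((0, 1), 5)
print(inv[0])  # ((5, 1), 0)
```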
S202, obtaining an inverse relation conjugated with a positive relation in a manufacturing cloud service recommendation system, and generating a mapping relation between the positive relation and the inverse relation.
In practical applications, the positive relationship and the inverse relationship in the manufacturing cloud service recommendation system are conjugate, that is, they correspond to the same edge of the directed graph but point in opposite directions, so the connection between the positive relationship and the inverse relationship must be fully considered when the recommendation model is trained, thereby improving the accuracy of the subsequent recommendation processing of the recommendation model.
With continued reference to fig. 3, the relationship r in the triplet is determined as the positive relationship, and the inverse relationship conjugate to the positive relationship, i.e., r⁻¹, is then obtained. Further, a mapping relation between the positive relation and the inverse relation is constructed based on the conjugate relation between the two, so that data conversion can conveniently be carried out on the inverse relation in the sample data based on the mapping relation; the two recommendation functions are thus learned by a single model, which improves model training efficiency.
Specifically, the mapping relationship between the constructed positive relationship and inverse relationship can be expressed as:
r⁻¹ - d(r) = 0;
wherein d(r) represents the mapping function, r⁻¹ represents the inverse relationship, and r represents the positive relationship. The above expression describes the mapping between the inverse relationship and the positive relationship in the form of an explicit function, i.e., r⁻¹ = d(r).
In practical applications, the mapping function maps the positive-relationship vector; that is, the mapping function is a learnable function with a relatively small number of parameters, which reduces the storage space occupied in the recommendation model and thereby improves processing efficiency.
Optionally, in the embodiment of the present application, a mapping relationship between the inverse relationship and the positive relationship may be expressed by using multiple mapping manners, and correspondingly, different specific mapping functions are generated according to different mapping manners.
Alternatively, the mapping manner may include linear model mapping; correspondingly, the mapping function includes:
d(r)=Wr+b
wherein d (r) represents an inverse relationship; r represents a positive relationship; w represents the mapping weight; b represents the bias of the linear layer. Optionally, the mapping manner may further include multi-layer perceptron mapping; correspondingly, the mapping function includes:
d(r)=MLP(r);
Wherein d (r) represents an inverse relationship; r represents a positive relationship; MLP () represents a model structure.
Optionally, the mapping manner may further include convolution model mapping; correspondingly, the mapping function includes:
d(r)=Conv(r);
wherein d (r) represents an inverse relationship; r represents a positive relationship; conv () represents the model structure.
Optionally, the mapping manner may further include attention model mapping; correspondingly, the mapping function includes:
d(r)=α(r)⊙r;
wherein d(r) represents the inverse relationship; r represents the positive relationship; α(r) represents the attention weight; ⊙ represents the Hadamard product.
It should be noted that the attention model extracts the more important parts of its input: α(r) calculates the importance of each part from the input relation vector and reflects that importance on the relation vector in the form of a product. The Hadamard product requires the multiplied vectors or matrices to have the same dimensions; it multiplies the elements at the same positions of the left and right operands and places each result at the same position of the output, so the dimension of the output vector is identical to that of the input.
It should be further noted that, in the embodiment of the present application, the mapping function defines the inverse relation r⁻¹ as a relation in its own right, that is, a relation from the tail entity t to the head entity h, and establishes a direct connection between the positive and inverse relations, which means that the recommendation model trained on this basis supports the two recommendation functions at the same time and improves the accuracy of the recommendation result. On this basis, the mapping function expresses the inverse relation in the mapped form r⁻¹ = d(r) of the positive relation, which means that the inverse relation can be expressed through the positive relation, reducing the storage space of the model.
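For illustration only, the following sketch shows the four optional mapping modes described above (linear, multi-layer perceptron, convolution, and attention with a Hadamard product) as small PyTorch modules; the layer sizes and the use of a 1-D convolution are assumptions, not details fixed by the application.

```python
import torch
import torch.nn as nn

class LinearMapping(nn.Module):
    """d(r) = W r + b"""
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
    def forward(self, r: torch.Tensor) -> torch.Tensor:
        return self.linear(r)

class MLPMapping(nn.Module):
    """d(r) = MLP(r)"""
    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim))
    def forward(self, r: torch.Tensor) -> torch.Tensor:
        return self.mlp(r)

class ConvMapping(nn.Module):
    """d(r) = Conv(r); a 1-D convolution over the relation vector is assumed here."""
    def __init__(self, dim: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2)
    def forward(self, r: torch.Tensor) -> torch.Tensor:
        # (batch, dim) -> (batch, 1, dim) -> (batch, dim)
        return self.conv(r.unsqueeze(1)).squeeze(1)

class AttentionMapping(nn.Module):
    """d(r) = α(r) ⊙ r, where α(r) keeps the same dimension as r."""
    def __init__(self, dim: int):
        super().__init__()
        self.alpha = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())
    def forward(self, r: torch.Tensor) -> torch.Tensor:
        return self.alpha(r) * r    # element-wise (Hadamard) product

r = torch.randn(4, 64)              # a batch of positive-relation vectors
inverse_r = AttentionMapping(64)(r) # mapped inverse-relation vectors, same shape as r
```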
S203, carrying out data mapping processing on the inverse recommendation sample data according to the mapping relation, and generating inverse recommendation model training data corresponding to the inverse recommendation sample data according to the sample data after the mapping processing and a recommendation model to be trained.
In order to reduce the number of model parameters and improve model reuse, the same recommendation model can be used in practice to train both positive-relationship recommendation and inverse-relationship recommendation, i.e., the two share one set of learnable parameters. In order for the training data of the two relationships to be fed into the same recommendation model, the inverse recommendation sample data need to be processed in advance so that the processed data have the same form as the positive recommendation sample data; the two kinds of data can then be input into the recommendation model to be trained at the same time.
In particular, since the positive recommendation sample data may be expressed as (h, r, ?) and the inverse recommendation sample data may be expressed as (t, r⁻¹, ?), the inverse relationship in the inverse recommendation sample data may be mapped to the positive relationship; that is, according to r⁻¹ = d(r), the sample (t, r⁻¹, ?) is mapped, and the resulting mapped data is (t, d(r), ?).
On this basis, corresponding training labels are set respectively for the positive recommendation sample data and the inverse recommendation sample data, the corresponding training data are then generated from the sample data and their training labels and input into the recommendation model to be trained, and the trained manufacturing cloud service recommendation model is obtained through training.
With continued reference to fig. 3, the recommendation model to be trained may be denoted f(); correspondingly, the inverse recommendation model training data obtained in the application include:
h=f(t,d(r));
wherein f() represents the model structure of the recommendation model; h represents the head entity, namely the training label in the inverse recommendation training process; t represents the tail entity, namely the input entity in the inverse recommendation training process; d(r) represents the inverse relationship, i.e., the input relationship value in the inverse recommendation training process.
It should be noted that, the above data mapping process makes it unnecessary to additionally build a new model and to additionally add too many new parameters when the recommendation model performs the inverse relation recommendation training. On the basis, when the model is trained subsequently, training data does not need to be additionally increased according to the inverse relation, so that the storage space required by the model during training is effectively reduced, and the training speed is improved.
S204, generating positive recommendation model training data corresponding to the positive recommendation sample data according to the recommendation model and the positive recommendation sample data.
In the embodiment of the present application, the obtained recommendation model to be trained may be f (); with corresponding continued reference to fig. 3, the positive recommendation model training data obtained in the present application includes:
t=f(h,r);
wherein f() represents the model structure of the recommendation model; t represents the tail entity, namely the training label in the positive recommendation training process; h represents the head entity, namely the input entity in the positive recommendation training process; r represents the positive relationship, i.e., the input relationship value in the positive recommendation training process.
And S205, training the recommendation model based on the positive recommendation model training data and the inverse recommendation model training data to obtain a trained manufacturing cloud service recommendation system.
In the embodiment of the application, after the positive recommendation model training data and the inverse recommendation model training data are obtained as described above, they are input into the recommendation model for model training, and training is stopped when the training stopping condition is met, so that the trained manufacturing cloud service recommendation model is obtained.
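For illustration only, the following joint training sketch combines the pieces above: a single shared model, a linear mapping d(r) = Wr + b for the inverse relation, positive samples trained as t = f(h, r), and inverse samples trained as h = f(t, d(r)). The TransE-style score, the simple distance loss without negative sampling, and all hyperparameters are assumptions, not details fixed by the application.

```python
import torch
import torch.nn as nn

class JointRecommender(nn.Module):
    """One shared model for both recommendation directions; the inverse relation is d(r)."""
    def __init__(self, num_entities: int, num_relations: int, dim: int = 64):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.map_d = nn.Linear(dim, dim)   # d(r) = W r + b, the linear mapping mode

    def score_positive(self, h, r, t):
        # t = f(h, r): distance between (h + r) and t, assumed TransE-style.
        return (self.ent(h) + self.rel(r) - self.ent(t)).norm(dim=-1)

    def score_inverse(self, t, r, h):
        # h = f(t, d(r)): the same f is reused, only the relation vector is mapped.
        return (self.ent(t) + self.map_d(self.rel(r)) - self.ent(h)).norm(dim=-1)

# Hypothetical triplets (h, r, t).
triplets = torch.tensor([[0, 1, 5], [2, 1, 7], [3, 4, 9]])
h, r, t = triplets[:, 0], triplets[:, 1], triplets[:, 2]

model = JointRecommender(num_entities=1000, num_relations=20)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(100):
    optimizer.zero_grad()
    loss_pos = model.score_positive(h, r, t).mean()   # positive branch, label is t
    loss_inv = model.score_inverse(t, r, h).mean()    # inverse branch, label is h
    loss = loss_pos + loss_inv                        # both branches update the same parameters
    loss.backward()
    optimizer.step()

# Inference with the single trained model.
with torch.no_grad():
    # Positive recommendation: rank candidate services (tail entities) for a demand.
    pred_t = model.ent(h[:1]) + model.rel(r[:1])
    services = torch.cdist(pred_t, model.ent.weight).argsort(dim=1)[:, :5]
    # Inverse recommendation: rank candidate users/demands (head entities) for a service.
    pred_h = model.ent(t[:1]) + model.map_d(model.rel(r[:1]))
    users = torch.cdist(pred_h, model.ent.weight).argsort(dim=1)[:, :5]
```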
In practical applications, the recommendation model may be, without limitation, any one of a rule-based model, a CNN (Convolutional Neural Network) model, a GCN (Graph Convolutional Neural Network) model, or a Transformer model.
Optionally, if the recommendation model is a rule-based model, in the subsequent recommendation process the model may assume that a simple rule holds between the representation vectors of the head entity, the tail entity and the relationship, for example "head entity vector + relationship vector = tail entity vector", make the vector of each entity and relationship conform to the rule as far as possible through machine learning, and perform the recommendation processing by means of the rule.
Optionally, if the recommendation model is a CNN model, in a subsequent recommendation process, the model may use a convolutional neural network CNN to perform convolutional calculation on the entity and the relationship, extract the features, and determine the recommendation result based on the extracted features.
Optionally, if the recommendation model is a GCN model, in a subsequent recommendation process, the model may extract the features of the graph by aggregation and propagation, and the model may form a representation vector for each node or edge, and perform recommendation calculation through the graph neural network, to obtain a recommendation result.
Optionally, if the recommendation model is a Transformer model, in the recommendation process the model can accurately capture the interactions between entities and relationships, and the attention layers are then used for the recommendation calculation to obtain the recommendation result.
In practical applications, positive-relationship recommendation and inverse-relationship recommendation share the model parameters, and the positive and inverse relationships are directly connected through the mapping, so inverse-relationship recommendation can be handled directly by leveraging the model's good recommendation performance on the positive relationship; this improves reasoning accuracy, and potential customers can be found effectively for a given cloud manufacturing service.
In the above technical scheme, when the recommendation model is trained, positive recommendation sample data and inverse recommendation sample data are obtained respectively, a mapping relation between the positive relation in the cloud service recommendation system and the inverse relation conjugate to it is constructed, and the inverse recommendation sample data are then processed according to the mapping relation, so that the processed data and the positive recommendation sample data can be used together to train the recommendation model. A single recommendation model is thus trained with the sample data, excessive extra parameters are not required, the storage space occupied by the model is reduced, and the training efficiency of the recommendation model is improved. Furthermore, model training data corresponding to the positive relation and the inverse relation are formed respectively based on the pre-constructed model framework and the sample data, so that the trained manufacturing cloud service recommendation system supports positive-relationship recommendation and inverse-relationship recommendation at the same time; inverse-relationship recommendation is thus handled on the basis of the good processing effect that a single recommendation model framework already achieves for positive-relationship recommendation, which improves the accuracy and universality of the model recommendation.
Fig. 4 is a schematic flow chart of a manufacturing cloud service recommendation method based on a graph model according to an embodiment of the present application. The method may be performed by a graph model-based manufacturing cloud service recommendation device, which may be a server or an electronic device, and the electronic device is taken as an example to describe the method in this embodiment, and the method may be implemented by software, hardware, or a combination of software and hardware, as shown in fig. 4, and includes the following steps.
S401, obtaining information to be processed; the information to be processed comprises a user demand or a service to be recommended.
S402, inputting the user demand or the service to be recommended into a pre-trained manufacturing cloud service recommendation system, so that the recommendation system outputs a target recommendation service corresponding to the user demand when the information to be processed is the user demand, or outputs a target user corresponding to the service to be recommended when the information to be processed is the service to be recommended.
In the embodiment of the application, in order that the trained manufacturing cloud service recommendation system can both recommend specific services for a user requirement and recommend target users for a service to be recommended, positive recommendation sample data and inverse recommendation sample data may be acquired separately when the recommendation model is trained, a mapping relationship may be constructed between a positive relationship in the manufacturing cloud service recommendation system and the inverse relationship conjugated with it, and the inverse recommendation sample data may be processed according to the mapping relationship, so that the processed data and the positive recommendation sample data jointly train the recommendation model. A single recommendation model is thus trained with the sample data, no excessive additional parameters are required, the storage space occupied by the model is reduced, and the training efficiency of the recommendation model is improved. Further, model training data corresponding to the positive relationship and the inverse relationship are formed from the pre-constructed model framework and the sample data respectively, so that inverse-relationship recommendation is handled on the basis of the good processing effect that a single recommendation model framework achieves for positive-relationship recommendation, thereby improving the accuracy and universality of the model's recommendations.
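For illustration only, the following Python sketch shows how steps S401 and S402 might be wired together; the DummyRecommender class and the recommend_services / recommend_users method names are hypothetical placeholders for the forward and inverse inference paths of the trained system.

```python
class DummyRecommender:
    """Hypothetical stand-in for the trained manufacturing cloud service recommendation system."""
    def recommend_services(self, user_requirement):
        return ["cnc_milling_service"]          # placeholder forward-recommendation result
    def recommend_users(self, service):
        return ["user_A"]                       # placeholder inverse-recommendation result

def recommend(system, info):
    # S401: `info` is the acquired information to be processed.
    # S402: dispatch to the forward or inverse path of the same trained system.
    if info["type"] == "user_requirement":
        return system.recommend_services(info["payload"])
    if info["type"] == "service_to_recommend":
        return system.recommend_users(info["payload"])
    raise ValueError(f"unsupported information type: {info['type']}")

print(recommend(DummyRecommender(), {"type": "user_requirement", "payload": "need 5-axis milling"}))
```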
Fig. 5 is a schematic structural diagram of a manufacturing cloud service recommendation model training device based on a graph model according to an embodiment of the present application. Referring to fig. 5, the training device includes: a sample data acquisition module 51, a mapping relation generation module 52, a first training data generation module 53, a second training data generation module 54, and a training module 55; wherein,
The sample data acquisition module is used for acquiring sample data; the sample data comprise forward recommendation sample data and reverse recommendation sample data for training the manufacturing cloud service recommendation system, so that the trained manufacturing cloud service recommendation system supports forward relationship recommendation and reverse relationship recommendation simultaneously;
the mapping relation generation module is used for acquiring the inverse relation conjugated with the positive relation in the manufacturing cloud service recommendation system and generating a mapping relation between the positive relation and the inverse relation;
The first training data generation module is used for carrying out data mapping processing on the inverse recommendation sample data according to the mapping relation, and generating inverse recommendation model training data corresponding to the inverse recommendation sample data according to the sample data subjected to the mapping processing and a recommendation model to be trained;
The second training data generation module is used for generating positive recommendation model training data corresponding to the positive recommendation sample data according to the recommendation model and the positive recommendation sample data;
And the training module is used for training the recommendation model based on the positive recommendation model training data and the inverse recommendation model training data to obtain a manufacturing cloud service recommendation system after training.
Optionally, the sample data acquisition module is specifically configured to:
Acquiring a pre-constructed graph model, and generating multiple groups of triple data based on two adjacent nodes in the graph model and edges between the two adjacent nodes;
And for any group of triple data, acquiring a head entity and a positive relation in the triple data as the positive recommendation sample data, and acquiring a tail entity and a positive relation in the triple data as the inverse recommendation sample data.
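As an illustrative sketch (the tiny example graph, entity names, and relation name are assumptions), each (head, relation, tail) triple extracted from the graph model yields one positive sample and one inverse sample:

```python
# Two illustrative triples from a manufacturing cloud service graph.
triples = [
    ("user_A", "requires", "cnc_milling_service"),
    ("user_B", "requires", "sheet_metal_service"),
]

# Positive sample: input (head entity, positive relation), label = tail entity.
forward_samples = [((h, r), t) for h, r, t in triples]
# Inverse sample: input (tail entity, positive relation), label = head entity.
inverse_samples = [((t, r), h) for h, r, t in triples]

print(forward_samples[0])   # (('user_A', 'requires'), 'cnc_milling_service')
print(inverse_samples[0])   # (('cnc_milling_service', 'requires'), 'user_A')
```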
Optionally, the mapping relationship includes:
r⁻¹ = d(r);
wherein r⁻¹ represents the inverse relationship; r represents the positive relationship; and d(r) represents the mapping function between the positive relationship and the inverse relationship.
Optionally, the mapping mode includes linear model mapping; correspondingly, the mapping function includes:
d(r) = Wr + b;
wherein d(r) represents the inverse relationship; r represents the positive relationship; W represents the mapping weight; and b represents the bias of the linear layer;
the mapping mode includes multi-layer perceptron mapping; correspondingly, the mapping function includes:
d(r) = MLP(r);
wherein d(r) represents the inverse relationship; r represents the positive relationship; and MLP() represents the multi-layer perceptron model structure;
the mapping mode includes convolution model mapping; correspondingly, the mapping function includes:
d(r) = Conv(r);
wherein d(r) represents the inverse relationship; r represents the positive relationship; and Conv() represents the convolution model structure;
the mapping mode includes attention model mapping; correspondingly, the mapping function includes:
d(r) = α(r) ⊙ r;
wherein d(r) represents the inverse relationship; r represents the positive relationship; α(r) represents the attention weight; and ⊙ represents the Hadamard product.
Optionally, the inverse recommendation model training data includes:
h = f(t, d(r));
wherein f() represents the model structure of the recommendation model; h represents the head entity, namely the training label in the inverse recommendation training process; t represents the tail entity, namely the input entity in the inverse recommendation training process; and d(r) represents the inverse relationship, namely the input relationship value in the inverse recommendation training process;
the positive recommendation model training data includes:
t = f(h, r);
wherein f() represents the model structure of the recommendation model; t represents the tail entity, namely the training label in the positive recommendation training process; h represents the head entity, namely the input entity in the positive recommendation training process; and r represents the positive relationship, namely the input relationship value in the positive recommendation training process.
Fig. 6 is a schematic structural diagram of a manufacturing cloud service recommendation device based on a graph model according to an embodiment of the present application. Referring to fig. 6, the recommending apparatus 60 includes: an information acquisition module 61 and a recommendation processing module 62; wherein,
An information acquisition module 61 for acquiring information to be processed; the information to be processed comprises user requirements or services to be recommended;
The recommendation processing module 62 is configured to input the user requirement or the service to be recommended into a pre-trained manufacturing cloud service recommendation system, so that the recommendation system outputs a target recommendation service corresponding to the user requirement when the information to be processed is the user requirement; or outputting a target user corresponding to the service to be recommended when the information to be processed is the service to be recommended; the manufacturing cloud service recommendation system is obtained by training the manufacturing cloud service recommendation model training method based on the graph model according to any one of the embodiments.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, the electronic device of the present embodiment may include:
at least one processor 702; and
A memory 701 communicatively coupled to the at least one processor;
Wherein the memory 701 stores instructions executable by the at least one processor 702, the instructions being executed by the at least one processor 702 to cause the electronic device to perform the method of any one of the embodiments described above.
Alternatively, the memory 701 may be separate from, or integrated with, the processor 702.
The implementation principle and technical effects of the electronic device provided in this embodiment may be referred to the foregoing embodiments, and will not be described herein again.
The embodiment of the application also provides a computer-readable storage medium, wherein computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the method of any one of the foregoing embodiments is implemented.
Embodiments of the present application also provide a computer program product comprising a computer program which, when executed by a processor, implements the method of any of the preceding embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of modules is merely a logical function division, and there may be additional divisions of actual implementation, e.g., multiple modules may be combined or integrated into another system, or some features may be omitted or not performed.
The integrated modules, which are implemented in the form of software functional modules, may be stored in a computer readable storage medium. The software functional modules described above are stored in a storage medium and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or processor to perform some of the steps of the methods of the various embodiments of the application.
It should be appreciated that the processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the present application may be embodied directly in a hardware processor for execution, or executed by a combination of hardware and software modules in a processor. The memory may comprise a high-speed RAM memory and may further comprise a non-volatile memory (NVM), such as at least one magnetic disk memory, and may also be a U-disk, a removable hard disk, a read-only memory, a magnetic disk, an optical disk, or the like.
The storage medium may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an Application Specific Integrated Circuit (ASIC). Alternatively, the processor and the storage medium may reside as discrete components in a server or master device.
Fig. 8 is a block diagram of an electronic device according to an embodiment of the present application. Referring to fig. 8, device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the device 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 806 provides power to the various components of the device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor component 814 includes one or more sensors for providing status assessments of various aspects of the device 800. For example, the sensor component 814 may detect an on/off state of the device 800 and the relative positioning of components, such as the display and keypad of the device 800; it may also detect a change in position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in temperature of the device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the device 800 and other devices. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 804 including instructions executable by the processor 820 of the device 800 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
When the instructions in the non-transitory computer-readable storage medium are executed by a processor of an electronic device, the electronic device is caused to perform the method described above.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.
Claims (7)
1. A graph model-based manufacturing cloud service recommendation model training method, the method comprising:
Acquiring sample data; the sample data comprise forward recommendation sample data and reverse recommendation sample data for training the manufacturing cloud service recommendation system, so that the trained manufacturing cloud service recommendation system supports forward relationship recommendation and reverse relationship recommendation simultaneously; the acquiring sample data includes: acquiring a pre-constructed graph model, and generating multiple groups of triple data based on two adjacent nodes in the graph model and edges between the two adjacent nodes; for any group of triple data, acquiring a head entity and a positive relation in the triple data as the positive recommendation sample data, and acquiring a tail entity and a positive relation in the triple data as the inverse recommendation sample data; acquiring an inverse relation conjugated with a positive relation in a manufacturing cloud service recommendation system, and generating a mapping relation between the positive relation and the inverse relation; the mapping relation includes: r⁻¹ = d(r); wherein r⁻¹ represents an inverse relationship; r represents a positive relationship; and d(r) represents a mapping function between the positive relationship and the inverse relationship;
Performing data mapping processing on the inverse recommendation sample data according to the mapping relation, and generating inverse recommendation model training data corresponding to the inverse recommendation sample data according to the mapped sample data and a recommendation model to be trained; the inverse recommendation model training data comprises: h = f(t, d(r)); wherein f() represents the model structure of the recommendation model; h represents a head entity, namely the training label in the inverse recommendation training process; t represents a tail entity, namely the input entity in the inverse recommendation training process; and d(r) represents the inverse relationship, namely the input relationship value in the inverse recommendation training process;
Generating positive recommendation model training data corresponding to the positive recommendation sample data according to the recommendation model and the positive recommendation sample data; the positive recommendation model training data comprises: t = f(h, r); wherein f() represents the model structure of the recommendation model; t represents a tail entity, namely the training label in the positive recommendation training process; h represents a head entity, namely the input entity in the positive recommendation training process; and r represents the positive relationship, namely the input relationship value in the positive recommendation training process;
and training the recommendation model based on the forward recommendation model training data and the reverse recommendation model training data to obtain a trained manufacturing cloud service recommendation system.
2. The graph model-based manufacturing cloud service recommendation model training method according to claim 1, wherein the mapping mode comprises linear model mapping; correspondingly, the mapping function includes:
d(r) = Wr + b;
wherein d(r) represents the inverse relationship; r represents the positive relationship; W represents the mapping weight; and b represents the bias of the linear layer;
the mapping mode comprises multi-layer perceptron mapping; correspondingly, the mapping function includes:
d(r) = MLP(r);
wherein d(r) represents the inverse relationship; r represents the positive relationship; and MLP() represents the multi-layer perceptron model structure;
the mapping mode comprises convolution model mapping; correspondingly, the mapping function includes:
d(r) = Conv(r);
wherein d(r) represents the inverse relationship; r represents the positive relationship; and Conv() represents the convolution model structure;
the mapping mode comprises attention model mapping; correspondingly, the mapping function includes:
d(r) = α(r) ⊙ r;
wherein d(r) represents the inverse relationship; r represents the positive relationship; α(r) represents the attention weight; and ⊙ represents the Hadamard product.
3. A graph model-based manufacturing cloud service recommendation method, the method comprising:
Acquiring information to be processed; the information to be processed comprises user requirements or services to be recommended;
Inputting the user demand or the service to be recommended into a pre-trained manufacturing cloud service recommendation system, so that the recommendation system outputs a target recommendation service corresponding to the user demand when the information to be processed is the user demand; or outputting a target user corresponding to the service to be recommended when the information to be processed is the service to be recommended; the manufacturing cloud service recommendation system is trained by the graph model-based manufacturing cloud service recommendation model training method according to claim 1 or 2.
4. A manufacturing cloud service recommendation model training apparatus based on a graph model, the apparatus comprising:
The sample data acquisition module is used for acquiring sample data; the sample data comprise forward recommendation sample data and reverse recommendation sample data for training the manufacturing cloud service recommendation system, so that the trained manufacturing cloud service recommendation system supports forward relationship recommendation and reverse relationship recommendation simultaneously; the sample data acquisition module is specifically configured to: acquiring a pre-constructed graph model, and generating multiple groups of triple data based on two adjacent nodes in the graph model and edges between the two adjacent nodes; for any group of triple data, acquiring a head entity and a positive relation in the triple data as the positive recommendation sample data, and acquiring a tail entity and a positive relation in the triple data as the inverse recommendation sample data;
the mapping relation generation module is used for acquiring an inverse relation conjugated with the positive relation in the manufacturing cloud service recommendation system and generating a mapping relation between the positive relation and the inverse relation; the mapping relation includes: r⁻¹ = d(r); wherein r⁻¹ represents an inverse relationship; r represents a positive relationship; and d(r) represents a mapping function between the positive relationship and the inverse relationship;
The first training data generation module is used for carrying out data mapping processing on the inverse recommendation sample data according to the mapping relation, and generating inverse recommendation model training data corresponding to the inverse recommendation sample data according to the sample data subjected to the mapping processing and a recommendation model to be trained; the inverse recommendation model training data comprises: h = f(t, d(r)); wherein f() represents the model structure of the recommendation model; h represents a head entity, namely the training label in the inverse recommendation training process; t represents a tail entity, namely the input entity in the inverse recommendation training process; and d(r) represents the inverse relationship, namely the input relationship value in the inverse recommendation training process;
The second training data generation module is used for generating positive recommendation model training data corresponding to the positive recommendation sample data according to the recommendation model and the positive recommendation sample data; the positive recommendation model training data comprises: t = f(h, r); wherein f() represents the model structure of the recommendation model; t represents a tail entity, namely the training label in the positive recommendation training process; h represents a head entity, namely the input entity in the positive recommendation training process; and r represents the positive relationship, namely the input relationship value in the positive recommendation training process;
And the training module is used for training the recommendation model based on the positive recommendation model training data and the inverse recommendation model training data to obtain a manufacturing cloud service recommendation system after training.
5. A manufacturing cloud service recommendation device based on a graph model, the device comprising:
the information acquisition module is used for acquiring information to be processed; the information to be processed comprises user requirements or services to be recommended;
the recommendation processing module is used for inputting the user demands or the to-be-recommended services into a pre-trained manufacturing cloud service recommendation system, so that the recommendation system outputs target recommendation services corresponding to the user demands when the to-be-processed information is the user demands; or outputting a target user corresponding to the service to be recommended when the information to be processed is the service to be recommended; the manufacturing cloud service recommendation system is trained by the graph model-based manufacturing cloud service recommendation model training method according to claim 1 or 2.
6. An electronic device, comprising: a processor and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
The processor, when executing the computer-executable instructions, is configured to implement the graph model-based manufacturing cloud service recommendation model training method of claim 1 or 2, or the graph model-based manufacturing cloud service recommendation method of claim 3.
7. A computer readable storage medium, wherein computer executable instructions are stored in the computer readable storage medium, the computer executable instructions when executed by a processor are configured to implement the graph model-based manufacturing cloud service recommendation model training method of claim 1 or 2, or the graph model-based manufacturing cloud service recommendation method of claim 3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410200406.8A CN118133010B (en) | 2024-02-23 | 2024-02-23 | Graph model-based manufacturing cloud service recommendation model training method and recommendation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410200406.8A CN118133010B (en) | 2024-02-23 | 2024-02-23 | Graph model-based manufacturing cloud service recommendation model training method and recommendation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118133010A CN118133010A (en) | 2024-06-04 |
CN118133010B true CN118133010B (en) | 2024-10-08 |
Family
ID=91243607
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410200406.8A Active CN118133010B (en) | 2024-02-23 | 2024-02-23 | Graph model-based manufacturing cloud service recommendation model training method and recommendation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118133010B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113190725A (en) * | 2021-03-31 | 2021-07-30 | 北京达佳互联信息技术有限公司 | Object recommendation and model training method and device, equipment, medium and product |
CN117349513A (en) * | 2023-09-07 | 2024-01-05 | 杭州阿里云飞天信息技术有限公司 | Recommended model training method |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111274472A (en) * | 2018-12-04 | 2020-06-12 | 北京嘀嘀无限科技发展有限公司 | Information recommendation method and device, server and readable storage medium |
CN114462502B (en) * | 2022-01-06 | 2024-07-12 | 支付宝(杭州)信息技术有限公司 | Nuclear body recommendation model training method and device |
KR102612986B1 (en) * | 2022-10-19 | 2023-12-12 | 한국과학기술원 | Online recomending system, method and apparatus for updating recommender based on meta-leaining |
Also Published As
Publication number | Publication date |
---|---|
CN118133010A (en) | 2024-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106651955B (en) | Method and device for positioning target object in picture | |
CN109859096A (en) | Image Style Transfer method, apparatus, electronic equipment and storage medium | |
JP7096888B2 (en) | Network modules, allocation methods and devices, electronic devices and storage media | |
CN112287994B (en) | Pseudo tag processing method, pseudo tag processing device, pseudo tag processing equipment and computer readable storage medium | |
CN109670077B (en) | Video recommendation method and device and computer-readable storage medium | |
CN111695682B (en) | Data processing method and device | |
CN109670632B (en) | Advertisement click rate estimation method, advertisement click rate estimation device, electronic device and storage medium | |
CN109165738B (en) | Neural network model optimization method and device, electronic device and storage medium | |
US20240320807A1 (en) | Image processing method and apparatus, device, and storage medium | |
CN114049529A (en) | User behavior prediction method, model training method, electronic device, and storage medium | |
CN111695686B (en) | Address allocation method and device | |
CN112783779A (en) | Test case generation method and device, electronic equipment and storage medium | |
CN112766498B (en) | Model training method and device | |
CN113553448B (en) | Recommendation model training method and device, electronic equipment and storage medium | |
CN118133010B (en) | Graph model-based manufacturing cloud service recommendation model training method and recommendation method | |
CN111694768B (en) | Operation method, device and related product | |
US11010935B2 (en) | Context aware dynamic image augmentation | |
CN113190725B (en) | Object recommendation and model training method and device, equipment, medium and product | |
CN116341976A (en) | Index data adjustment method, device and equipment for navigation map production line | |
CN113486978B (en) | Training method and device for text classification model, electronic equipment and storage medium | |
CN114066098B (en) | Method and equipment for estimating completion time of learning task | |
CN112929751B (en) | System, method and terminal for determining action execution | |
CN114186535A (en) | Structure diagram reduction method, device, electronic equipment, medium and program product | |
CN113807540A (en) | Data processing method and device | |
CN111984864A (en) | Object recommendation method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |