CN110059498B - Privacy control automatic setting method and system for social network - Google Patents
- Publication number
- CN110059498B (application CN201910216242.7A)
- Authority
- CN
- China
- Prior art keywords
- privacy
- user
- personal
- text information
- attribute
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
Landscapes
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Bioethics (AREA)
- Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Marketing (AREA)
- General Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Databases & Information Systems (AREA)
- Computing Systems (AREA)
- Medical Informatics (AREA)
- Economics (AREA)
- Human Resources & Organizations (AREA)
- Software Systems (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Information Transfer Between Computers (AREA)
- Storage Device Security (AREA)
- Telephonic Communication Services (AREA)
Abstract
The invention provides a privacy control automatic setting method and system for a social network, comprising the following steps: according to the visibility of each user attribute in the personal profile of the user whose privacy is to be controlled, computing that user's personal openness; acquiring a privacy attribute set consisting of a plurality of privacy attributes, acquiring the text information the user intends to publish, and inputting the text into a plurality of classifiers, where each classifier corresponds to one privacy attribute and outputs the probability distribution of the text over the values of that attribute, and normalizing the entropy of the probability distribution to obtain the disclosure suitability of the text; obtaining the personal openness of each friend of the user from the friend's personal profile and published content; and measuring the privacy sensitivity of the text with respect to each friend from the user's personal openness, the text's disclosure suitability, and each friend's personal openness, and determining the disclosure range of the text according to the privacy sensitivity.
Description
Technical Field
The invention relates to privacy protection in social networks, and in particular to a method and a system for automatically setting privacy controls in a social network according to user information.
Background
The rapid development of Online Social Networks (OSNs) in recent years has facilitated person-to-person communication and accelerated information dissemination. At the same time, interactions among users have multiplied, and improper privacy settings can cause personal information to spread beyond the intended audience, threatening individual privacy and giving rise to numerous privacy-related problems.
Although most OSNs, such as microblogs, Twitter, and various kinds of forums, provide privacy settings for the attributes in a personal profile (for example visible to all, visible to groups, or visible only to oneself), these settings typically default to visible to all, and the settings themselves are often not easy to find. In addition, when a user publishes information, the default privacy setting is likewise visible to all and is not adjusted automatically according to the content being published, so users can unknowingly reveal their own private information.
Disclosure of Invention
To address the shortcomings of the prior art, the invention provides an automatic privacy control setting method for a social network. The method measures the degree of privacy protection of the attributes in a user's personal profile, the amount of privacy-related information contained in the content the user publishes, and the degree of privacy protection practiced by the user's friends, and combines these three measurements to set privacy controls for the published content automatically.
Specifically, the invention provides a privacy control automatic setting method for a social network, comprising the following steps:
Step 1: according to the visibility of each user attribute in the personal profile of the user whose privacy is to be controlled, compute the personal openness of that user;
Step 2: acquire a privacy attribute set consisting of a plurality of privacy attributes, acquire the text information the user intends to publish, and input the text into a plurality of classifiers, where each classifier corresponds to one privacy attribute and outputs the probability distribution of the text over the values of that attribute; normalize the entropy of each probability distribution to obtain the disclosure suitability of the text;
Step 3: obtain the personal openness of each friend of the user from the friend's personal profile and published content;
Step 4: measure the privacy sensitivity of the text with respect to each friend from the user's personal openness, the text's disclosure suitability, and the friend's personal openness, and determine the disclosure range of the text according to the privacy sensitivity.
In the above method, the personal openness of the user in step 1 is computed as:

U = Σ_i wu_i · v_i

where U is the personal openness of the user, wu_i is the weight corresponding to user attribute i with Σ_i wu_i = 1, and v_i is the visibility of attribute i.
In the above method, the disclosure suitability in step 2 is determined as:

E_i = -(1/log n_i) · Σ_j p_ij · log p_ij,  C = Σ_i wc_i · E_i

where C is the disclosure suitability, p_ij is the probability assigned to the j-th value of privacy attribute S_i, n_i is the number of values of S_i, and wc_i is the weight of privacy attribute S_i, satisfying Σ_i wc_i = 1.
In the above method, the personal openness F_i of friend i in step 3 is determined from U_i, the personal openness computed from the friend's profile, and from Σ_j C_j / n_c, which measures the degree to which privacy-sensitive information is disclosed in the content published by friend i, where n_c is the total number of items published by friend i and C_j is the disclosure suitability of the j-th text published by the friend.
In the above method, the privacy sensitivity with respect to friend i in step 4 is measured as:

M_i = U · C · F_i

where M_i is the privacy sensitivity with respect to friend i and F_i is the personal openness of friend i.
The invention also provides a privacy control automatic setting system for a social network, comprising:
Module 1, which computes the personal openness of the user whose privacy is to be controlled according to the visibility of each user attribute in that user's personal profile;
Module 2, which acquires a privacy attribute set consisting of a plurality of privacy attributes, acquires the text information the user intends to publish, inputs the text into a plurality of classifiers, where each classifier corresponds to one privacy attribute and outputs the probability distribution of the text over the values of that attribute, and normalizes the entropy of each probability distribution to obtain the disclosure suitability of the text;
Module 3, which obtains the personal openness of each friend of the user from the friend's personal profile and published content;
Module 4, which measures the privacy sensitivity of the text with respect to each friend from the user's personal openness, the text's disclosure suitability, and the friend's personal openness, and determines the disclosure range of the text according to the privacy sensitivity.
In the above system, the personal openness of the user in module 1 is computed as:

U = Σ_i wu_i · v_i

where U is the personal openness of the user, wu_i is the weight corresponding to user attribute i with Σ_i wu_i = 1, and v_i is the visibility of attribute i.
In the above system, the disclosure suitability in module 2 is determined as:

E_i = -(1/log n_i) · Σ_j p_ij · log p_ij,  C = Σ_i wc_i · E_i

where C is the disclosure suitability, p_ij is the probability assigned to the j-th value of privacy attribute S_i, n_i is the number of values of S_i, and wc_i is the weight of privacy attribute S_i, satisfying Σ_i wc_i = 1.
In the above system, the personal openness F_i of friend i in module 3 is determined from U_i, the personal openness computed from the friend's profile, and from Σ_j C_j / n_c, which measures the degree to which privacy-sensitive information is disclosed in the content published by friend i, where n_c is the total number of items published by friend i and C_j is the disclosure suitability of the j-th text published by the friend.
In the above system, the privacy sensitivity with respect to friend i in module 4 is measured as:

M_i = U · C · F_i

where M_i is the privacy sensitivity with respect to friend i and F_i is the personal openness of friend i.
In summary, the invention sets the default visible range of each attribute in the user's profile page and of the content the user publishes based on an analysis of the user, and sets the default visible range of the content to be published according to the user's privacy preferences and the degree of privacy protection practiced by the user's friends. The advantage of the invention is that, when a user publishes content, the automatic privacy control setting method provides a sharing policy that protects the user's privacy for that publication, based on the user's personal privacy preferences, the content being published, and the privacy protection degree of the friends who would receive it. Compared with the prior art, this privacy protection policy is more flexible, and the user does not need to set privacy manually for every share.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
The invention aims to provide a policy for setting privacy controls automatically, thereby alleviating the privacy leakage caused by information sharing.
The invention divides the privacy control of a user into two parts: privacy control of the attributes in the user's personal profile and privacy control of the content the user publishes. For both types of privacy control, the invention derives default settings that better protect privacy from the user's own information.
First, privacy control of attributes in a user's personal profile.
For a specific application platform, one of two statistical approaches is adopted depending on the number of active users of the platform. If the platform has enough active users, statistical analysis is performed directly on its own users; otherwise, the analysis relies on user data from other large OSNs (such as microblog platforms or Twitter). The definition of an active user differs from platform to platform; for a microblog platform, for example, users with more than 30 posts in total, an account registered for more than one month, and a ratio of original to forwarded posts within a certain range can be defined as active users.
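As an illustration only, the following minimal Python sketch encodes such an active-user filter; the concrete ratio bounds and the one-month cutoff expressed as 30 days are assumptions made for the example, not values fixed by the text.

```python
# Hypothetical active-user filter for a microblog-style platform, following the
# example criteria above: more than 30 posts in total, an account older than one
# month, and an original/forwarded post ratio within a range. The concrete ratio
# bounds are illustrative assumptions.
from datetime import datetime, timedelta

def is_active_user(total_posts, registered_at, original_posts, forwarded_posts,
                   now=None, ratio_range=(0.2, 5.0)):
    now = now or datetime.now()
    if total_posts <= 30:
        return False
    if now - registered_at < timedelta(days=30):
        return False
    if forwarded_posts == 0:
        # Assumption: an account that never forwards still counts if it posts.
        return original_posts > 0
    ratio = original_posts / forwarded_posts
    return ratio_range[0] <= ratio <= ratio_range[1]

if __name__ == "__main__":
    print(is_active_user(120, datetime(2018, 6, 1), 80, 40))  # True
```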
The users' general attitude toward the privacy sensitivity of each attribute in the profile is then analyzed, with statistics collected per attribute (name, age, address, marital status, and so on). If more than a preset number or proportion of users (for example, half of them) have set an attribute to be invisible, the default privacy setting for that attribute is set to visible only to oneself.
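A minimal sketch of this default-setting rule follows; the attribute names, the hide counts, and the one-half threshold are assumed example values rather than parameters prescribed by the patent.

```python
# Sketch: pick a default visibility per profile attribute from platform-wide
# statistics. If more than `threshold` of active users hide an attribute, the
# default becomes "visible only to oneself"; otherwise it stays visible to all.
def default_visibility(hidden_counts, total_users, threshold=0.5):
    """hidden_counts: dict attribute -> number of active users hiding it."""
    defaults = {}
    for attr, hidden in hidden_counts.items():
        ratio = hidden / total_users
        defaults[attr] = "self_only" if ratio > threshold else "all"
    return defaults

if __name__ == "__main__":
    stats = {"name": 120, "age": 900, "address": 1300, "marital_status": 700}
    print(default_visibility(stats, total_users=2000))
    # -> {'name': 'all', 'age': 'all', 'address': 'self_only', 'marital_status': 'all'}
```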
The user can still modify the visibility of the attributes in his or her personal profile. U is defined as the openness of the information in the user's personal profile and is used later when setting the privacy of the content the user publishes. It is calculated as:

U = Σ_i wu_i · v_i

where wu_i is the weight corresponding to user attribute i and satisfies Σ_i wu_i = 1, and v_i is the visibility of attribute i: v_i = 1 when the attribute is visible to all, v_i = 0.5 when it is visible to groups, and v_i = 0 when it is visible only to the user. Without loss of generality, the attribute weights wu_i are distributed evenly here.
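For illustration, a short sketch of this weighted-visibility computation is given below; the visibility codes and the even weighting follow the description above, while the example profile itself is assumed.

```python
# Sketch of U = sum_i wu_i * v_i with evenly distributed attribute weights.
# Visibility codes follow the text: 1.0 = visible to all, 0.5 = visible to
# groups, 0.0 = visible only to the user. The example profile is assumed.
VISIBILITY = {"all": 1.0, "groups": 0.5, "self": 0.0}

def personal_openness(profile_visibility):
    """profile_visibility: dict attribute -> 'all' | 'groups' | 'self'."""
    weight = 1.0 / len(profile_visibility)  # even weights, summing to 1
    return sum(weight * VISIBILITY[v] for v in profile_visibility.values())

if __name__ == "__main__":
    U = personal_openness({"name": "all", "age": "groups", "address": "self"})
    print(U)  # (1.0 + 0.5 + 0.0) / 3 = 0.5
```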
Second, privacy control of the content the user publishes, as shown in FIG. 1.
1. Calculating the degree of privacy-related information contained in the content the user publishes.
The invention considers only the privacy of text information. All privacy-sensitive attribute categories are defined as a set S, which is drawn up according to the privacy policies of major companies, for example: health condition; political or religious beliefs; age; sexual orientation; gender; disability; economic condition; and so on.
For text content, multiple classifiers are trained using machine learning, where each classifier corresponds to a privacy-sensitive attribute S_i and outputs a vector p_i whose j-th dimension is the probability p_ij that the text takes the j-th possible value of attribute S_i. The normalized entropy of p_i is then computed to obtain E_i. For example, if the sensitive attribute S_1 is 'age group' with the three possible values <20, 20-40, and >40, then n_1 = 3, and the vector p_1 = [0.3, 0.4, 0.3] gives the probabilities of the three classes. Finally the entropy of the vector p_i is computed; with n_i denoting the number of possible values of attribute S_i, E_i is calculated as:

E_i = -(1/log n_i) · Σ_j p_ij · log p_ij
The smaller E_i is, the lower the uncertainty of the text with respect to attribute S_i and the more easily the text reveals the privacy of that attribute; conversely, a larger E_i indicates that the text is only weakly related to the attribute. E_i is therefore used to measure how suitable the text is for disclosure with respect to the sensitive attribute S_i.
The E_i of all sensitive attributes are weighted to obtain C:

C = Σ_i wc_i · E_i

where wc_i is the weight of each attribute and satisfies Σ_i wc_i = 1. C is the disclosure suitability of the text when all sensitive attributes are considered together: the closer C is to 1, the less the content relates to the privacy-sensitive attributes and the more suitable it is for disclosure from a privacy-protection perspective. Without loss of generality, the attribute weights are distributed evenly here.
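A minimal sketch of the normalized entropy and its weighted combination into C is shown below; the probability vectors stand in for classifier outputs and are assumed example values, and the even weights wc_i follow the description.

```python
import math

# Sketch of E_i = -(1/log n_i) * sum_j p_ij * log p_ij and C = sum_i wc_i * E_i.
# The probability vectors below are placeholders for classifier outputs.
def normalized_entropy(p):
    n = len(p)
    if n < 2:
        return 1.0  # a single-valued attribute carries no distinguishing information
    h = -sum(q * math.log(q) for q in p if q > 0)
    return h / math.log(n)

def disclosure_suitability(prob_vectors):
    """prob_vectors: one probability distribution per sensitive attribute."""
    wc = 1.0 / len(prob_vectors)  # even weights summing to 1
    return sum(wc * normalized_entropy(p) for p in prob_vectors)

if __name__ == "__main__":
    p_age = [0.3, 0.4, 0.3]   # e.g. S_1 = "age group": <20, 20-40, >40
    p_health = [0.9, 0.1]     # assumed second attribute with two values
    print(disclosure_suitability([p_age, p_health]))  # higher C => more suitable for disclosure
```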
2. Measuring the degree of privacy protection of the user's friends.
The degree of privacy protection of a user's friend is determined jointly by the friend's personal profile and published content; it measures the privacy awareness of friend i, that is, how reliably the content to be published can be disclosed to that friend, and is denoted F_i. It is computed from U_i, the friend's own U value indicating the visibility of the friend's personal profile, and from Σ_j C_j / n_c, which measures the degree to which privacy-sensitive information is disclosed in the content the friend publishes, where n_c is the total number of items the friend has published and C_j is the disclosure suitability of the j-th text the friend published; summing the C_j and dividing by n_c gives the average over all of the friend's published content.
The higher F_i is, the stronger the friend's privacy awareness, and the lower the privacy risk of making the published content visible to that friend.
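The text combines U_i with the average disclosure suitability of the friend's posts but does not reproduce the exact combining formula, so the sketch below uses a plain average of the two terms purely as an assumed placeholder.

```python
# Sketch of the friend measure F_i. The description combines the friend's
# profile openness U_i with the average disclosure suitability of the friend's
# published content, sum_j C_j / n_c. The plain average used here is an
# assumption for illustration, not the patent's exact formula.
def friend_measure(U_i, content_suitabilities):
    if not content_suitabilities:
        return U_i  # assumption: fall back to the profile term alone
    avg_C = sum(content_suitabilities) / len(content_suitabilities)
    return (U_i + avg_C) / 2.0

if __name__ == "__main__":
    print(friend_measure(0.5, [0.9, 0.7, 0.8]))  # (0.5 + 0.8) / 2 = 0.65
```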
3. The default visibility range for this information is set in conjunction with the user's past preferences.
Combining the above metrics, the metric M_i of the content to be published with respect to friend i is calculated as:

M_i = U · C · F_i

A threshold T is set; when M_i > T, the content to be published is set to be visible to friend i. That is, when the user's own privacy awareness is weak (U is larger), the text content is less related to the privacy-sensitive attributes (C is larger), and the friend's privacy awareness is strong (F_i is larger), the content the user wants to publish is more likely to be disclosed to friend i.
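Putting the pieces together, the sketch below computes M_i = U · C · F_i for each friend and returns the set of friends to whom the content is made visible; the threshold value and friend scores are arbitrary examples, since the text leaves T as a configurable parameter.

```python
# Sketch of the decision step: M_i = U * C * F_i, and the content is made
# visible to friend i when M_i exceeds the threshold T. The value T=0.25 and
# the friend scores below are illustrative assumptions.
def visible_friends(U, C, friend_scores, T=0.25):
    """friend_scores: dict friend_id -> F_i. Returns ids allowed to see the post."""
    allowed = []
    for friend_id, F_i in friend_scores.items():
        M_i = U * C * F_i
        if M_i > T:
            allowed.append(friend_id)
    return allowed

if __name__ == "__main__":
    friends = {"alice": 0.9, "bob": 0.3, "carol": 0.7}
    print(visible_friends(U=0.6, C=0.8, friend_scores=friends))  # ['alice', 'carol']
```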
The following is a system embodiment corresponding to the method embodiment above, and it can be implemented in cooperation with the embodiments above. The technical details mentioned in the embodiments above remain valid in this embodiment and, to reduce repetition, are not repeated here; likewise, the technical details mentioned in this embodiment can also be applied to the embodiments above.
The invention also provides a privacy control automatic setting system for a social network, comprising:
Module 1, which computes the personal openness of the user whose privacy is to be controlled according to the visibility of each user attribute in that user's personal profile;
Module 2, which acquires a privacy attribute set consisting of a plurality of privacy attributes, acquires the text information the user intends to publish, inputs the text into a plurality of classifiers, where each classifier corresponds to one privacy attribute and outputs the probability distribution of the text over the values of that attribute, and normalizes the entropy of each probability distribution to obtain the disclosure suitability of the text;
Module 3, which obtains the personal openness of each friend of the user from the friend's personal profile and published content;
Module 4, which measures the privacy sensitivity of the text with respect to each friend from the user's personal openness, the text's disclosure suitability, and the friend's personal openness, and determines the disclosure range of the text according to the privacy sensitivity.
In the above system, the personal openness of the user in module 1 is computed as:

U = Σ_i wu_i · v_i

where U is the personal openness of the user, wu_i is the weight corresponding to user attribute i with Σ_i wu_i = 1, and v_i is the visibility of attribute i.
In the above system, the disclosure suitability in module 2 is determined as:

E_i = -(1/log n_i) · Σ_j p_ij · log p_ij,  C = Σ_i wc_i · E_i

where C is the disclosure suitability, p_ij is the probability assigned to the j-th value of privacy attribute S_i, n_i is the number of values of S_i, and wc_i is the weight of privacy attribute S_i, satisfying Σ_i wc_i = 1.
In the above system, the personal openness F_i of friend i in module 3 is determined from U_i, the personal openness computed from the friend's profile, and from Σ_j C_j / n_c, which measures the degree to which privacy-sensitive information is disclosed in the content published by friend i, where n_c is the total number of items published by friend i and C_j is the disclosure suitability of the j-th text published by the friend.
In the above system, the privacy sensitivity with respect to friend i in module 4 is measured as:

M_i = U · C · F_i

where M_i is the privacy sensitivity with respect to friend i and F_i is the personal openness of friend i.
Claims (4)
1. A privacy control automatic setting method for a social network, characterized by comprising the following steps:
Step 1: according to the visibility of each user attribute in the personal profile of the user whose privacy is to be controlled, computing the personal openness of that user;
Step 2: acquiring a privacy attribute set consisting of a plurality of privacy attributes, acquiring the text information the user intends to publish, and inputting the text into a plurality of classifiers, where each classifier corresponds to one privacy attribute and outputs the probability distribution of the text over the values of that attribute, and normalizing the entropy of each probability distribution to obtain the disclosure suitability of the text;
Step 3: obtaining the personal openness of each friend of the user from the friend's personal profile and published content;
Step 4: measuring the privacy sensitivity of the text with respect to each friend from the user's personal openness, the text's disclosure suitability, and the friend's personal openness, and determining the disclosure range of the text according to the privacy sensitivity.
2. The method according to claim 1, wherein the personal openness of the user in step 1 is computed as:

U = Σ_i wu_i · v_i

where U is the personal openness of the user, wu_i is the weight corresponding to user attribute i with Σ_i wu_i = 1, and v_i is the visibility of the i-th attribute: v_i = 1 when the i-th attribute is visible to all, v_i = 0.5 when it is visible to groups, and v_i = 0 when it is visible only to the user.
3. A privacy control automatic setting system for a social network, characterized by comprising:
Module 1, which computes the personal openness of the user whose privacy is to be controlled according to the visibility of each user attribute in that user's personal profile;
Module 2, which acquires a privacy attribute set consisting of a plurality of privacy attributes, acquires the text information the user intends to publish, inputs the text into a plurality of classifiers, where each classifier corresponds to one privacy attribute and outputs the probability distribution of the text over the values of that attribute, and normalizes the entropy of each probability distribution to obtain the disclosure suitability of the text;
Module 3, which obtains the personal openness of each friend of the user from the friend's personal profile and published content;
Module 4, which measures the privacy sensitivity of the text with respect to each friend from the user's personal openness, the text's disclosure suitability, and the friend's personal openness, and determines the disclosure range of the text according to the privacy sensitivity.
4. The system according to claim 3, wherein the personal openness of the user in module 1 is computed as:

U = Σ_i wu_i · v_i

where U is the personal openness of the user, wu_i is the weight corresponding to user attribute i with Σ_i wu_i = 1, and v_i is the visibility of the i-th attribute: v_i = 1 when the i-th attribute is visible to all, v_i = 0.5 when it is visible to groups, and v_i = 0 when it is visible only to the user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910216242.7A CN110059498B (en) | 2019-03-21 | 2019-03-21 | Privacy control automatic setting method and system for social network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110059498A CN110059498A (en) | 2019-07-26 |
CN110059498B (en) | 2021-07-23
Family
ID=67317250
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910216242.7A Active CN110059498B (en) | 2019-03-21 | 2019-03-21 | Privacy control automatic setting method and system for social network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110059498B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101655895A (en) * | 2008-08-22 | 2010-02-24 | 株式会社日立制作所 | Content control system |
CN103268454A (en) * | 2012-12-18 | 2013-08-28 | 北京奇虎科技有限公司 | Display control method and system for data opened by user |
CN104981816A (en) * | 2013-02-19 | 2015-10-14 | 索尼电脑娱乐公司 | Information processing system |
CN105574434A (en) * | 2015-12-14 | 2016-05-11 | 网易(杭州)网络有限公司 | Information shielding method and device |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120109836A1 (en) * | 2010-11-01 | 2012-05-03 | Google Inc. | Content sharing interface for sharing content in social networks |
US9781540B2 (en) * | 2011-07-07 | 2017-10-03 | Qualcomm Incorporated | Application relevance determination based on social context |
CN109040439B (en) * | 2013-08-14 | 2021-01-12 | 华为终端有限公司 | Method and device for realizing privacy protection |
CN105262674B (en) * | 2015-10-29 | 2018-09-25 | 小米科技有限责任公司 | Method, apparatus, server and terminal for privacy authority to be arranged |
CN106504101A (en) * | 2016-10-11 | 2017-03-15 | 北京小米移动软件有限公司 | The display control method for releasing news of social networking application and device |
CN106528709A (en) * | 2016-10-26 | 2017-03-22 | 北京小米移动软件有限公司 | Social information recommendation method and apparatus |
CN107665442B (en) * | 2017-05-10 | 2020-03-27 | 平安科技(深圳)有限公司 | Method and device for acquiring target user |
- 2019-03-21: CN application CN201910216242.7A filed; granted as patent CN110059498B (legal status: Active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |