CN111580670B - Garden landscape implementation method based on virtual reality - Google Patents
- Publication number
- CN111580670B (application CN202010398772.0A)
- Authority
- CN
- China
- Prior art keywords
- user
- line
- neglected
- signal
- acquiring
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Accounting & Taxation (AREA)
- Finance (AREA)
- Human Computer Interaction (AREA)
- Development Economics (AREA)
- Economics (AREA)
- Marketing (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Eye Examination Apparatus (AREA)
Abstract
The invention discloses a virtual reality-based garden landscape implementation method and relates to the technical field of virtual reality. While the user is viewing, the user's degree of interest is monitored to obtain a sight-range interval. The user's sight-range interval Fi is acquired continuously and analyzed, and a corresponding neglect signal is generated whenever the user ignores an interval; the user's interest value is then judged from the neglect signals. When a tired signal is generated, the next recommended scenic spot is pushed to the user; the recommendation method automatically analyzes, by means of big data, garden areas that may interest the user. The method can monitor in real time the user's satisfaction with, and concentration on, the garden being viewed, and automatically recommends a new garden when the user tires of the current one. The invention is simple, effective, and practical.
Description
Technical Field
The invention belongs to the technical field of virtual reality, and particularly relates to a garden landscape implementation method based on virtual reality.
Background
Patent publication CN106296250A discloses a real-estate landscape roaming method and system combining a bicycle with a virtual reality display. The system comprises a touch all-in-one machine (a host and a touch display screen), an active position-tracking camera for the head-mounted virtual reality display, the head-mounted display itself, a Bluetooth bicycle meter, a bicycle, and a mobile-phone client. Through information transfer among the bicycle, the mobile phone, and the all-in-one machine, the user's current position and viewing angle can be monitored in real time and the real-estate landscape displayed, giving the user an immersive experience. The bicycle enhances realism and comfort and adds an exercise function, so the system offers better immersion and interactivity than earlier virtual reality products; as a sales tool, it lets a prospective home buyer experience on site the landscape of a project the developer has yet to build.
At present, with the rapid development of virtual reality technology, more and more remote "cloud" visitors can enjoy gardens with the help of such technology. However, there is currently no method that uses a person's own state to judge whether the person is interested in a given garden and, if not, autonomously recommends related popular gardens. The present invention provides a solution to this problem.
Disclosure of Invention
The aim of the invention is to provide a virtual reality-based garden landscape implementation method in which the user's degree of interest is monitored while the user is viewing, the user's degree of tiredness is judged from the neglect-interval signals and the time intervals Cj, and a tired signal is obtained; when a tired signal is generated, the next recommended scenic spot is pushed to the user. The method can thus judge from the person's own state whether the person is interested in a given garden and, when not, autonomously recommend related popular gardens of likely interest.
In order to solve the technical problems, the invention is realized by the following technical scheme:
the invention discloses a virtual reality-based garden landscape implementation method, which comprises the following steps:
step one: firstly, a user inputs garden information expected to be watched;
step two: acquiring a three-dimensional model corresponding to garden information, and watching by a user through AR equipment;
step three: in the watching process of the user, the user is subjected to interestingness monitoring, and the specific monitoring mode is as follows:
S1: When the user wears the AR device, acquire the straight line at eye height and mark it as the forward line of sight. To do so, connect the two earlobe points (the lowest points of the left and right ears, marked as the left lobe point and the right lobe point respectively) to form the earlobe line; then acquire the straight line that is parallel to the earlobe line and located at eye height, and mark it as the forward line of sight;
S2: Mark the head-up plane containing the forward line of sight as the reference plane, the head-up plane being parallel to the horizontal plane; in the reference plane, obtain the line perpendicular to the forward line of sight and mark it as the target straight line. The forward line of sight lies directly in front of the AR device;
S3: In the reference plane, obtain the angular region forming an included angle X1 with the target straight line, the target straight line being the bisector of that angle, to obtain the sight-range interval; X1 is a preset value;
S4: Continuously acquire the user's sight-range interval Fi, analyze it to obtain a number of neglect-interval signals, and calculate the time interval Cj between successive neglect-interval signals, where i = 1, 2, 3, ..., n and j = 1, 2, 3, ..., m-1;
step four: judging the tiredness degree of the user according to the neglected interval signal and the time interval Cj, and acquiring a tiredness signal of the user;
step five: when the tired signal is generated, pushing the next recommended scenic spot to the user;
step six: the user chooses either to view the recommended scenic spot or to stop viewing.
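The geometric construction in steps S1 to S3 above can be sketched in a few lines. This is an illustration only, not part of the patent: the yaw-angle parameterization of the forward line of sight, the units, and the function name are all assumptions made for the sketch.

```python
# Illustrative sketch of steps S1 to S3: the forward line of sight is
# parameterized by a head yaw angle in the horizontal reference plane,
# the target straight line is its perpendicular, and the sight-range
# interval is the wedge of angle X1 bisected by the target line.
# The yaw parameterization and function name are assumptions.

def sight_range_interval(head_yaw_deg, x1_deg):
    """Return the (lo, hi) angular interval of the sight range, in degrees."""
    target = head_yaw_deg + 90.0   # target straight line: perpendicular to the forward line of sight
    half = x1_deg / 2.0            # X1 has the target line as its angular bisector
    return (target - half, target + half)
```

For example, with the head facing yaw 0 and a preset X1 of 30 degrees, the sight-range interval spans 75 to 105 degrees.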
Further, the recommended scenic spot obtaining mode is as follows:
SS1: Obtain all scenic spots that have been viewed, remove those already viewed by the current user, and mark the remaining spots as Qp, p = 1, 2, 3, ..., k;
SS2: Acquire the number of times each corresponding spot has been viewed and its most recent viewing time; the viewing count is marked Kp and the most recent viewing time is marked Zp, p = 1, 2, 3, ..., k;
SS3: Calculate an expected value Dp by the formula Dp = 0.631 x Kp + 0.369 x Zp, where 0.631 and 0.369 are preset weights;
SS4: Mark the spots with the top three Dp values as the recommended scenic spots.
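Steps SS1 to SS4 amount to a weighted scoring of unviewed spots. The following is a minimal sketch, assuming the last viewing time Zp has already been normalized to a number comparable with the viewing count Kp (the patent does not specify the encoding); all names are placeholders.

```python
# Minimal sketch of SS1-SS4, assuming Zp is pre-normalized to be
# comparable with Kp (the normalization is not specified in the text).

def recommend(spots, viewed_by_user, top_n=3):
    """spots: dict mapping spot name -> (Kp, Zp); returns the top_n
    unviewed spots ranked by the expected value Dp."""
    # SS1: remove spots the current user has already viewed
    candidates = {q: kz for q, kz in spots.items() if q not in viewed_by_user}
    # SS3: Dp = 0.631 * Kp + 0.369 * Zp with the preset weights
    scored = sorted(candidates.items(),
                    key=lambda item: 0.631 * item[1][0] + 0.369 * item[1][1],
                    reverse=True)
    # SS4: the top three Dp values become the recommended scenic spots
    return [name for name, _ in scored[:top_n]]
```

With equal recency values, the ranking reduces to ordering by viewing count among spots the user has not yet seen.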
Further, in the fourth step, the method for determining the tiredness degree of the user is as follows:
S01: First determine the value of m; when m < X2, no processing is performed, where X2 is a preset value;
S02: When m >= X2, calculate the tolerance value as follows:
S03: Obtain the time intervals Cj between the X2 neglect-interval signals, j = 1, 2, 3, ..., X2-1;
S04: Calculate the mean of the X2-1 time intervals and mark this mean as the tolerance value;
when the tolerance value exceeds X3, a tired signal is generated, where X3 is a preset value;
otherwise, continue acquiring neglect-interval signals Hj;
S05: When a new neglect-interval signal Hj is added, i.e. j = X2+1, remove the neglect-interval signal H1 corresponding to j = 1 so that only X2 neglect-interval signals are kept, and repeat steps S03 to S04.
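Steps S01 to S05 describe a sliding window over the most recent X2 neglect-interval signals. The sketch below is a hedged illustration: the comparison direction (tolerance value exceeding X3 triggers the tired signal) follows the text above, and all names are placeholders introduced for this sketch.

```python
from collections import deque

# Hedged sketch of S01-S05: keep a sliding window of the most recent X2
# neglect-interval signal times; once the window is full, the mean of the
# X2-1 intervals between them is the tolerance value, and a tired signal
# fires when it exceeds X3.

def first_tired_signal(signal_times, x2, x3):
    window = deque(maxlen=x2)              # S05: the oldest signal drops out automatically
    for t in signal_times:
        window.append(t)
        if len(window) < x2:               # S01: fewer than X2 signals, do nothing
            continue
        gaps = [b - a for a, b in zip(window, list(window)[1:])]
        tolerance = sum(gaps) / len(gaps)  # S04: mean of the X2-1 intervals
        if tolerance > x3:                 # tired signal generated
            return t
    return None                            # no tired signal yet
```

Using `deque(maxlen=x2)` implements step S05 directly: appending the (X2+1)-th signal silently evicts H1.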
Further, in step S4, the time interval Cj between successive neglect-interval signals is calculated as follows:
S401: Set a time T1 and acquire the user's sight-range interval once every T1;
S402: Mark the first acquired sight-range interval and each subsequently acquired interval in sequence as Fi, i = 1, 2, 3, ..., n;
S403: Obtain the difference area between Fi+1 and Fi, i = 1, 2, 3, ..., n, where Fn denotes the latest sight-range interval; when the difference area exceeds X1, generate a neglect-interval signal, where X1 is a preset value;
S404: Repeat step S403 to monitor the user, obtain all neglect-interval signals Hj, j = 1, 2, 3, ..., m, and obtain the time interval Cj between successive neglect-interval signals, j = 1, 2, 3, ..., m-1; here Cj denotes the time interval between Hj and Hj+1.
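Steps S401 to S404 can be sketched as follows. This is an illustration under stated assumptions: the sight-range interval is represented as a 1-D angular range, and the "difference area" is read as the part of the new interval lying outside the previous one; the representation and function names are not from the patent.

```python
# Illustrative sketch of S401-S404: sight-range intervals sampled every T1
# are compared pairwise, and a neglect-interval signal is emitted when the
# part of the new interval outside the previous one (one possible reading
# of "difference area") exceeds the threshold X1.

def neglect_signals(intervals, times, x1):
    """intervals: list of (lo, hi) angular sight-range intervals Fi,
    sampled every T1; times: matching timestamps. Returns
    (signal_times, gaps) where gaps[j] is Cj, the time between
    signals Hj and Hj+1."""
    signal_times = []
    for i in range(len(intervals) - 1):
        lo1, hi1 = intervals[i]
        lo2, hi2 = intervals[i + 1]
        overlap = max(0.0, min(hi1, hi2) - max(lo1, lo2))
        diff = (hi2 - lo2) - overlap     # part of F(i+1) outside Fi
        if diff > x1:                    # S403: threshold X1 exceeded
            signal_times.append(times[i + 1])
    # S404: time interval Cj between successive neglect signals
    gaps = [b - a for a, b in zip(signal_times, signal_times[1:])]
    return signal_times, gaps
```

A large jump of the sampled interval thus produces a signal, while small drifts within the threshold do not.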
Further, the AR device is AR glasses or an AR helmet.
Further, the forward line of sight in step S1 is acquired as follows: when the user wears the AR device, connect the two earlobe points (the lowest points of the left and right ears, marked as the left lobe point and the right lobe point respectively) to form the earlobe line; then acquire the straight line that is parallel to the earlobe line and located at eye height, and mark it as the forward line of sight.
The invention has the following beneficial effects:
the invention realizes the combination of data by using the garden landscape method; firstly, inputting garden information expected to be watched during specific work, and watching by means of AR equipment; during the watching process of a user, the interest degree of the user is monitored, and the specific monitoring principle is that when the user wears the AR equipment, a relevant front view line and a relevant eye line are defined; the eye line is positioned in front of the AR device; then, according to the two concepts obtained above, a sight line range section can be obtained;
continuously acquiring a sight line range section Fi of a user, and analyzing the sight line range section Fi to generate a corresponding neglect signal when the user ignores the section; then, according to the neglected signal, judging the interesting value of the user; pushing the next recommended scenic spot to the user when the tired signal is generated; the recommendation method is to automatically analyze potential garden areas possibly interested by a user by means of big data; the method can monitor the satisfaction degree and the concentration degree of the user on the gardens being watched in real time, and automatically recommend new gardens for the user to watch when the user tired on the corresponding gardens; the invention is simple and effective, and is easy and practical.
Of course, it is not necessary for any one product to practice the invention to achieve all of the advantages set forth above at the same time.
Detailed Description
The technical solutions of the embodiments of the present invention will be clearly and completely described below in conjunction with the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention discloses a virtual reality-based garden landscape implementation method, which comprises the following steps:
step one: firstly, a user inputs garden information expected to be watched;
step two: acquiring a three-dimensional model corresponding to garden information, and watching by a user through AR equipment;
step three: in the watching process of the user, the user is subjected to interestingness monitoring, and the specific monitoring mode is as follows:
S1: When the user wears the AR device, acquire the straight line at eye height and mark it as the forward line of sight. To do so, connect the two earlobe points (the lowest points of the left and right ears, marked as the left lobe point and the right lobe point respectively) to form the earlobe line; then acquire the straight line that is parallel to the earlobe line and located at eye height, and mark it as the forward line of sight. If the two eyes are at different heights, or the line obtained is not parallel to the earlobe line, take the straight line parallel to the earlobe line at the height of one eye as the forward line of sight;
S2: Mark the head-up plane containing the forward line of sight as the reference plane, the head-up plane being parallel to the horizontal plane; in the reference plane, obtain the line perpendicular to the forward line of sight and mark it as the target straight line. The forward line of sight lies directly in front of the AR device;
S3: In the reference plane, obtain the angular region forming an included angle X1 with the target straight line, the target straight line being the bisector of that angle, to obtain the sight-range interval; X1 is a preset value;
S4: Continuously acquire the user's sight-range interval Fi, analyze it to obtain a number of neglect-interval signals, and calculate the time interval Cj between successive neglect-interval signals, where i = 1, 2, 3, ..., n and j = 1, 2, 3, ..., m-1;
step four: judging the tiredness degree of the user according to the neglected interval signal and the time interval Cj, and acquiring a tiredness signal of the user;
step five: when the tired signal is generated, pushing the next recommended scenic spot to the user;
step six: the user chooses either to view the recommended scenic spot or to stop viewing.
The recommended scenic spot acquisition mode is as follows:
SS1: Obtain all scenic spots that have been viewed, remove those already viewed by the current user, and mark the remaining spots as Qp, p = 1, 2, 3, ..., k;
SS2: Acquire the number of times each corresponding spot has been viewed and its most recent viewing time; the viewing count is marked Kp and the most recent viewing time is marked Zp, p = 1, 2, 3, ..., k;
SS3: Calculate an expected value Dp by the formula Dp = 0.631 x Kp + 0.369 x Zp, where 0.631 and 0.369 are preset weights;
SS4: Mark the spots with the top three Dp values as the recommended scenic spots.
The method for judging the tiredness degree of the user in the fourth step is as follows:
S01: First determine the value of m; when m < X2, no processing is performed, where X2 is a preset value;
S02: When m >= X2, calculate the tolerance value as follows:
S03: Obtain the time intervals Cj between the X2 neglect-interval signals, j = 1, 2, 3, ..., X2-1;
S04: Calculate the mean of the X2-1 time intervals and mark this mean as the tolerance value;
when the tolerance value exceeds X3, a tired signal is generated, where X3 is a preset value;
otherwise, continue acquiring neglect-interval signals Hj;
S05: When a new neglect-interval signal Hj is added, i.e. j = X2+1, remove the neglect-interval signal H1 corresponding to j = 1 so that only X2 neglect-interval signals are kept, and repeat steps S03 to S04.
The time interval Cj between successive neglect-interval signals in step S4 is calculated as follows:
S401: Set a time T1 and acquire the user's sight-range interval once every T1;
S402: Mark the first acquired sight-range interval and each subsequently acquired interval in sequence as Fi, i = 1, 2, 3, ..., n;
S403: Obtain the difference area between Fi+1 and Fi, i = 1, 2, 3, ..., n, where Fn denotes the latest sight-range interval; when the difference area exceeds X1, generate a neglect-interval signal, where X1 is a preset value;
S404: Repeat step S403 to monitor the user, obtain all neglect-interval signals Hj, j = 1, 2, 3, ..., m, and obtain the time interval Cj between successive neglect-interval signals, j = 1, 2, 3, ..., m-1; here Cj denotes the time interval between Hj and Hj+1.
Wherein, the AR equipment is AR glasses or AR helmets.
The forward line of sight in step S1 is acquired as follows: when the user wears the AR device, connect the two earlobe points (the lowest points of the left and right ears, marked as the left lobe point and the right lobe point respectively) to form the earlobe line; then acquire the straight line that is parallel to the earlobe line and located at eye height, and mark it as the forward line of sight.
The virtual reality-based garden landscape implementation method applies the garden landscape method in combination with data analysis. In operation, the garden information the user wishes to view is first entered and viewed through the AR device. While the user is viewing, the user's degree of interest is monitored: when the user wears the AR device, the forward line of sight and the related reference lines are defined, the forward line of sight lying directly in front of the AR device; from these two concepts the sight-range interval can then be obtained. The user's sight-range interval Fi is acquired continuously and analyzed, and a corresponding neglect signal is generated whenever the user ignores an interval; the user's interest value is then judged from the neglect signals, and when a tired signal is generated, the next recommended scenic spot is pushed to the user. The recommendation method automatically analyzes, by means of big data, scenic spots that may interest the user. The method can monitor in real time the user's satisfaction with, and concentration on, the garden being viewed, and automatically recommends a new garden when the user tires of the current one. The invention is simple, effective, and practical.
In the description of the present specification, reference to the terms "one embodiment," "example," "specific example," and the like, means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The preferred embodiments of the invention disclosed above are intended only to assist in the explanation of the invention. The preferred embodiments are not exhaustive or to limit the invention to the precise form disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in detail in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best understand and utilize the invention. The invention is limited only by the claims and the full scope and equivalents thereof.
Claims (4)
1. The garden landscape realizing method based on virtual reality is characterized by comprising the following steps:
step one: firstly, a user inputs garden information expected to be watched;
step two: acquiring a three-dimensional model corresponding to garden information, and watching by a user through AR equipment;
step three: in the watching process of the user, the user is subjected to interestingness monitoring, and the specific monitoring mode is as follows:
S1: when the user wears the AR device, acquire the straight line at eye height and mark it as the forward line of sight; to do so, connect the two earlobe points (the lowest points of the left and right ears, marked as the left lobe point and the right lobe point respectively) to form the earlobe line, then acquire the straight line that is parallel to the earlobe line and located at eye height, and mark it as the forward line of sight;
S2: mark the head-up plane containing the forward line of sight as the reference plane, the head-up plane being parallel to the horizontal plane; in the reference plane, obtain the line perpendicular to the forward line of sight and mark it as the target straight line; the forward line of sight lies directly in front of the AR device;
S3: in the reference plane, obtain the angular region forming an included angle X1 with the target straight line, the target straight line being the bisector of that angle, to obtain the sight-range interval, where X1 is a preset value;
S4: continuously acquire the user's sight-range interval Fi, analyze it to obtain a number of neglect-interval signals, and calculate the time interval Cj between successive neglect-interval signals, where i = 1, 2, 3, ..., n and j = 1, 2, 3, ..., m-1;
step four: judging the tiredness degree of the user according to the neglected interval signal and the time interval Cj, and acquiring a tiredness signal of the user; in the fourth step, the method for judging the tiredness degree of the user is as follows:
S01: first determine the value of m; when m < X2, no processing is performed, where X2 is a preset value;
S02: when m >= X2, calculate the tolerance value as follows:
S03: obtain the time intervals Cj between the X2 neglect-interval signals, j = 1, 2, 3, ..., X2-1;
S04: calculate the mean of the X2-1 time intervals and mark this mean as the tolerance value;
when the tolerance value exceeds X3, a tired signal is generated, where X3 is a preset value;
otherwise, continue acquiring neglect-interval signals Hj;
S05: when a new neglect-interval signal Hj is added, i.e. j = X2+1, remove the neglect-interval signal H1 corresponding to j = 1 so that only X2 neglect-interval signals are kept, and repeat steps S03 to S04;
step five: when the tired signal is generated, pushing the next recommended scenic spot to the user;
step six: the user chooses either to view the recommended scenic spot or to stop viewing;
the time interval Cj between each of the neglected section signals in the step S4 is calculated as follows:
s401: setting time T1, and acquiring a sight range interval of a user once at intervals of T1;
s402: sequentially marking the sight line range sections acquired for the first time and the sight line range sections acquired continuously later as Fi, i=1, 2, 3, & gt, n;
s403: acquiring a difference area between fi+1 and Fi, wherein i=1, 2, 3, & gt, n and Fn represent the latest sight range interval; when the difference area exceeds X1, generating an neglected interval signal, wherein X1 is a preset value;
s404: step S403 is repeated to monitor the user to obtain all the neglected section signals Hj, j=1, 2, 3..m, and to obtain the time interval Cj, j=1, 2, 3..m-1 between each neglected section signal; where Cj represents the time interval between Hj and hj+1.
2. The method for realizing the garden landscape based on the virtual reality according to claim 1, wherein the recommended scenic spot obtaining mode is as follows:
SS1: obtain all scenic spots that have been viewed, remove those already viewed by the current user, and mark the remaining spots as Qp, p = 1, 2, 3, ..., k;
SS2: acquire the number of times each corresponding spot has been viewed and its most recent viewing time; the viewing count is marked Kp and the most recent viewing time is marked Zp, p = 1, 2, 3, ..., k;
SS3: calculate an expected value Dp by the formula Dp = 0.631 x Kp + 0.369 x Zp, where 0.631 and 0.369 are preset weights;
SS4: mark the spots with the top three Dp values as the recommended scenic spots.
3. The virtual reality-based landscape architecture implementation method of claim 1, wherein the AR device is AR glasses or an AR helmet.
4. The method for realizing the garden landscape based on the virtual reality according to claim 1, wherein the method for obtaining the forward sight line in the step S1 is as follows:
when the user wears the AR device, connect the two earlobe points (the lowest points of the left and right ears, marked as the left lobe point and the right lobe point respectively) to form the earlobe line; then acquire the straight line that is parallel to the earlobe line and located at eye height, and mark it as the forward line of sight.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010398772.0A CN111580670B (en) | 2020-05-12 | 2020-05-12 | Garden landscape implementation method based on virtual reality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111580670A (en) | 2020-08-25
CN111580670B (en) | 2023-06-30
Family
ID=72113501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010398772.0A Active CN111580670B (en) | 2020-05-12 | 2020-05-12 | Garden landscape implementation method based on virtual reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111580670B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1996031047A2 (en) * | 1995-03-31 | 1996-10-03 | The Regents Of The University Of California | Immersive video |
KR20020014673A (en) * | 2000-08-16 | 2002-02-25 | 양윤원 | Researching Method and Researching System For Interests in Commercial Goods By Using Electronic Catalog Including Interactive 3D Image data |
WO2008081412A1 (en) * | 2006-12-30 | 2008-07-10 | Kimberly-Clark Worldwide, Inc. | Virtual reality system including viewer responsiveness to smart objects |
DE102016003073A1 (en) * | 2016-03-12 | 2017-09-14 | Audi Ag | Method for operating a virtual reality system and virtual reality system |
WO2018018195A1 (en) * | 2016-07-24 | 2018-02-01 | 严映军 | Method for pushing information during interest determining and determining system |
WO2018032790A1 (en) * | 2016-08-16 | 2018-02-22 | 武汉斗鱼网络科技有限公司 | Weighted k-nearest-neighbor scoring-based live broadcast room recommendation method and system |
JP2018055428A (en) * | 2016-09-29 | 2018-04-05 | 株式会社東芝 | Interest maintaining system and server |
US10334134B1 (en) * | 2016-06-20 | 2019-06-25 | Maximillian John Suiter | Augmented real estate with location and chattel tagging system and apparatus for virtual diary, scrapbooking, game play, messaging, canvasing, advertising and social interaction |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4687266B2 (en) * | 2004-10-20 | 2011-05-25 | 富士ゼロックス株式会社 | Alerting device and method, and information processing system |
JP2007048171A (en) * | 2005-08-12 | 2007-02-22 | Fujitsu Ltd | Recommended exhibition booth selection program and recommended exhibition booth selection method |
JP4686299B2 (en) * | 2005-08-17 | 2011-05-25 | パナソニック株式会社 | Usability evaluation apparatus, usability evaluation method and program |
US10410421B2 (en) * | 2016-09-13 | 2019-09-10 | 3I, Corporation | Method and server for providing virtual reality image about object |
CN108958460A (en) * | 2017-05-19 | 2018-12-07 | 深圳市掌网科技股份有限公司 | Building sand table methods of exhibiting and system based on virtual reality |
JP6298561B1 (en) * | 2017-05-26 | 2018-03-20 | 株式会社コロプラ | Program executed by computer capable of communicating with head mounted device, information processing apparatus for executing the program, and method executed by computer capable of communicating with head mounted device |
CN207164700U (en) * | 2017-08-29 | 2018-03-30 | 歌尔科技有限公司 | A kind of helmet |
US11556980B2 (en) * | 2017-11-17 | 2023-01-17 | Ebay Inc. | Method, system, and computer-readable storage media for rendering of object data based on recognition and/or location matching |
CN108804583A (en) * | 2018-05-25 | 2018-11-13 | 武汉市华太培文教育科技有限公司 | The system and method for Literature pushing is carried out based on user's reading interest |
CN109656441B (en) * | 2018-12-21 | 2020-11-06 | 广州励丰文化科技股份有限公司 | Navigation method and system based on virtual reality |
- 2020-05-12: CN application CN202010398772.0A filed; granted as patent CN111580670B (status: Active)
Non-Patent Citations (2)
Title |
---|
Si Yali; Zhang Fuzhi; Liu Wenyuan. Adaptive point-of-interest recommendation method based on check-in activity and a spatio-temporal probability model. Journal of Electronics & Information Technology. 2020, (No. 03), full text. *
Tan Yunlan; Tang Pengjie; Kang Yongping; Xia Jiewu. Design and implementation of a panoramic roaming system for the ancient villages of Luling. Computer Knowledge and Technology. 2018, (No. 28), full text. *
Also Published As
Publication number | Publication date |
---|---|
CN111580670A (en) | 2020-08-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109976690B (en) | AR glasses remote interaction method and device and computer readable medium | |
US20040041822A1 (en) | Image processing apparatus, image processing method, studio apparatus, storage medium, and program | |
EP3335070B1 (en) | Methods for optimizing positioning of content on a screen of a head mounted display | |
CN106464854B (en) | Image encodes and display | |
US5495576A (en) | Panoramic image based virtual reality/telepresence audio-visual system and method | |
CN104536579B (en) | Interactive three-dimensional outdoor scene and digital picture high speed fusion processing system and processing method | |
US20170116788A1 (en) | New pattern and method of virtual reality system based on mobile devices | |
CN107678715A (en) | The sharing method of virtual information, device and system | |
JP2012128854A (en) | Mixed reality display platform for presenting enhanced type three-dimensional stereoscopic video, and operating method | |
KR101962578B1 (en) | A fitness exercise service providing system using VR | |
JP7471572B2 (en) | CALIBRATION SYSTEM AND CALIBRATION METHOD | |
Kim et al. | Multimodal interactive continuous scoring of subjective 3D video quality of experience | |
CN205195880U (en) | Watch equipment and watch system | |
CN107103645A (en) | virtual reality media file generation method and device | |
CN107623812A (en) | A kind of method, relevant apparatus and system for realizing outdoor scene displaying | |
JP2003244728A (en) | Virtual image creating apparatus and virtual image creating method | |
JP6024159B2 (en) | Information presenting apparatus, information presenting system, server, information presenting method and program | |
CN112835449A (en) | Virtual reality and somatosensory device interaction-based safety somatosensory education system | |
CN111580670B (en) | Garden landscape implementation method based on virtual reality | |
Lugrin et al. | Usability benchmarks for motion tracking systems | |
CN104216126A (en) | Zooming three-dimensional (3D) display technique |
CN104076910B (en) | The method and electronic equipment of a kind of information processing | |
US11880631B2 (en) | Processing apparatus and immersion level deriving method | |
CN110415354A (en) | Three-dimensional immersive experience system and method based on spatial position | |
CN109474819B (en) | Image presenting method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||