CN108269307B - Augmented reality interaction method and equipment - Google Patents

Augmented reality interaction method and equipment

Info

Publication number
CN108269307B
CN108269307B (application CN201810036004.3A)
Authority
CN
China
Prior art keywords
augmented reality
template object
virtual content
virtual
scene
Prior art date
Legal status
Active
Application number
CN201810036004.3A
Other languages
Chinese (zh)
Other versions
CN108269307A (en)
Inventor
尹左水
姜滨
迟小羽
Current Assignee
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date
Filing date
Publication date
Application filed by Goertek Techology Co Ltd
Priority to CN201810036004.3A
Publication of CN108269307A
Application granted
Publication of CN108269307B
Status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application provide an augmented reality interaction method and device. The method comprises the following steps: identifying identification information carried by a template object in a real scene according to an interaction instruction; sending the identification information to a server so that the server can allocate virtual content to the template object according to the identification information; receiving the virtual content returned by the server and displaying it at the mapped position of the template object in the virtual scene, according to the position mapping relationship between the real scene and the virtual scene; and displaying the virtual content in linkage with the interactions performed on the template object. In this way the user obtains a real tactile experience through the template prop and a virtual interactive experience through the virtual content, so that reality and virtuality are genuinely fused and game interactivity is greatly improved.

Description

Augmented reality interaction method and equipment
Technical Field
The application relates to the technical field of augmented reality, in particular to an augmented reality interaction method and equipment.
Background
With the development of virtual reality technology, more and more applications are moving toward full virtualization. For example, a user may play various games in a virtual reality environment and control them through a handle or a head-mounted VR (Virtual Reality) device.
However, a fully virtualized game interaction mode means the user cannot recover the feel of real play, so interactivity is poor. Augmented Reality (AR) is a technology that enhances a user's perception of the real world with information supplied by a computer system. It is therefore desirable to provide an AR-based interaction mode that exploits these advantages to improve interactivity.
Disclosure of Invention
Aspects of the present application provide an augmented reality interaction method and apparatus for enhancing a sense of real participation in a game.
The embodiment of the application provides an augmented reality interaction method, which comprises the following steps:
identifying identification information contained in a template object in a real scene to which the first augmented reality device belongs according to the interaction instruction;
sending the identification information to a server so that the server can distribute virtual content to the template object according to the identification information;
receiving the virtual content returned by the server, and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene;
and following the interactive action of the template object, performing linkage display on the virtual content.
The embodiment of the present application further provides an augmented reality interaction method, including:
receiving identification information contained in a template object in a real scene sent by augmented reality equipment;
distributing virtual content to the template object according to the identification information;
and sending the virtual content to the augmented reality equipment so that the augmented reality equipment displays the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene, and displays the virtual content in a linkage manner along with the interaction action of the template object.
Embodiments of the present application further provide an augmented reality device, including a memory and a processor,
the memory for storing a computer program;
the processor is configured to execute the computer program stored in the memory to:
identifying identification information contained in a template object in a real scene to which augmented reality equipment belongs according to the interactive instruction;
sending the identification information to a server so that the server can distribute virtual content to the template object according to the identification information;
receiving the virtual content returned by the server, and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene;
and following the interactive action of the template object, performing linkage display on the virtual content.
Embodiments of the present application also provide a server device, comprising a memory and a processor,
the memory for storing a computer program;
the processor is configured to execute the computer program stored in the memory to:
receiving identification information contained in a template object in a real scene sent by augmented reality equipment;
distributing virtual content to the template object according to the identification information;
and sending the virtual content to the augmented reality equipment so that the augmented reality equipment displays the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene, and displays the virtual content in a linkage manner along with the interaction action of the template object.
In the embodiments of the application, after the server device allocates virtual content to a template object in the real scene, the user sees, through the augmented reality device, a picture in which the template object in the real scene is fused with the virtual content in the virtual scene. During augmented reality interaction the user controls the virtual content synchronously by manipulating the template object, obtaining a real tactile experience through the template prop and a virtual interactive experience through the virtual content. Reality and virtuality are thus genuinely fused, and game interactivity is greatly improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of an augmented reality interaction method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an augmented reality interaction method according to another embodiment of the present application;
fig. 3 is a schematic flowchart of an augmented reality interaction method according to another embodiment of the present application;
fig. 4 is a schematic flowchart of an augmented reality interaction method according to an embodiment of the present application;
fig. 5 is a schematic flowchart of an augmented reality interaction method according to another embodiment of the present application;
fig. 6 is a schematic flowchart of an augmented reality interaction method according to another embodiment of the present application;
fig. 7 is a schematic structural diagram of an augmented reality device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a server device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only a few embodiments of the present application, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
To address the prior-art problem that a fully virtualized game interaction mode prevents the user from recovering the feel of real play and therefore offers poor interactivity, the embodiments of the application propose the following solution: the server device allocates virtual content to a template object in the real scene, and the augmented reality device displays a picture in which the template object in the real scene is fused with the virtual content in the virtual scene. During the game, the user controls the virtual content synchronously by manipulating the template object. The user thus obtains a real tactile experience through the template prop and a virtual interactive experience through the virtual content; reality and virtuality are genuinely fused, and game interactivity is greatly improved.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 shows an augmented reality interaction method provided in an embodiment of the present application. The method can be applied to a user's augmented reality device, such as AR glasses, a HUD (head-up display), or any other device capable of augmented reality functions.
To support augmented reality interaction among multiple users, an augmented reality interaction system may be established, in which each user has at least one augmented reality device. Because every augmented reality device in the system carries out the interaction in the same way, the following embodiments describe the process in detail using a first augmented reality device as an example; the first augmented reality device may be any augmented reality device in the system.
As shown in fig. 1, the method includes:
100. identifying identification information contained in a template object in a real scene to which the first augmented reality device belongs according to the interaction instruction;
101. sending the identification information to a server so that the server can distribute virtual content to the template object according to the identification information;
102. receiving the virtual content returned by the server, and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene;
103. and following the interactive action of the template object, performing linkage display on the virtual content.
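The four steps above can be sketched end to end as a toy client round trip. Everything here is illustrative: the camera, server, and mapping helpers are stand-ins invented for this sketch, not APIs named by the patent.

```python
class FakeCamera:
    """Stand-in for the device camera: reports one marker and its real-scene position."""
    def scan_marker(self):
        return "qr-001"

    def locate(self, marker):
        return (0.5, 0.2, 1.0)


class FakeServer:
    """Stand-in for the server: allocates virtual content keyed by identification info."""
    CONTENT = {"qr-001": ("hearts", "Q")}

    def allocate_content(self, marker):
        return self.CONTENT[marker]


def real_to_virtual(point):
    # Assumed real->virtual position mapping: a fixed translation along the X axis.
    return (point[0] + 10.0, point[1], point[2])


def run_round(camera, server):
    marker = camera.scan_marker()                      # step 100: read identification info
    content = server.allocate_content(marker)          # step 101: server allocates content
    position = real_to_virtual(camera.locate(marker))  # step 102: display at mapped position
    return {"content": content, "position": position}  # step 103 would redo this per frame
```

In a real device, step 103 amounts to repeating the locate-and-map call every frame so the content follows the object.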
When a user wears the first augmented reality device to interact with other users, each user has template objects. A template object may enter the user's field of view at any time, but it does not always need to be recognized. For example, in a Fight-the-Landlord (Dou Dizhu) card game, before the cards are dealt the first augmented reality device does not need to recognize the template objects: even if the user can see a template object, the server has not yet assigned virtual content to it. Only after the cards are dealt does the first augmented reality device held by the player need to recognize the template objects. In this embodiment, therefore, the first augmented reality device performs recognition of the template objects in the real scene, that is, identifies the identification information they carry, only when it receives an interaction instruction.
In this embodiment, the template object serves mainly as a carrier of virtual content and may be any object with some information-carrying capability. To make it recognizable, each template object carries unique identification information, by which different template objects can be distinguished.
In some implementations, the template object in the real scene may be a non-electronic information carrier, such as a physical game prop: playing cards, chess pieces, a prop knife, a prop gun, and so on. Correspondingly, the identification information may be a machine-readable image placed on the template object, such as a two-dimensional code or a barcode. The faces of these props may otherwise be blank, for example blank playing cards with two-dimensional codes at the four corners, or blank chess pieces with a barcode at the center.
In other implementations, the template object in the real scene may instead be an electronic information carrier, for example a micro display screen or an electronic device that has one, such as a mobile phone or a tablet computer. Correspondingly, the identification information is the MAC address or IP address of the electronic information carrier.
After the identification information of the template object is recognized, it is sent to the server so that the server can match virtual content to the template object according to that information. The identification information thus serves to match virtual content to the template object: the server can allocate virtual content to each template object according to its identification information.
In this embodiment, augmented reality presentation is performed based on the template object and the virtual content the server assigns to it. To make template objects universal, they are given a uniform form, such as playing cards, each carrying identification information that uniquely identifies it. During augmented reality interaction, new meaning is given to a template object through virtual content. For example, when the template objects are playing cards, the server allocates virtual content to the cards in one game so that each card receives a suit and a number; in another game the server allocates virtual content afresh, and each card may receive a different suit and number than before. With only a limited number of physical cards, the user can thus present many suits and numbers as the game requires, without frequently replacing props.
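As a toy illustration of this per-game re-allocation (the deck layout, ID format, and function names are assumptions for this sketch, not from the patent), a server could shuffle a fresh suit-and-number assignment onto the same blank cards each game:

```python
import random

SUITS = ("spades", "hearts", "diamonds", "clubs")
RANKS = ("A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K")


def allocate_contents(card_ids, rng=random):
    """Assign each template-object ID a freshly shuffled (suit, rank) for one game."""
    deck = [(s, r) for s in SUITS for r in RANKS]
    rng.shuffle(deck)
    if len(card_ids) > len(deck):
        raise ValueError("more template objects than cards in the deck")
    return dict(zip(card_ids, deck))
```

Because the mapping is rebuilt for every game, the same physical card can show a different suit and number the next time it is used.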
In this embodiment, a camera, for example a depth camera, may be provided on the first augmented reality device to recognize the template object and capture the user's interactions. In step 103, the virtual content is tracked and displayed at the mapped position of the template object in the virtual scene according to the position mapping relationship between the real scene and the virtual scene, so that the virtual content moves in linkage with the user's interactions with the template object. For example, when the user pans a hand-held template object, the virtual content pans with it synchronously; when the user throws the template object out of the field of view, the virtual content disappears from the field of view. Visually, the user perceives the virtual content and the template object moving as a single whole.
In this embodiment, the server device allocates virtual content to the template object in the real scene, and the first augmented reality device presents the picture obtained by superimposing the template object with the virtual content in the virtual scene. During the game, the user controls the virtual content synchronously by manipulating the template object, obtaining a real tactile experience through the template prop and a virtual interactive experience through the virtual content. Reality and virtuality are genuinely fused, and game interactivity is greatly improved.
In the above or following embodiments, one implementation of step 100 may be:
monitoring an identification permission opening notice sent by the server;
when the identification permission starting notice is monitored, detecting whether a real scene contains the template object;
if the template object is detected, identifying the identification information it contains; if not, continuing to check whether the real scene contains the template object until it is detected.
The recognition-permission notification sent by the server may be a game-start notification, an interaction-start notification, a camera-on instruction, and so on. For example, when the first augmented reality device receives an instruction from the server to turn on the AR camera, it turns the camera on; once the camera is on, the first augmented reality device automatically recognizes the identification information of any template object in its field of view.
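The gating described above can be sketched as a small event loop. The event name `"open_recognition"` and the detector callback are assumptions invented for this sketch; the patent only requires that recognition wait for the server's notification.

```python
def recognize_when_permitted(events, detect_marker):
    """Ignore camera frames until the server's recognition-permission notice
    arrives, then return the first marker detected in a subsequent frame."""
    permitted = False
    for event in events:
        if event == "open_recognition":      # assumed name for the server notice
            permitted = True
        elif permitted:
            marker = detect_marker(event)    # here 'event' is a camera frame
            if marker is not None:
                return marker
    return None
```

Frames that arrive before the permission notice are simply discarded, matching the behavior where a visible template object is not recognized before the game starts.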
Of course, besides the above implementation, the interaction instruction may also be issued by the user, through voice, a physical key, or touch; for example, the user says "start game" aloud, or gazes at a "start" button in the virtual scene.
It should be noted that the implementation of the above-mentioned interactive instruction is merely exemplary, and should not be taken as a specific limitation to the interactive instruction of the present application. According to different practical application situations, the interactive instruction can adopt other implementation modes.
In the above or following embodiments, to give the user a sense of face-to-face interaction, the first augmented reality device may capture scene pictures and/or voice data from the real scene in real time and send them to the server, which synchronizes them to the other augmented reality devices interacting with the first. Those devices in turn receive and synchronously present the scene pictures and/or voice data fed back by the server.
In this embodiment, a lens may be provided on the first augmented reality device to capture the real scene in which the user is located. When the user interacts with other users over a connection, the other party's scene picture can be presented in the virtual scene.
To facilitate online communication between users, the first augmented reality device may further include an audio component that collects voice data in real time; voice is synchronized through the server device, enabling online conversation.
Through the first augmented reality device, the user sees the fused picture of their own template objects and virtual content. To create the experience of face-to-face interaction with other users, the virtual content corresponding to those users can be displayed synchronously in the first augmented reality device, giving multiple users the experience of sharing one virtual-real blended environment.
To this end, in the above or the following embodiments, after step 102, the method further comprises:
displaying an interactive interface, wherein the interactive interface at least comprises: an interactive control, the template object in the real scene to which the first augmented reality device belongs, the virtual content in the virtual scene, and the template objects in the real scenes to which the other augmented reality devices interacting with the first augmented reality device belong;
and responding to the user's operation of the interactive control by sending a virtual content sharing notification to the server, so that the server, according to the notification, sends the sharable virtual content to the other augmented reality devices interacting with the first augmented reality device for display.
In this embodiment, the interactive interface shows the user both their own virtual-real fused picture and the other party's. It also provides interactive controls through which the user exercises interactive control.
In practice, the interactive interface may comprise an own-side display area and an opposite-side display area. The own-side area can display the picture fusing the template objects in the user's real scene with the virtual objects in the virtual environment, as well as the interactive controls. The opposite-side area can display the opposing user's image, the template objects in the opposing user's real scene, or the sharable virtual content corresponding to the opposing user.
For example, in a Fight-the-Landlord card game, the template objects can be paper playing cards. The own-side display area shows the user's cards together with their virtual suits and numbers, as well as interactive keys such as "play" and "pass". The opposite-side display area shows the cards still in the opposing user's hand, along with the cards the opposing user has played and the virtual suits and numbers corresponding to them. Seeing the cards and suits the opponent has played, the user decides which cards to play next; after selecting, the user plays by gazing at the "play" key. At that moment, based on the cards just played, a virtual content sharing notification is sent to the server, which shares the corresponding virtual suits and numbers with the other users' augmented reality devices, so that those users see them in their own opposite-side display areas.
According to this embodiment, the sharing permission of virtual content is controlled by the user's operation of the interactive controls. Virtual content can thus be shared among users, multiple users gain the feeling of inhabiting one combined virtual-real interactive environment, and interactivity improves.
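The share flow above can be modeled as a tiny hub that fans the sharable content out to every other device in the session. The class and method names are assumptions for this sketch; the patent only specifies that the server forwards sharable content to the other interacting devices.

```python
class ShareHub:
    """Toy model of the share flow: a share notification pushes the sharable
    virtual content to every other registered device, never back to the sender."""

    def __init__(self):
        self.inboxes = {}

    def register(self, device_id):
        """Add a device to the interaction session with an empty inbox."""
        self.inboxes[device_id] = []

    def share(self, sender_id, content):
        """Deliver shared content to all devices except the one that shared it."""
        for device_id, inbox in self.inboxes.items():
            if device_id != sender_id:
                inbox.append(content)
```

A played card's suit and number would be passed as `content`; unplayed cards stay private because they are never shared.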
Fig. 2 is an augmented reality interaction method provided in another embodiment of the present application, and as shown in fig. 2, the method includes:
200. identifying identification information contained in a template object in a real scene to which the first augmented reality device belongs according to the interaction instruction;
201. sending the identification information to a server so that the server can distribute virtual content to the template object according to the identification information;
202. receiving the virtual content returned by the server;
203. acquiring position information of the template object in the real scene;
204. and tracking and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position information of the template object in the real scene and the position mapping relation between the real scene and the virtual scene.
205. And following the interactive action of the template object, performing linkage display on the virtual content.
For the description of steps 200-202, 205, reference is made to the foregoing embodiments, and the description is omitted here.
In this embodiment, to display the virtual content more accurately and obtain a better visual effect, the virtual content returned by the server is not placed at an arbitrary position in the virtual scene. Instead, the position information of the template object in the real scene is first acquired, and the mapped position of the template object in the virtual scene is then calculated from that position information and the position mapping relationship between the real scene and the virtual scene. The virtual content can then be tracked and displayed at that mapped position.
The visual effect is best when the virtual content is tracked and displayed at the mapped position of the template object in the virtual scene. For playing cards, for example, showing the suits and numbers at the four corners best matches the user's visual habits; for chess, the name of the piece is best shown at the center. In step 204, the coordinate position of the identification information in the real scene may first be obtained; then, according to that coordinate position and the position mapping relationship between the real scene and the virtual scene, the coordinate position of the identification information in the virtual scene is determined, and the virtual content is tracked and displayed there. The coordinate position of the identification information in the real scene can be computed from the placement of the identification information on the template object and the position of the template object in the real scene.
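The two-stage computation just described can be sketched as follows. The uniform scale-plus-offset form of the real-to-virtual mapping is an assumption for illustration; the patent leaves the mapping unspecified.

```python
def real_to_virtual(point, scale, offset):
    """Assumed real->virtual position mapping: a uniform scale plus a translation."""
    return tuple(p * scale + o for p, o in zip(point, offset))


def marker_position_in_virtual(object_pos, marker_offset, scale, offset):
    """Virtual-scene coordinate of the identification info: the marker's real
    position (object position + its placement on the object), mapped across."""
    real = tuple(p + d for p, d in zip(object_pos, marker_offset))
    return real_to_virtual(real, scale, offset)
```

For a playing card, `marker_offset` would be evaluated once per corner so that the suit and number land at the four corner positions.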
For a better visual experience, the display scale of the virtual content can also be adjusted as the template object moves. For example, when a displacement of the template object along the Z axis is detected, the scaling of the virtual content can be calculated from that displacement: the virtual content is enlarged when the template object moves in the positive Z direction and reduced when it moves in the negative direction, adapting to the visual change caused by the template object moving nearer to or farther from the user's eyes.
Fig. 3 is an augmented reality interaction method provided in another embodiment of the present application, and as shown in fig. 3, the method includes:
300. identifying identification information contained in a template object in a reality scene to which the first augmented reality device belongs according to the interactive instruction;
301. sending the identification information to a server so that the server can distribute virtual content to the template object according to the identification information;
302. collecting the contour features of the template object;
303. sending the outline characteristics of the template object to the server so that the server can generate a virtual object according to the outline characteristics and the virtual content;
304. receiving the virtual object returned by the server, and displaying the virtual object in a tracking manner at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene;
305. and performing linkage display on the virtual object along with the interactive action on the template object.
For the description of steps 300-301, 305, reference is made to the foregoing embodiments, and the description is omitted here.
In this embodiment, to strengthen the sense of virtualization, the server device constructs a virtual object from the contour features of the template object and the virtual content. The virtual object may be a 3D model or a two-dimensional image model. The contour features of the template object may be obtained through image recognition by an image recognition component on the first augmented reality device. For example, the first augmented reality device captures the contour features of a playing card and uploads them to the server device, which constructs a 3D card model from those features and draws the suit and number onto the model according to the template object's identification information.
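A flattened version of that construction can be sketched as below: the contour fixes the model's size, and the allocated content becomes its face. The record layout is an assumption for illustration, standing in for a real mesh-building step.

```python
def build_card_model(contour, suit, rank):
    """Assemble a flat card 'model' sized from the contour's bounding box,
    with the allocated suit and number drawn as its face."""
    if len(contour) < 3:
        raise ValueError("a contour needs at least three points")
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    return {
        "size": (max(xs) - min(xs), max(ys) - min(ys)),
        "face": "{} of {}".format(rank, suit),
    }
```

A real server would extrude or texture an actual mesh here, but the inputs are the same: contour features from the device, content chosen by identification information.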
In this embodiment, the virtual content may be tracked and displayed at the mapping position of the template object in the virtual scene according to the position information of the template object in the real scene and the position mapping relationship between the real scene and the virtual scene. When the virtual object is tracked and displayed on the template object, the virtual content can be tracked and displayed at the position of the identification information of the template object according to the manner provided in the foregoing embodiment, and can also be tracked and displayed at other positions of the virtual object as required.
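The tracking display above relies on the position mapping relation between the real scene and the virtual scene. As a minimal sketch, this mapping can be modeled as a similarity transform (rotation, translation, scale) applied to the template object's real-scene coordinates; the function name and the transform form are illustrative assumptions, not the patent's actual algorithm:

```python
def map_to_virtual(p_real, rotation, translation, scale=1.0):
    """Map a 3D point from real-scene to virtual-scene coordinates via a
    similarity transform. `rotation` is a 3x3 matrix and `translation` a
    3-vector, assumed to come from calibrating the position mapping
    relation between the two scenes (an illustrative sketch only)."""
    # Apply the rotation matrix to the real-scene point.
    rotated = [sum(rotation[i][j] * p_real[j] for j in range(3)) for i in range(3)]
    # Scale and translate into virtual-scene coordinates.
    return [scale * rotated[i] + translation[i] for i in range(3)]

IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# With an identity mapping, the virtual scene is aligned with the real one.
assert map_to_virtual([1.0, 2.0, 0.5], IDENTITY, [0.0, 0.0, 0.0]) == [1.0, 2.0, 0.5]
```

The scale factor also offers one natural place to adjust the display proportion of the virtual content as the template object moves nearer to or farther from the user's eyes.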
Certainly, the virtual object may include other model elements in addition to the virtual content, and other model elements may be rendered for the virtual object according to the attribute of the virtual object, for example, for the virtual object generated by the prop knife, after the basic 3D knife model is constructed according to the profile feature of the prop knife, according to the attribute of the prop in the game, a knife handle accessory or a knife sleeve line is added to the basic 3D knife model, so as to enrich the game picture.
In this embodiment, the virtual object generated according to the contour features of the template object and the virtual content is tracked and displayed on the template object. When a user interacts with the template object during augmented reality interaction, the user visually obtains a sense of directly performing the interaction on the virtual object in the virtual environment, which helps enhance the sense of immersion.
Fig. 4 is a flowchart of an augmented reality interaction method provided in an embodiment of the present application; the method is applicable to a server device and includes:
400. receiving identification information contained in a template object in a real scene sent by augmented reality equipment;
401. distributing virtual content to the template object according to the identification information;
402. sending the virtual content to the augmented reality equipment so that the augmented reality equipment tracks and displays the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene, and performs linkage display of the virtual content following the interaction action on the template object.
In this embodiment, during the interaction of multiple users through augmented reality devices, each user holds a template object. The template object may be a non-electronic carrier as described above, such as playing cards, in which case each user may hold a plurality of blank playing cards; or it may be an electronic carrier as described above, such as a micro-display held in each user's hand. The template object mainly serves as a bearer of the virtual content and may be any object with a certain information-carrying capability. In addition, to allow the template object to be identified, each template object has unique identification information, so that different template objects can be distinguished by their identification information. When the server terminal receives the identification information contained in a template object in the real scene sent by the augmented reality device, it allocates virtual content to that template object. The server device may allocate virtual content according to preset game rules; for example, for a card game, the server device may assign suits and numbers to playing cards according to the card-drawing order and the two-dimensional codes on the cards. The server device may also allocate virtual content according to a preset correspondence between identification information and virtual content; for example, for a prop knife, the server may assign to it the knife name corresponding to the two-dimensional code on the prop knife.
The server device may pre-store a plurality of virtual contents and, after allocating virtual content to a template object, record the correspondence between the identification information of the template object and the virtual content, so as to verify subsequent interactive actions. For example, when a user plays a card, the augmented reality device sends the identification information corresponding to the current card to the server terminal, and the server terminal can, according to the received identification information, add a "played" mark to the corresponding suit and number, or delete the correspondence directly.
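The allocation and bookkeeping described above can be sketched as a small server-side class. The class, its method names, and the card deck standing in for the pre-stored virtual contents are all illustrative assumptions, not the patent's actual data formats:

```python
class VirtualContentServer:
    """Illustrative sketch: allocate virtual content to a template object
    by its identification information, record the correspondence, and
    mark content as played when a device reports an interaction."""

    def __init__(self, deck):
        self.deck = list(deck)    # pre-stored virtual contents
        self.assigned = {}        # identification info -> virtual content
        self.played = set()       # identification info of played content

    def allocate(self, ident):
        # Allocate in order (e.g. card-drawing order); reuse an existing
        # allocation if this identification info was already seen.
        if ident not in self.assigned:
            self.assigned[ident] = self.deck.pop(0)
        return self.assigned[ident]

    def mark_played(self, ident):
        if ident in self.assigned:
            self.played.add(ident)

srv = VirtualContentServer(["spade A", "heart 7"])
assert srv.allocate("qr-001") == "spade A"   # first two-dimensional code seen
srv.mark_played("qr-001")
```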
In this embodiment, the server device allocates virtual content to the template object according to the identification information sent by the augmented reality device, and sends the virtual content to the augmented reality device, so that the augmented reality device can perform an augmented reality presentation based on the template object in the real scene and the virtual content in the virtual scene. The user thus obtains a visual effect of virtual-real fusion, and can obtain a real sense of touch and a virtual sense of view simultaneously during interaction, improving the sense of real participation and enriching the interactivity.
In the above or following embodiment, before step 401, the method further comprises:
sending an identification permission starting notice to the augmented reality equipment according to a preset rule;
when the augmented reality device detects that the template object is included in the reality scene to which the augmented reality device belongs, the identification information included in the template object in the reality scene to which the augmented reality device belongs is identified according to the interactive instruction.
The identification permission opening notification sent by the server may be a game start notification, an interaction start notification, a camera-on instruction, or the like. For example, when the augmented reality device monitors an instruction sent by the server to turn on the AR camera, it turns on the camera; once the camera is on, the augmented reality device automatically identifies the identification information of template objects in its field of view.
In this embodiment, the server controls the recognition permission of the augmented reality device according to the preset rule, preventing the augmented reality device from performing invalid recognition operations. For example, in a Fight the Landlord card game, according to the preset rule no user may learn which cards they will receive before dealing; the augmented reality device therefore performs identification of the identification information only after the server sends a game start notification to each augmented reality device, and not before.
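The permission control described here amounts to gating recognition behind a server notification. A minimal sketch, with illustrative names and a dictionary standing in for a camera scan result:

```python
class RecognitionGate:
    """Illustrative sketch: the device identifies template objects only
    after the server's identification permission opening notification
    (e.g. a game start notification) has been received."""

    def __init__(self):
        self.enabled = False

    def on_permission_notification(self):
        # Called when the server's permission opening notification arrives.
        self.enabled = True

    def identify(self, scan):
        # Before permission is opened, skip recognition entirely,
        # avoiding invalid recognition operations.
        if not self.enabled:
            return None
        return scan.get("qr_code")

gate = RecognitionGate()
assert gate.identify({"qr_code": "card-42"}) is None   # not yet permitted
gate.on_permission_notification()
assert gate.identify({"qr_code": "card-42"}) == "card-42"
```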
Fig. 5 is a flowchart of an augmented reality interaction method according to another embodiment of the present application; as shown in fig. 5, the method includes:
500. receiving identification information contained in a template object in a reality scene which the augmented reality device belongs to and sent by the augmented reality device;
501. distributing virtual content to the template object according to the identification information;
502. sending the virtual content to the augmented reality equipment so that the augmented reality equipment tracks and displays the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene, and displays the virtual content in a linkage manner along with the interaction action of the template object;
503. receiving a virtual content sharing notification sent by the augmented reality device;
504. sending the sharable virtual content to the other augmented reality equipment interacting with the augmented reality equipment for display, according to the virtual content sharing notification.
For the description of steps 500-502, reference may be made to the above embodiments, which are not repeated herein.
In this embodiment, while users communicate with each other, the server terminal may synchronize the scene picture and/or voice data of the real scene sent from the terminal side to the other augmented reality devices interacting with the augmented reality device. However, to ensure the privacy of the virtual content, not all virtual content may be presented to other augmented reality devices. For example, in a Fight the Landlord game, cards that have not been played should be visible only to the holding user; the server terminal therefore sets the unplayed suits and numbers as invisible to the other augmented reality devices, sending them only to the first augmented reality device and not to the other users' devices.
And when the user operates the interactive control in the augmented reality equipment, the virtual content sharing notification is sent to the server, and the server sends the sharable virtual content to other augmented reality equipment interacting with the augmented reality equipment according to the received virtual content sharing notification to be displayed.
For example, in a Fight the Landlord game, after the user of an augmented reality device plays two cards, the server can send the virtual suits and numbers corresponding to those two cards to the other users' augmented reality devices for display, so that the two cards become visible to all users. In this way, multiple users are simultaneously immersed in an interactive environment that combines the virtual and the real, enhancing the interactivity of the game.
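The sharing rule in this example can be sketched as a visibility filter over the server's recorded correspondences: unplayed content is visible only to its owner, while played content is shared with every interacting device. The data shapes and names below are illustrative:

```python
def contents_visible_to(owner, viewer, assigned, played):
    """Return the owner's virtual contents that `viewer` may see.
    `assigned` maps identification info -> (holder, virtual content);
    `played` is the set of identification info already played.
    Illustrative sketch of the privacy rule, not the patent's protocol."""
    return [
        content
        for ident, (holder, content) in assigned.items()
        if holder == owner and (viewer == owner or ident in played)
    ]

assigned = {"qr-1": ("alice", "spade A"), "qr-2": ("alice", "heart 7")}
played = {"qr-2"}
# Another user sees only the played card; the owner sees both.
assert contents_visible_to("alice", "bob", assigned, played) == ["heart 7"]
```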
Fig. 6 is a flowchart of an augmented reality interaction method provided in another embodiment of the present application; as shown in fig. 6, the method includes:
600. receiving identification information contained in a template object in a real scene to which the augmented reality device belongs, wherein the identification information is sent by the augmented reality device;
601. distributing virtual content to the template object according to the identification information;
602. receiving the contour feature of the template object sent by the augmented reality equipment;
603. generating a virtual object according to the outline feature and the virtual content;
604. sending the virtual object to the augmented reality equipment so that the augmented reality equipment tracks and displays the virtual object at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene, and performs linkage display of the virtual object following the interaction action on the template object.
For the description of steps 600-601, 604, reference may be made to the above embodiments, which are not described herein again.
In this embodiment, in order to enrich the screen content in the virtual environment, the server terminal may generate a virtual object according to the received contour feature of the template object and the virtual content, in addition to allocating the virtual content to the template object. The virtual object may be pre-stored in the server device, for example, corresponding virtual content is determined according to the identification information, a corresponding 3D model is determined according to the contour feature, and then the virtual content and the 3D model are combined to generate the virtual object. For the personalized requirements, the virtual object may also be constructed by the server device in real time, for example, a 3D model is constructed according to the contour features, and the virtual content is added to the 3D model to generate the virtual object; other model elements may also be rendered in the 3D model to enrich the visual perception of the virtual object. Of course, other virtual object generation methods may also be used, and the present application is not limited to this specifically.
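One way to picture the real-time generation path above is a function that derives a base model from the contour features and attaches the allocated virtual content plus optional extra model elements (such as a knife-handle accessory). The data structures below are illustrative placeholders, not the patent's actual formats:

```python
def build_virtual_object(contour, content, extra_elements=()):
    """Illustrative sketch of server-side virtual-object generation:
    derive a base model from the template object's contour features,
    attach the allocated virtual content, and render extra elements."""
    xs = [x for x, y in contour]
    ys = [y for x, y in contour]
    # A bounding box stands in for the constructed base 3D model.
    base_model = {"width": max(xs) - min(xs), "height": max(ys) - min(ys)}
    return {
        "model": base_model,
        "content": content,               # e.g. suit and number, knife name
        "elements": list(extra_elements), # e.g. handle accessory, sheath lines
    }

# A playing-card contour (roughly 63 x 88 mm) with its allocated content.
card = build_virtual_object([(0, 0), (63, 0), (63, 88), (0, 88)], "spade A")
assert card["model"] == {"width": 63, "height": 88}
```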
In this embodiment, the server device generates the virtual object according to the contour features of the template object and the virtual content. When the augmented reality device tracks and displays the virtual object on the template object, the user, while interacting with the template object, visually obtains a sense of directly performing the interactive action on the virtual object in the virtual environment, which helps enhance the sense of immersion.
Fig. 7 is an augmented reality device according to an embodiment of the present application, as shown in fig. 7, the augmented reality device includes a memory and a processor,
the memory 70 is used to store computer programs and may be configured to store various other data to support operations on the augmented reality device. Examples of such data include instructions for any application or method operating on the terminal, contact data, phonebook data, messages, pictures, videos, etc.;
the memory 70 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 71, coupled to the memory 70, for executing computer programs in the memory for:
identifying identification information contained in a template object in a real scene to which augmented reality equipment belongs according to the interactive instruction;
sending the identification information to a server so that the server can distribute virtual content to the template object according to the identification information;
receiving the virtual content returned by the server, and tracking and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene;
and following the interactive action of the template object, performing linkage display on the virtual content.
In some embodiments, the processor 71, after receiving the virtual content returned by the server and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relationship between the real scene and the virtual scene, is further configured to:
displaying an interactive interface, wherein the interactive interface at least comprises an interactive control, a template object in a real scene to which the augmented reality equipment belongs, virtual content in the virtual scene, and template objects in real scenes to which other augmented reality equipment interacting with the augmented reality equipment belongs;
and responding to the operation of the user on the interactive control, and sending a virtual content sharing notice to the server so that the server can send the sharable virtual content to other augmented reality equipment interacting with the augmented reality equipment according to the virtual content sharing notice for displaying.
In some embodiments, when identifying, according to the interactive instruction, the identification information included in the template object in the reality scene to which the augmented reality device belongs, the processor 71 is specifically configured to:
monitoring an identification permission opening notice sent by the server;
when the identification permission starting notice is monitored, detecting whether a real scene contains the template object;
if yes, identifying the identification information contained in the template object.
In some embodiments, when identifying, according to the interactive instruction, identification information included in a template object in a real scene to which the augmented reality device belongs, the processor 71 is specifically configured to:
acquiring position information of the template object in the real scene;
and displaying the virtual content at the mapping position of the template object in the virtual environment according to the position information of the template object in the real scene and the position mapping relation between the real scene and the virtual scene.
In some embodiments, processor 71 executes computer programs in memory 70 for:
acquiring the coordinate position of the identification information in a real scene;
determining the coordinate position of the identification information in the virtual scene according to the coordinate position of the identification information in the real scene and the position mapping relation between the real scene and the virtual scene;
overlaying the virtual content at the coordinate position of the identification information in the virtual scene.
In some embodiments, processor 71 executes computer programs in memory 70 for: before receiving the virtual content returned by the server,
collecting the contour features of the template object;
sending the outline characteristics of the template object to the server so that the server can generate a virtual object according to the outline characteristics and the virtual content;
the receiving the virtual content returned by the server comprises: and receiving the virtual object returned by the server.
In some embodiments, processor 71 executes computer programs in memory 70 for:
acquiring scene pictures and/or voice data in the real scene in real time;
and sending the scene picture and/or the voice data to the server so that the server synchronizes the scene picture and/or the voice data to other augmented reality equipment interacting with the augmented reality equipment.
In some embodiments, the template object is a non-electronic information carrier, and the identification information is a two-dimensional code image or a barcode image arranged on the template object; or
The template object is an electronic information carrier, and the identification information is the mac address or the IP address of the electronic information carrier.
Further, as shown in fig. 7, the augmented reality device further includes: a communication component 72, a power component 73, an audio component 74, a camera 75, and the like. Only some of the components are schematically shown in fig. 7; it is not meant that the augmented reality device includes only the components shown in fig. 7.
Wherein the communication component 72 is configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further comprises a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The power supply unit 73 supplies power to various components of the device in which the power supply unit is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio component 74 may be configured to output and/or input audio signals, among other things. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
The camera 75 may be configured to capture a scene picture and identify the identification information included in the template object.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program, where the computer program is capable of implementing the steps that can be executed by the augmented reality device in the foregoing method embodiments when executed.
Fig. 8 is a server device according to an embodiment of the present application. As shown in fig. 8, the server device includes a memory 80 and a processor 81.
The memory 80 stores computer programs and may be configured to store various other data to support operations on the server device. Examples of such data include instructions for any application or method operating on the server device, contact data, phonebook data, messages, pictures, videos, and the like.
The memory 80 is implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The processor 81 is coupled to the memory 80 for executing the computer program in the memory 80 for:
receiving identification information contained in a template object in a real scene sent by augmented reality equipment;
distributing virtual content to the template object according to the identification information;
and sending the virtual content to the augmented reality equipment so that the augmented reality equipment tracks and displays the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene, and performs linkage display on the virtual content along with the interaction action of the template object.
In some embodiments, processor 81, after sending the virtual content to the augmented reality device, is further configured to:
receiving a virtual content sharing notification sent by the augmented reality device;
and sending the sharable virtual content to other augmented reality equipment interacting with the augmented reality equipment for display according to the virtual content sharing notice.
In some embodiments, the processor 81, before receiving the identification information included in the template object in the real scene transmitted by the augmented reality device, is further configured to:
sending an identification permission starting notice to the augmented reality equipment according to a preset rule;
when the augmented reality device detects that the template object is included in the reality scene to which the augmented reality device belongs, the identification information included in the template object in the reality scene to which the augmented reality device belongs is identified according to the interactive instruction.
In some embodiments, the processor 81 executes computer programs in the memory 80 for:
before the virtual content is sent to the augmented reality equipment, receiving the contour feature of the template object sent by the augmented reality equipment;
generating a virtual object according to the outline feature and the virtual content;
the sending the virtual content to the augmented reality device includes: and sending the virtual object to the augmented reality equipment.
Further, as shown in fig. 8, the server device further includes: a communication component 82, a display 83, a power supply component 84, and the like. Only some of the components are schematically shown in fig. 8; it is not meant that the server device includes only the components shown in fig. 8.
Wherein the communication component 82 is configured to facilitate communication between the device in which the communication component is located and other devices in a wired or wireless manner. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display 83 includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP), among others. If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply 84 provides power to various components of the device in which the power supply is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program, where the computer program can implement the steps that can be executed by the server device in the foregoing method embodiments when executed.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (12)

1. An augmented reality interaction method is applicable to a first augmented reality device, and is characterized by comprising the following steps:
when an identification permission opening notification, sent by the server based on a preset rule for preventing the augmented reality device from executing invalid identification operations, is monitored, identifying, according to an interactive instruction, identification information contained in a template object in the real scene to which the first augmented reality device belongs, wherein the identification information is used for uniquely identifying the template object, and different template objects correspond to different identification information;
sending the identification information to a server so that the server allocates virtual content to the template object according to the identification information, wherein the virtual content is information borne by the template object and is different from a 3D model of the template object in a virtual scene; wherein, the server allocating virtual content to the template object according to the identification information specifically includes: the server distributes virtual content to the template object from virtual content pre-stored by the server according to the identification information and the corresponding relation between the preset identification information and the virtual content; receiving the virtual content returned by the server, and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene;
the virtual content is displayed in a linkage manner along with the interaction action of the template object, and the display proportion of the virtual content is adjusted according to the moving position of the template object so as to adapt to the visual difference caused by the distance between the template object and the human eyes;
displaying an interactive interface, wherein the interactive interface at least comprises an interactive control, a template object in a real scene to which the first augmented reality device belongs, virtual content in the virtual scene, and template objects in real scenes to which other augmented reality devices interacting with the first augmented reality device belong;
responding to the operation of the user on the interaction control, and sending a virtual content sharing notification to the server so that the server can send shareable virtual content to other augmented reality equipment interacting with the first augmented reality equipment according to the virtual content sharing notification to be displayed;
the interactive interface comprises a self display area and an opposite display area, a template object in a real scene to which a user belongs and a picture fused with a virtual object in a virtual environment are displayed in the self display area, and the self display area also displays the interactive control; and the opposite side display area displays the character picture of the opposite side user, the template object in the real scene of the opposite side user or sharable virtual content corresponding to the opposite side user.
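The server-side allocation step recited in claim 1 — selecting virtual content from pre-stored content using a preset correspondence between identification information and virtual content — amounts to a table lookup. The sketch below is an illustrative assumption, not the patented implementation; every identifier, key, and payload is made up.

```python
# Illustrative sketch (not from the patent): a server-side lookup that
# allocates virtual content to a template object by its unique
# identification information, using a preset correspondence table
# pre-stored on the server. All entries are hypothetical examples.
CONTENT_TABLE = {
    "qr:card-001": {"type": "text", "payload": "Hello from card 001"},
    "mac:aa:bb:cc:dd:ee:ff": {"type": "model", "payload": "teapot.glb"},
}

def allocate_virtual_content(identification: str):
    """Return the virtual content assigned to this template object,
    or None if the identification information is unknown."""
    return CONTENT_TABLE.get(identification)

print(allocate_virtual_content("qr:card-001"))
```

Because each template object carries unique identification information, the lookup is unambiguous; an unknown identification simply yields no content rather than an error.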
2. The augmented reality interaction method according to claim 1, wherein the identifying, according to the interactive instruction, the identification information contained in the template object in the real scene to which the first augmented reality device belongs comprises:
monitoring the identification permission starting notification sent by the server;
when the identification permission starting notification is monitored, detecting whether the template object is contained in the real scene;
and if so, identifying the identification information contained in the template object in the real scene to which the first augmented reality device belongs.
3. The augmented reality interaction method of claim 1, wherein the displaying the virtual content at the mapped position of the template object in the virtual scene comprises:
acquiring position information of the template object in the real scene;
and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position information of the template object in the real scene and the position mapping relation between the real scene and the virtual scene.
4. The augmented reality interaction method according to claim 3, wherein the displaying the virtual content at the mapping position of the template object in the virtual scene according to the position information of the template object in the real scene and the position mapping relationship between the real scene and the virtual scene comprises:
acquiring the coordinate position of the identification information in a real scene;
determining the coordinate position of the identification information in the virtual scene according to the coordinate position of the identification information in the real scene and the position mapping relation between the real scene and the virtual scene;
and displaying the virtual content at the coordinate position of the identification information in the virtual scene.
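The coordinate conversion recited in claims 3 and 4 — carrying the identification information's coordinate position in the real scene through the position mapping relation into the virtual scene — could, under a simplifying assumption, be a per-axis scale-and-offset mapping. The factors below are hypothetical, not values from the patent.

```python
# Illustrative sketch (assumption, not the claimed method): mapping a
# template object's coordinate position in the real scene to the
# corresponding coordinate position in the virtual scene, with the
# position mapping relation modeled as a per-axis affine transform.
SCALE = (0.5, 0.5, 0.5)    # hypothetical real-to-virtual scale factors
OFFSET = (1.0, 0.0, -2.0)  # hypothetical virtual-scene origin offset

def real_to_virtual(real_pos):
    """Map an (x, y, z) position in the real scene to the virtual scene."""
    return tuple(s * p + o for s, p, o in zip(SCALE, real_pos, OFFSET))

# The virtual content is then displayed at the mapped coordinate position.
print(real_to_virtual((2.0, 4.0, 6.0)))  # -> (2.0, 2.0, 1.0)
```

A real system would calibrate this mapping (for example from tracked markers), but any fixed correspondence between the two coordinate frames serves the same role the claims describe.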
5. The augmented reality interaction method of claim 1, further comprising, before the step of receiving the virtual content returned by the server:
collecting the contour features of the template object;
sending the outline characteristics of the template object to the server so that the server can generate a virtual object according to the outline characteristics and the virtual content;
the receiving the virtual content returned by the server includes: and receiving the virtual object returned by the server.
6. The augmented reality interaction method of any one of claims 1-5, further comprising:
acquiring scene pictures and/or voice data in the real scene in real time;
and sending the scene picture and/or the voice data to the server so that the server synchronizes the scene picture and/or the voice data to other augmented reality equipment interacting with the augmented reality equipment.
7. The augmented reality interaction method according to any one of claims 1 to 5, wherein the template object is a non-electronic information carrier, and the identification information is a two-dimensional code image or a barcode image provided on the template object; or
the template object is an electronic information carrier, and the identification information is the MAC address or the IP address of the electronic information carrier.
8. An augmented reality interaction method is applied to a server, and the method comprises the following steps:
receiving identification information contained in a template object in a real scene sent by augmented reality equipment, wherein the augmented reality equipment identifies, according to an interactive instruction, the identification information contained in the template object in the real scene to which the augmented reality equipment belongs when monitoring an identification permission starting notification sent by the server based on a preset rule for preventing the augmented reality equipment from executing invalid identification operations, the identification information is used for uniquely identifying the template object, and different template objects correspond to different identification information;
distributing virtual content to the template object according to the identification information, wherein the virtual content is information borne by the template object, and the virtual content is different from a 3D model of the template object in a virtual scene; wherein, the allocating virtual content to the template object according to the identification information specifically includes: distributing virtual content to the template object from pre-stored virtual content according to the identification information and the corresponding relation between the preset identification information and the virtual content;
sending the virtual content to the augmented reality equipment, so that the augmented reality equipment displays the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene, and displays the virtual content in a linkage manner along with the interaction action of the template object, wherein the display proportion of the virtual content is adjusted according to the moving position of the template object so as to adapt to the visual difference caused by the distance between the template object and the human eyes;
after the sending the virtual content to the augmented reality device, the method further includes:
receiving a virtual content sharing notification sent by the augmented reality device;
sending sharable virtual content to other augmented reality equipment interacting with the augmented reality equipment according to the virtual content sharing notification for display;
the augmented reality device is further configured to perform the steps of:
displaying an interactive interface, wherein the interactive interface at least comprises an interactive control, a template object in a reality scene to which the augmented reality device belongs, virtual content in the virtual scene, and a template object in a reality scene to which other augmented reality devices interacting with the augmented reality device belong;
the interactive interface comprises a self display area and an opposite display area, a template object in a real scene to which a user belongs and a picture fused with a virtual object in a virtual environment are displayed in the self display area, and the self display area also displays the interactive control; the opposite side display area displays figure pictures of the opposite side user, template objects in a real scene to which the opposite side user belongs or sharable virtual content corresponding to the opposite side user;
and responding to the operation of the user on the interaction control, and sending a virtual content sharing notification to the server.
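Claims 1 and 8 both recite adjusting the display proportion of the virtual content according to the template object's moving position, so that content appears smaller as the object moves away from the eyes. One simple way to realize that behavior, offered purely as an assumption (the reference distance and clamp are made-up constants, not from the patent), is an inverse-distance scale:

```python
# Illustrative sketch (assumption): the display proportion of the
# virtual content shrinks as the template object moves away from the
# human eyes and grows as it approaches. A hypothetical reference
# distance of 1 metre maps to scale 1.0.
REFERENCE_DISTANCE = 1.0  # metres; hypothetical calibration constant

def display_scale(distance: float) -> float:
    """Scale factor for virtual content at the given eye distance."""
    distance = max(distance, 0.1)  # clamp to avoid blow-up near the eye
    return REFERENCE_DISTANCE / distance

print(display_scale(2.0))  # -> 0.5
print(display_scale(0.5))  # -> 2.0
```

The inverse-distance form mirrors perspective projection, which is why it "adapts to the visual difference" the claims describe: the virtual content's apparent size tracks the apparent size of the real template object.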
9. The augmented reality interaction method according to claim 8, wherein before receiving the identification information included in the template object in the real scene sent by the augmented reality device, the method further comprises:
sending an identification permission starting notification to the augmented reality equipment according to a preset rule;
and when the augmented reality equipment detects that the template object is included in the real scene to which the augmented reality equipment belongs, the augmented reality equipment identifies, according to the identification permission starting notification, the identification information contained in the template object in the real scene to which the augmented reality equipment belongs.
10. The augmented reality interaction method of claim 8, further comprising, prior to the step of sending the virtual content to the augmented reality device:
receiving the contour feature of the template object sent by the augmented reality equipment;
generating a virtual object according to the outline feature and the virtual content;
the sending the virtual content to the augmented reality device includes: and sending the virtual object to the augmented reality equipment.
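Claim 10 has the server combine the template object's contour features with the allocated virtual content to generate a virtual object. As a hedged sketch (the record shape and field names are illustrative assumptions, not the claimed data format), that generation step can be as simple as bundling the two inputs into one renderable object:

```python
# Illustrative sketch (assumption): the server generating a virtual
# object from the template object's contour features and the allocated
# virtual content. The dict layout and field names are made up.
def generate_virtual_object(contour_features, virtual_content):
    """Bundle contour features and virtual content into a virtual object."""
    return {
        "contour": list(contour_features),  # outline sampled in the real scene
        "content": virtual_content,         # information borne by the object
    }

obj = generate_virtual_object([(0, 0), (1, 0), (1, 1)], "greeting card text")
print(obj["content"])  # -> greeting card text
```

In practice the contour would drive mesh or mask generation so the content hugs the object's outline, but the claim only requires that both inputs feed the generated virtual object.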
11. An augmented reality device comprising a memory and a processor,
the memory for storing a computer program;
the processor is configured to execute the computer program stored in the memory to:
when an identification permission starting notification sent by a server based on a preset rule for preventing the augmented reality equipment from executing invalid identification operations is monitored, identifying, according to an interactive instruction, identification information contained in a template object in the real scene to which the augmented reality equipment belongs, wherein the identification information is used for uniquely identifying the template object, and different template objects correspond to different identification information;
sending the identification information to a server so that the server allocates virtual content to the template object according to the identification information, wherein the virtual content is information borne by the template object and is different from a 3D model of the template object in a virtual scene; wherein, the server allocating virtual content to the template object according to the identification information specifically includes: the server distributes virtual content to the template object from virtual content pre-stored by the server according to the identification information and the corresponding relation between the preset identification information and the virtual content;
receiving the virtual content returned by the server, and displaying the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene;
following the interactive action of the template object, performing linkage display on the virtual content; the display scale of the virtual content is adjusted according to the moving position of the template object so as to adapt to the visual difference caused by the distance between the template object and the human eyes;
displaying an interactive interface, wherein the interactive interface at least comprises an interactive control, a template object in a real scene to which the augmented reality equipment belongs, virtual content in the virtual scene, and template objects in real scenes to which other augmented reality equipment interacting with the augmented reality equipment belongs;
responding to the operation of the user on the interaction control, and sending a virtual content sharing notification to the server so that the server sends sharable virtual content to other augmented reality equipment interacting with the augmented reality equipment according to the virtual content sharing notification for display;
the interactive interface comprises a self display area and an opposite display area, a template object in a real scene to which a user belongs and a picture fused with a virtual object in a virtual environment are displayed in the self display area, and the self display area also displays the interactive control; and the opposite side display area displays the character picture of the opposite side user, the template object in the real scene to which the opposite side user belongs or sharable virtual content corresponding to the opposite side user.
12. A server device, comprising a memory and a processor,
the memory for storing a computer program;
the processor, configured to execute a computer program stored in the memory, to:
receiving identification information contained in a template object in a real scene sent by augmented reality equipment; when monitoring an identification permission starting notification sent by the server device based on a preset rule for preventing the augmented reality equipment from executing invalid identification operations, the augmented reality equipment identifies, according to an interactive instruction, identification information contained in the template object in the real scene to which the augmented reality equipment belongs, wherein the identification information is used for uniquely identifying the template object, and different template objects correspond to different identification information;
distributing virtual content to the template object according to the identification information, wherein the virtual content is information borne by the template object, and the virtual content is different from a 3D model of the template object in a virtual scene; wherein, the allocating virtual content to the template object according to the identification information specifically includes: distributing virtual content to the template object from prestored virtual content according to the identification information and the corresponding relation between the preset identification information and the virtual content;
sending the virtual content to the augmented reality equipment so that the augmented reality equipment displays the virtual content at the mapping position of the template object in the virtual scene according to the position mapping relation between the real scene and the virtual scene, and displays the virtual content in a linkage manner along with the interaction action of the template object; the display scale of the virtual content is adjusted according to the moving position of the template object so as to adapt to the visual difference caused by the distance between the template object and the human eyes;
receiving a virtual content sharing notification sent by the augmented reality device; sending sharable virtual content to other augmented reality equipment interacting with the augmented reality equipment according to the virtual content sharing notice for displaying;
wherein the augmented reality device further performs the steps of:
displaying an interactive interface, wherein the interactive interface at least comprises an interactive control, a template object in a reality scene to which the augmented reality device belongs, virtual content in the virtual scene, and a template object in a reality scene to which other augmented reality devices interacting with the augmented reality device belong; responding to the operation of the user on the interaction control, and sending a virtual content sharing notification to the server equipment; the interactive interface comprises a self display area and an opposite display area, a template object in a real scene to which a user belongs and a picture fused with a virtual object in a virtual environment are displayed in the self display area, and the self display area also displays the interactive control; and the opposite side display area displays the character picture of the opposite side user, the template object in the real scene to which the opposite side user belongs or sharable virtual content corresponding to the opposite side user.
CN201810036004.3A 2018-01-15 2018-01-15 Augmented reality interaction method and equipment Active CN108269307B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810036004.3A CN108269307B (en) 2018-01-15 2018-01-15 Augmented reality interaction method and equipment

Publications (2)

Publication Number Publication Date
CN108269307A CN108269307A (en) 2018-07-10
CN108269307B true CN108269307B (en) 2023-04-07

Family

ID=62775688

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810036004.3A Active CN108269307B (en) 2018-01-15 2018-01-15 Augmented reality interaction method and equipment

Country Status (1)

Country Link
CN (1) CN108269307B (en)

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110858134B (en) * 2018-08-22 2023-04-28 阿里巴巴集团控股有限公司 Data, display processing method and device, electronic equipment and storage medium
CN108874156A (en) * 2018-08-30 2018-11-23 合肥虹慧达科技有限公司 Augmented reality interactive system and its application method
CN109272778B (en) * 2018-10-22 2020-07-24 广东精标科技股份有限公司 Intelligent teaching system with AR function
CN110147185B (en) * 2018-11-16 2021-02-26 腾讯科技(深圳)有限公司 Message prompting method, device, electronic device and storage medium
CN109782910B (en) * 2018-12-29 2021-04-06 北京诺亦腾科技有限公司 VR scene interaction method and device
CN111459263B (en) * 2019-01-21 2023-11-03 广东虚拟现实科技有限公司 Virtual content display method and device, terminal equipment and storage medium
CN109903129A (en) * 2019-02-18 2019-06-18 北京三快在线科技有限公司 Augmented reality display methods and device, electronic equipment, storage medium
CN111626803A (en) * 2019-02-28 2020-09-04 北京京东尚科信息技术有限公司 Method and device for customizing article virtualization and storage medium thereof
CN110084979B (en) * 2019-04-23 2022-05-10 暗物智能科技(广州)有限公司 Human-computer interaction method and device, controller and interaction equipment
WO2020219109A1 (en) * 2019-04-26 2020-10-29 Google Llc System and method for creating persistent mappings in augmented reality
CN111913560B (en) * 2019-05-07 2024-07-02 广东虚拟现实科技有限公司 Virtual content display method, device, system, terminal equipment and storage medium
CN110321002A (en) * 2019-05-09 2019-10-11 深圳报业集团控股公司 A kind of scene interaction systems and exchange method
CN112055033B (en) * 2019-06-05 2022-03-29 北京外号信息技术有限公司 Interaction method and system based on optical communication device
CN112055034B (en) * 2019-06-05 2022-03-29 北京外号信息技术有限公司 Interaction method and system based on optical communication device
CN110197532A (en) * 2019-06-05 2019-09-03 北京悉见科技有限公司 System, method, apparatus and the computer storage medium of augmented reality meeting-place arrangement
CN112565165B (en) * 2019-09-26 2022-03-29 北京外号信息技术有限公司 Interaction method and system based on optical communication device
CN110418127B (en) * 2019-07-29 2021-05-11 南京师范大学 Operation method of pixel template-based virtual-real fusion device in Web environment
CN112399125B (en) * 2019-08-19 2022-06-10 中国移动通信集团广东有限公司 Remote assistance method, device and system
CN112446799B (en) * 2019-09-03 2024-03-19 全球能源互联网研究院有限公司 Power grid dispatching method and system based on AR equipment virtual interaction
CN110720982B (en) * 2019-10-29 2021-08-06 京东方科技集团股份有限公司 Augmented reality system, control method and device based on augmented reality
CN110865708B (en) * 2019-11-14 2024-03-15 杭州网易云音乐科技有限公司 Interaction method, medium, device and computing equipment of virtual content carrier
TWI744737B (en) * 2019-12-11 2021-11-01 中華電信股份有限公司 System and method for content control in augmented reality and computer readable storage medium
CN111274910B (en) * 2020-01-16 2024-01-30 腾讯科技(深圳)有限公司 Scene interaction method and device and electronic equipment
CN111538407A (en) * 2020-03-13 2020-08-14 讯飞幻境(北京)科技有限公司 Information interaction method and device and electronic equipment
AU2021241770A1 (en) * 2020-03-23 2022-10-13 Mentar Holding AG Device and method for providing augmented reality content
CN112785716A (en) * 2020-04-07 2021-05-11 江南造船(集团)有限责任公司 Augmented reality construction guiding method, device, terminal and medium
CN111724484B (en) * 2020-06-10 2021-02-09 深圳市金研微科技有限公司 Augmented reality information interaction system and interaction method
CN111787080B (en) * 2020-06-21 2021-01-29 广东友易互联科技有限公司 Data processing method based on artificial intelligence and Internet of things interaction and cloud computing platform
CN111744180B (en) * 2020-06-29 2024-09-17 完美世界(重庆)互动科技有限公司 Method and device for loading virtual game, storage medium and electronic device
CN112561994A (en) * 2020-12-07 2021-03-26 高炼 Scene fusion positioning system and method based on virtual reality technology
CN112581630B (en) * 2020-12-08 2024-06-21 北京移目科技有限公司 User interaction method and system
CN112650390A (en) * 2020-12-22 2021-04-13 科大讯飞股份有限公司 Input method, related device and input system
CN113126770A (en) * 2021-04-30 2021-07-16 塔普翊海(上海)智能科技有限公司 Interactive three-dimensional scenery system based on augmented reality
CN113377205B (en) * 2021-07-06 2022-11-11 浙江商汤科技开发有限公司 Scene display method and device, equipment, vehicle and computer readable storage medium
CN113676456A (en) * 2021-07-22 2021-11-19 北京汉云信通技术有限公司 VOLTE-based video call method, device and system
CN113867528A (en) * 2021-09-27 2021-12-31 北京市商汤科技开发有限公司 Display method, device, equipment and computer readable storage medium
CN114020355B (en) * 2021-11-01 2024-01-30 上海米哈游天命科技有限公司 Object loading method and device based on cache space
CN114661164A (en) * 2022-04-08 2022-06-24 冠捷显示科技(厦门)有限公司 Meta-universe technology-based display equipment correlation method
CN116955835B (en) * 2023-09-21 2023-12-22 腾讯科技(深圳)有限公司 Resource screening method, device, computer equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102323985A (en) * 2011-09-08 2012-01-18 盛乐信息技术(上海)有限公司 Real and virtuality conversion system and method
CN105844714A (en) * 2016-04-12 2016-08-10 广州凡拓数字创意科技股份有限公司 Augmented reality based scenario display method and system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
MX2012010238A (en) * 2010-03-05 2013-01-18 Sony Comp Entertainment Us Maintaining multiple views on a shared stable virtual space.
US9047698B2 (en) * 2011-03-29 2015-06-02 Qualcomm Incorporated System for the rendering of shared digital interfaces relative to each user's point of view
US9122321B2 (en) * 2012-05-04 2015-09-01 Microsoft Technology Licensing, Llc Collaboration environment using see through displays
US10223859B2 (en) * 2012-10-30 2019-03-05 Bally Gaming, Inc. Augmented reality gaming eyewear
CN103366610B (en) * 2013-07-03 2015-07-22 央数文化(上海)股份有限公司 Augmented-reality-based three-dimensional interactive learning system and method
CN103561065B (en) * 2013-10-22 2017-05-24 深圳市优逸电子科技有限公司 System and method for achieving 3D virtual advertisement with mobile terminal
CN104102412B (en) * 2014-07-24 2017-12-12 央数文化(上海)股份有限公司 A kind of hand-held reading device and method thereof based on augmented reality
US9898864B2 (en) * 2015-05-28 2018-02-20 Microsoft Technology Licensing, Llc Shared tactile interaction and user safety in shared space multi-person immersive virtual reality


Also Published As

Publication number Publication date
CN108269307A (en) 2018-07-10

Similar Documents

Publication Publication Date Title
CN108269307B (en) Augmented reality interaction method and equipment
EP3615156B1 (en) Intuitive augmented reality collaboration on visual data
CN111052043B (en) Controlling external devices using a real-world interface
US10013805B2 (en) Control of enhanced communication between remote participants using augmented and virtual reality
CN108038726B (en) Article display method and device
US10692113B2 (en) Method for providing customized information through advertising in simulation environment, and associated simulation system
CN111970456B (en) Shooting control method, device, equipment and storage medium
US10521603B2 (en) Virtual reality system for providing secured information
US11979684B2 (en) Content distribution device, content distribution program, content distribution method, content display device, content display program, and content display method
CN110851095B (en) Multi-screen interactions in virtual and augmented reality
CN106873886B (en) Control method and device for stereoscopic display and electronic equipment
CN106774849B (en) Virtual reality equipment control method and device
CN106648038A (en) Method and apparatus for displaying interactive object in virtual reality
CN110751707B (en) Animation display method, animation display device, electronic equipment and storage medium
CN108401463A (en) Virtual display device, intelligent interaction method and cloud server
CN109496293A (en) Extend content display method, device, system and storage medium
KR20240072170A (en) User interactions with remote devices
WO2016095422A1 (en) Glasses, display terminal and image display processing system and method
CN105528081B (en) Mixed reality display method, device and system
CN114442814B (en) Cloud desktop display method, device, equipment and storage medium
CN103752010B (en) For the augmented reality covering of control device
US20230260235A1 (en) Information processing apparatus, information processing method, and information processing system
CN112774185A (en) Virtual card control method, device and equipment in card virtual scene
CN111782053B (en) Model editing method, device, equipment and storage medium
CN114546188B (en) Interaction method, device and equipment based on interaction interface and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Yin Zuoshui

Inventor after: Jiang Bin

Inventor after: Chi Xiaoyu

Inventor before: Yin Zuoshui

TA01 Transfer of patent application right

Effective date of registration: 20201029

Address after: 261061 north of Yuqing East Street, east of Dongming Road, Weifang High tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 266104 Laoshan Qingdao District North House Street investment service center room, Room 308, Shandong

Applicant before: GOERTEK TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information

Address after: 261061 east of Dongming Road, north of Yuqing East Street, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 261061 east of Dongming Road, north of Yuqing East Street, Weifang High tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Applicant before: GoerTek Optical Technology Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20221115

Address after: 266104 Room 308, North Investment Street Service Center, Laoshan District, Qingdao, Shandong.

Applicant after: GOERTEK TECHNOLOGY Co.,Ltd.

Address before: 261061 east of Dongming Road, north of Yuqing East Street, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Applicant before: GoerTek Optical Technology Co.,Ltd.

GR01 Patent grant