CN108762482A - Data interaction method and system between a large screen and augmented reality glasses - Google Patents
Data interaction method and system between a large screen and augmented reality glasses
- Publication number
- CN108762482A (application number CN201810338583.7A)
- Authority
- CN
- China
- Prior art keywords
- large screen
- glasses
- data
- augmented reality
- interaction
- Prior art date
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Information Transfer Between Computers (AREA)
Abstract
The invention discloses a data interaction method and system between a large screen and augmented reality glasses. Network communication is established between the augmented reality glasses and the large screen, and data communication uses a page end-server end-glasses end communication model. The server end is initialized first: its IP address and port number are obtained and set, and the forwarding function or other data processing functions are completed. The large screen and the augmented reality glasses are then initialized, the URL to be accessed by the network communication is set, and the basic communication handling functions are completed. When one client triggers an interaction event it sends a communication message, which is forwarded by the server end, and the other end processes the message accordingly. The method and system organically combine large-screen technology with augmented reality glasses technology so that the interaction better meets user needs: positions on the large screen can be selected conveniently and efficiently, views on the large screen can be controlled, and richer three-dimensional information is provided to the user, making the data presentation more vivid.
Description
Technical field
The present invention relates to the fields of data and visual analysis and human-computer interaction, and in particular to a data interaction method and system between a large screen and augmented reality glasses.
Background technology
With the development of multimedia technology and the arrival of the big-data era, virtual interactive interface technology plays an increasingly important role in information display across all industries. In the big-data era, conventional display devices are limited by their resolution and can hardly meet people's demands for the visual presentation of large-scale datasets.
Large-screen technology (also called large-screen display technology; a large screen refers to the large screen of a direct-view color television or a rear-projection television, generally with a diagonal size of 40 inches or more) offers a large display area, high brightness and high resolution. It can give users an ultra-high-definition experience and supports creating, manipulating, exploring and annotating multiple views, and is therefore better suited to the visual presentation of large-scale datasets. However, many challenges remain in how to interact with the data on a large screen remotely, and the traditional mouse and keyboard can hardly satisfy the interaction demands between people and a large screen.
In academia, common solutions for the above scenario of remote data interaction with a large screen generally include:
1. detecting the user's interaction behavior with sensors and tracking/positioning equipment to obtain the interaction data;
2. using an intelligent mobile device that communicates over the network to obtain the interaction data.
With the development of technology, mixed reality (MR, a further development of virtual reality that presents virtual scene information within the real scene and builds an interactive feedback loop among the real world, the virtual world and the user, thereby enhancing the realism of the user experience) and head-mounted devices are maturing rapidly. Augmented reality glasses devices such as Microsoft HoloLens (Microsoft's first untethered holographic computer, which lets users interact with digital content and with holograms placed in the surrounding real environment) integrate gaze, gesture and other functions that make visual interaction convenient, and have outstanding holographic display capability that can present information beyond the screen to the user more vividly. How to combine augmented reality glasses technology with a large screen, in terms of both enhancement and interaction, to provide the user with a richer and more realistic virtual interactive interface is a novel and challenging constructive problem.
Summary of the invention
In view of the deficiencies in the prior art, the purpose of the present invention is to provide a data interaction method and system between a large screen and augmented reality glasses. Addressing the existing challenges of large-screen interaction, the augmented reality glasses technology and the large screen are used to carry out data interaction, which solves the problem that a large screen is difficult to interact with: by wearing augmented reality glasses, a user can directly operate on the data on the large screen, and richer three-dimensional views can be generated so that detailed information is presented to the user more vividly.
To achieve the above object, the technical solution adopted by the present invention is as follows:
A data interaction method between a large screen and augmented reality glasses comprises the following steps:
network communication is established between the augmented reality glasses and the large screen;
the augmented reality glasses need to support network communication and contain built-in CPU and GPU units with a certain computing capability;
the network communication uses a page end-server end-glasses end communication model for data communication, and supports two-way communication between the page end and the glasses end;
the server end is initialized first: the server-end IP address and port number are obtained and set, and the forwarding function or other data processing functions are completed;
the large screen and the augmented reality glasses are then initialized as two clients respectively;
the URL that the network communication needs to access is set to the above address and port number, and the basic communication handling functions are completed;
message-sending functions and callback functions are defined for each function at the page end and the glasses end respectively, so that one of the clients can send a communication message when it triggers an interaction event; the message is forwarded by the server end, and the other end receives the message instruction and performs the corresponding processing.
Further, in the data interaction method between a large screen and augmented reality glasses described above, during data communication, when the communication data is small the server end only needs to forward messages between the two clients; when the communication data is large, considering the storage and computing capability of the glasses device, the server end may process the data from the page end before forwarding it to the glasses end.
Further, in the data interaction method between a large screen and augmented reality glasses described above, a visualization web page work realized with conventional methods is used as the page end displayed on the large screen; a back-end program, distinct from the page end and the glasses end, is created and has the function of forwarding data between multiple clients.
Further, in the data interaction method between a large screen and augmented reality glasses described above, when establishing the network communication, the visualization work and its augmented reality requirements are evaluated as a whole, the functions that need to be performed with the glasses are designed, and each message interface is designed according to the type of interaction and the data.
Further, in the data interaction method between a large screen and augmented reality glasses described above, the functions that need to be performed with the glasses include:
allowing the user to see, through the augmented reality glasses, virtual objects placed in space and then interact with the virtual objects;
converting control components in the visualization web page work into virtual control components, presenting the virtual control components through the augmented reality glasses, and then interacting with the virtual control components;
interacting with the virtual control components in the augmented reality glasses through gaze and/or gestures, and passing the data generated by these interactions to the page end through the network communication, so as to control the content displayed on the page on the large screen.
Further, in the data interaction method between a large screen and augmented reality glasses described above, the functions that need to be performed with the glasses include:
setting a virtual object in the augmented reality glasses as a virtual proxy object representing some visual graphic or object in the visualization web page work at the page end;
the user performs interactive operations on the virtual proxy object in the glasses, and the result data of the operations is passed to the page end, so that the results of the user's interaction with the glasses are reflected in the page content displayed on the large screen;
the interactive operations include clicking, dragging, rotating and scaling.
Further, in the data interaction method between a large screen and augmented reality glasses described above: first, a virtual control component is activated by gaze selection;
then, the interactive operations corresponding to the user's gestures are defined, including the basic gestures and the duration, position and angle-change information of the basic gesture during the user's actual use;
the operation data generated by the interaction is converted into a customized message format and sent to the large-screen end;
the page end of the large screen produces the corresponding interaction effect according to the message content.
Further, in the data interaction method between a large screen and augmented reality glasses described above, during interactive operation, according to the type of the interaction object: if it is a change of an attribute or state value, the values generated by these operations are recorded; if it is a movement distance, the actual operation result value is calculated from the meaning of the interaction and the spatial displacement of the gesture.
Further, in the data interaction method between a large screen and augmented reality glasses described above, when the glasses end sends a message, a message object is created that contains the identifiers of the two communicating parties, the interface name and the specific message data; the attributes of the specific message data carry the operation data generated by the interaction.
After the page end receives the message, it parses the content, extracts the interaction data, and produces the corresponding interaction effect according to the function.
Further, in the data interaction method between a large screen and augmented reality glasses described above, the functions that need to be performed with the glasses include:
direct positioning interaction with the large screen, in which the user interacts with specific positions on the large screen using gaze and gesture operations, wherein:
gaze replaces the cursor and gestures replace the left and right mouse buttons, simulating mouse operation;
a "virtual surface" fitting the large-screen surface, with the same size and shape as the large screen, is generated from the position, size and shape of the large screen; the virtual surface serves as the object that can be interacted with in the glasses, simulating the large-screen desktop that actually exists in space, and is also used to simulate the desktop on which the cursor is displayed;
the user's gaze and gesture operations on the virtual surface are equivalent to operations on the same position of the large screen;
the spatial position of the user's gaze point is obtained from the virtual surface, the coordinates of the gaze point relative to the surface are calculated from the length and width of the virtual surface, and the relative coordinates of the gaze point at the page end are thereby obtained.
A data interaction system between a large screen and augmented reality glasses comprises:
augmented reality glasses serving as the glasses end, having a network communication module and built-in CPU and GPU units, and having an initialization module for initializing the glasses end as one client;
a large screen for displaying a visualization web page work realized with conventional methods, the work serving as the page end displayed on the large screen, with an initialization module for initializing the page end as another client;
a message-sending module and a callback processing module, arranged at the page end and the glasses end respectively, for two-way communication between the two, wherein:
the message-sending module is used to create a message object containing the identifiers of the two communicating parties, the interface name and the attributes of the specific message data, the specific message data carrying the operation data generated by the interaction;
the callback processing module is used to parse the content after a message is received, extract the interaction data, and produce the corresponding interaction effect according to the function;
a server end with a communication processing module, used in the page end-server end-glasses end communication model to forward communication messages between the page end and the glasses end;
a server-end initialization module for obtaining and setting the server-end IP address and port number and completing the forwarding function or other data processing functions;
a back-end module, distinct from the page end and the glasses end, for forwarding data between multiple clients.
Further, in the data interaction system between a large screen and augmented reality glasses described above, when the communication data is small the server end only needs to forward messages between the two clients; when the communication data is large, considering the storage and computing capability of the glasses device, the server end may process the data from the page end before forwarding it to the glasses end.
Further, in the data interaction system between a large screen and augmented reality glasses described above, after the visualization work and its augmented reality requirements are evaluated as a whole, the functions that need to be performed with the glasses are designed, and each message interface is designed according to the type of interaction and the data.
Further, in the data interaction system between a large screen and augmented reality glasses described above, the functions that need to be performed with the glasses include:
allowing the user to see, through the augmented reality glasses, virtual objects placed in space and then interact with the virtual objects;
converting control components in the visualization web page work into virtual control components, presenting the virtual control components through the augmented reality glasses, and then interacting with the virtual control components;
interacting with the virtual control components in the augmented reality glasses through gaze and/or gestures, and passing the data generated by these interactions to the page end through the network communication, so as to control the content displayed on the page on the large screen.
Further, in the data interaction system between a large screen and augmented reality glasses described above, the functions that need to be performed with the glasses include:
setting a virtual object in the augmented reality glasses as a virtual proxy object representing some visual graphic or object in the visualization web page work at the page end;
the user performs interactive operations on the virtual proxy object in the glasses, and the result data of the operations is passed to the page end, so that the results of the user's interaction with the glasses are reflected in the page content displayed on the large screen;
the interactive operations include clicking, dragging, rotating and scaling.
Further, in the data interaction system between a large screen and augmented reality glasses described above: first, a virtual control component is activated by gaze selection;
then, the interactive operations corresponding to the user's gestures are defined, including the basic gestures and the duration, position and angle-change information of the basic gesture during the user's actual use;
the operation data generated by the interaction is converted into a customized message format and sent to the large-screen end;
the page end of the large screen produces the corresponding interaction effect according to the message content.
Further, in the data interaction system between a large screen and augmented reality glasses described above, during interactive operation, according to the type of the interaction object: if it is a change of an attribute or state value, the values generated by these operations are recorded; if it is a movement distance, the actual operation result value is calculated from the meaning of the interaction and the spatial displacement of the gesture.
Further, in the data interaction system between a large screen and augmented reality glasses described above, the functions that need to be performed with the glasses include:
direct positioning interaction with the large screen, in which the user interacts with specific positions on the large screen using gaze and gesture operations, wherein:
gaze replaces the cursor and gestures replace the left and right mouse buttons, simulating mouse operation;
a "virtual surface" fitting the large-screen surface, with the same size and shape as the large screen, is generated from the position, size and shape of the large screen; the virtual surface serves as the object that can be interacted with in the glasses, simulating the large-screen desktop that actually exists in space, and is also used to simulate the desktop on which the cursor is displayed;
the user's gaze and gesture operations on the virtual surface are equivalent to operations on the same position of the large screen;
the spatial position of the user's gaze point is obtained from the virtual surface, the coordinates of the gaze point relative to the surface are calculated from the length and width of the virtual surface, and the relative coordinates of the gaze point at the page end are thereby obtained.
The beneficial effects of the present invention are as follows. Large-screen technology and augmented reality glasses technology are organically combined so that the interaction technique better meets user needs; data interaction is carried out between the augmented reality glasses and the large screen, presenting detailed information to the user more vividly and solving the problem that a large screen is difficult to interact with. Interaction between the large screen and the augmented reality glasses is realized, wherein:
on the one hand, the gaze function, gesture function and other capabilities of the augmented reality glasses are used to select positions on the large screen and to control the views on the large screen conveniently and efficiently, solving the problem that a large screen is difficult to interact with;
on the other hand, the combination of the large screen and the augmented reality glasses creates an immersive interactive environment, and the holographic display capability of the augmented reality glasses is used to present three-dimensional information to the user more vividly.
The present invention makes full use of the respective advantages of augmented reality glasses and a large screen, combining the two over the network to create an immersive interactive environment. By creating virtual control components and by direct positioning interaction with the large screen, it elegantly solves the problem that the large screen is difficult to interact with. In use, the user can simultaneously observe the high-resolution content displayed on the large screen and see the virtual holographic objects at the glasses end. As a natural interaction mode, the user can also interact with the objects placed in space while controlling the behavior of the views on the large screen, which gives the user a good immersive interactive experience and helps deepen the user's impression of the data.
The present invention organically combines large-screen technology and augmented reality glasses technology so that the interaction technique better meets user needs: positions on the large screen can be selected conveniently and efficiently, views on the large screen can be controlled, and richer three-dimensional information is provided to the user, making the data presentation more vivid.
Description of the drawings
Fig. 1 is a flow chart of the data interaction method between a large screen and augmented reality glasses provided in a specific embodiment of the present invention.
Fig. 2 shows the gesture recognizable by the augmented reality glasses used in the present invention: the air-tap gesture.
Figs. 3a to 3c show the three steps of the interaction between the augmented reality glasses and the large screen:
Fig. 3a illustrates step 1,
Fig. 3b illustrates step 2,
Fig. 3c illustrates step 3.
Figs. 4a and 4b show the coordinate mapping performed by the augmented reality glasses during direct positioning interaction with the large screen.
In Fig. 4a, virtual board denotes the virtual surface, TDW is the large screen, screenheight is the height of the large screen and screenwidth is its width.
In Fig. 4b, ScreenPosition denotes the center coordinate of the virtual surface, HitPosition denotes the gaze position coordinate, and RelativePosition denotes the position vector.
Fig. 5 is a flow chart of the direct positioning interaction of the augmented reality glasses with the large screen.
Fig. 6 is a schematic diagram of a concrete scene of the interaction between the augmented reality glasses technology and the large screen for graph visualization.
Fig. 7 is a structural diagram of the data interaction system between a large screen and augmented reality glasses provided in a specific embodiment of the present invention.
Specific embodiments
The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Fig. 1 shows the flow chart of the data interaction method between a large screen and augmented reality glasses provided in a specific embodiment of the present invention. The method mainly includes:
(1) Establishing the page end-server end-glasses end communication model:
First, a visualization web page work (containing several pages) realized with conventional methods is used as the web page client displayed on the large screen (hereinafter the page end), and a back-end program distinct from the page end and from the augmented reality glasses client (i.e. the augmented reality glasses, hereinafter the glasses end) is created, with the function of forwarding data between multiple clients. A network link is established between the page end and the server end.
Second, the augmented reality glasses need to support network communication and contain built-in CPU and GPU units with a certain computing capability. The device used in the present invention is a Microsoft HoloLens, but the augmented reality glasses are not limited to it.
Third, the glasses end connects to the page end through network communication (data transmitted over a wireless signal such as WiFi); the server end forwards the communication messages between the two clients (the page end and the glasses end), thereby realizing data exchange between the augmented reality glasses and the large screen. A messaging protocol (a common network communication protocol such as HTTP or WebSocket) is determined.
Finally, considering the communication data scale and the back-end computing capability of the augmented reality glasses, a page end-server end-glasses end communication model is proposed. When the communication data is small, the server end only needs to forward messages between the two clients; when the communication data is large, considering the storage and computing capability of the glasses device, the server end may process the data from the page end before forwarding it to the glasses end. The threshold at which the data scale is regarded as large can be determined according to the actual situation or preset empirically.
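As a concrete illustration of this communication model, the following is a minimal relay-server sketch. It assumes a Node.js environment with the ws WebSocket package; the port number, the register handshake and the client identifiers ("web", "ARGlass") are illustrative assumptions, not part of the patented method.

```typescript
// Minimal relay sketch for the page end-server end-glasses end model.
// Assumes Node.js with the "ws" package; identifiers ("web", "ARGlass") are illustrative.
import { WebSocketServer, WebSocket } from "ws";

const clients = new Map<string, WebSocket>();    // registered clients by identifier
const wss = new WebSocketServer({ port: 8080 }); // server-end port number

wss.on("connection", (socket) => {
  socket.on("message", (raw) => {
    const msg = JSON.parse(raw.toString());
    if (msg.name === "register") {               // first message announces the client id
      clients.set(msg.from, socket);
      return;
    }
    // Small data: forward as-is to the target client named in "to".
    // (Large payloads could be pre-processed here before being sent on to the glasses end.)
    clients.get(msg.to)?.send(JSON.stringify(msg));
  });
});
```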
(2) Establishing network communication between the augmented reality glasses and the large screen:
The design and realization of the network communication include the following steps:
(1) Overall evaluation and design.
First, the visualization work and its augmented reality requirements are evaluated as a whole, the functions that need to be performed with the glasses (i.e. the data interactions) are designed, and each message interface is designed according to the type of interaction and the data.
The message interface can use the following JSON format: {"from": , "to": , "name": , "data": {"payload1": , "payload2": [{}, {}, ...]}}, where name is the name of the data interface and data is the payload; payload carries values such as data or control parameters, and from and to are device identifiers, used to specify the source and target explicitly when more devices are communicating. For example, when the glasses end is used to control a position-related parameter on the large screen and change its value to 10, the glasses end can send the following message to the web page: {"from":"ARGlass","to":"web","name":"ChangePosition","data":{"value":10}}, where the glasses end is identified as ARGlass, the page end is identified as web, the interface name is ChangePosition, and the transmitted data contains one variable named value whose value is 10.
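The message interface above can be captured as a small TypeScript type for reference; the field names follow the JSON format in the text, while the type itself is only an illustrative sketch.

```typescript
// Sketch of the message interface described above (field names follow the patent text).
interface InteractionMessage {
  from: string;                    // sender identifier, e.g. "ARGlass"
  to: string;                      // receiver identifier, e.g. "web"
  name: string;                    // interface name, e.g. "ChangePosition"
  data: Record<string, unknown>;   // payload: data values or control parameters
}

// The example from the text: the glasses end changes a position parameter to 10.
const example: InteractionMessage = {
  from: "ARGlass",
  to: "web",
  name: "ChangePosition",
  data: { value: 10 },
};
```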
(2) Initialization.
The server end is initialized first: the server-end IP address and port number are obtained and set, and the forwarding function or other data processing functions are completed. The large screen and the augmented reality glasses are then initialized as the two clients (the page end and the glasses end): the URL to be accessed by the network communication is set to the above address (the server-end IP address) and port number, and the basic communication handling functions such as connect, send and disconnect are completed.
(3) Interaction event response and interaction event processing.
Message-sending functions and callback functions are defined for each function at the page end and the glasses end respectively, so that one of the clients sends a communication message when it triggers an interaction event (interaction event response); the message is forwarded by the server end, and the other end receives the message instruction and performs the corresponding processing (interaction event processing).
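A hedged sketch of what such a client might look like at the page end follows, using a browser WebSocket connection: it registers callback functions by interface name and sends a message when an interaction event fires. The server URL, the register handshake and the handler names are assumptions for illustration.

```typescript
// Page-end client sketch: connect to the server end, register per-interface callbacks,
// and send a message when an interaction event is triggered.
const socket = new WebSocket("ws://192.168.1.10:8080");   // assumed server-end address and port
const handlers = new Map<string, (data: Record<string, unknown>) => void>();

socket.onopen = () =>
  socket.send(JSON.stringify({ from: "web", to: "server", name: "register", data: {} }));

socket.onmessage = (event) => {
  const msg = JSON.parse(event.data as string);
  handlers.get(msg.name)?.(msg.data);                      // callback dispatch by interface name
};

// Callback: the glasses end changed a position parameter.
handlers.set("ChangePosition", (data) => {
  console.log("new position parameter:", data.value);
});

// Sending side: a page interaction event forwards its result to the glasses end.
function sendToGlasses(name: string, data: Record<string, unknown>): void {
  socket.send(JSON.stringify({ from: "web", to: "ARGlass", name, data }));
}
```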
(3) Data interaction modes between the large screen and the augmented reality glasses (the specific interactions to be customized in the augmented reality glasses are determined according to the visualization requirements of the page end on the large screen):
(1) Interaction through virtual control components:
Interaction through virtual control components in this system means that some of the control components on the page (the page presented by the page end) are taken out separately and realized at the glasses end as virtual objects (i.e. virtual control components), so that the user no longer interacts with the controls on the large screen through mouse and keyboard, but interacts with the virtual controls seen at the glasses end through gaze, gestures and the like; the data generated by these interactions is passed to the page end through the network communication so as to control the content displayed on the page on the large screen.
The augmented reality glasses allow the user to see virtual objects placed in space (i.e. virtual control components), and the user can interact with these virtual objects through gaze, gestures and other means. Meanwhile, the original visualization page (the visualization web page work) contains many control components, such as buttons, sliders, calendar selectors and control panels, which enable the user to control the elements on the page.
(2) Interaction through a virtual proxy object: in addition to the traditional interaction modes, the present invention proposes the concept of "virtual proxy object control".
The "virtual proxy object (Virtual Proxy Object)" described in this system means: a virtual object is set in the augmented reality glasses to represent some visual graphic or object in the original visualization page (this is the virtual proxy object); the user performs interactive operations on the virtual proxy object in the glasses, such as clicking, dragging, rotating and scaling. The result data of these operations (such as position movement or size change) is passed to the page end, so that the results of the user's interaction with the glasses are reflected in the page content displayed on the large screen.
For the interaction modes corresponding to (1) and (2) above and to the content on the large screen, this system supports customizing virtual control components or virtual proxy objects for the page end in the glasses-end application, so that after putting on the glasses the user can realize the expected interaction through these virtual control components or virtual proxy objects.
Customizing virtual control components for the page end in the glasses-end application falls into three classes:
1) When the required interaction is time control, a calendar object can be drawn in the glasses application, where the date, month and time can be selected and changed interactively; after the user selects a time, the date data is passed to the page end, and the content displayed at the page end switches to the corresponding date.
2) When the required interaction is attribute control, a corresponding control component can be set according to the function and effect. If states need to be switched, a virtual button can be set in the glasses for switching between certain states; if an element attribute value needs to be changed continuously, a virtual slider can be set in the glasses. In the glasses, the data generated when the user clicks the virtual button or drags the virtual slider is passed to the page end, and the elements displayed at the page end change accordingly. For example, if the page on the large screen is a graph visualization system, a virtual slider can be set at the glasses end for interactions such as changing the repulsion and attraction between graph nodes or changing the node size, so as to change the graph layout.
3) When the required interaction is control through a virtual proxy object, the shape of the proxy object can be determined according to the shape of the original entity, and the user's operations on the virtual proxy object are likewise passed to the page end to control the behavior of the object in the page. For example, if the visualization page of the large screen contains three-dimensional data spread over the earth, a virtual sphere can be placed in the glasses; when the user rotates the virtual sphere, the earth on the page also rotates, so that the data can be seen from different angles. If the visualization page of the large screen is a graph visualization system, small beads can be used at the glasses end to represent the corresponding nodes of the graph structure on the large screen; when the user operates these beads, the page end changes accordingly, e.g. node highlighting and displacement. (An illustrative sketch follows this list.)
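As an illustration of class 3), the following page-end handler sketch applies a rotation received from the virtual-sphere proxy object to a globe element on the page. The interface name ChangeRotation, the payload fields deltaX/deltaY and the #globe element are assumptions; in the described system such a function would be registered as the callback for that interface name.

```typescript
// Page-end handler sketch for the virtual-sphere proxy object:
// a rotation delta received from the glasses end is applied to the globe element on the page.
let rotationX = 0;
let rotationY = 0;

function onChangeRotation(data: { deltaX: number; deltaY: number }): void {
  rotationX += data.deltaX;                         // accumulated rotation around the x axis
  rotationY += data.deltaY;                         // accumulated rotation around the y axis
  const globe = document.querySelector<HTMLElement>("#globe");
  if (globe) {
    globe.style.transform = `rotateX(${rotationX}deg) rotateY(${rotationY}deg)`;
  }
}
```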
(3) After putting on the augmented reality glasses, the user can control the content on the large screen through these virtual control components or virtual proxy objects. The specific steps are:
1) First, a virtual control component is activated by gaze selection (the same applies to a virtual proxy object, which is not described separately). "Gaze selection" means that the user wears the glasses and turns the head so that the virtual control component is aimed at by the crosshair at the center of the field of view; the program obtains the virtual control component the user is aiming at by ray casting, so that subsequent operations can be carried out.
2) Then, the interactive operations corresponding to the user's gestures (such as dragging, long-pressing, clicking) are defined, and the gestures are customized into different interaction meanings. Concretely, based on the basic gestures recognizable by the augmented reality glasses and on the duration, position and angle changes of a basic gesture during the user's actual operation, different meanings are assigned to these interactions. For example, in HoloLens the basic interaction gesture is Air-Tap, shown in Fig. 2, the action of extending the index finger and thumb and touching them together; the recognition and tracking of the hand movement is encapsulated in the glasses' built-in library, and the developer can obtain information such as the spatial displacement of the user's gesture directly from the program interface. With the Air-Tap gesture, a single finger tap is regarded as a click gesture; if the posture on the right of Fig. 2 is held after the tap, it is regarded as a long-press gesture; if this posture is held while the hand moves, it is regarded as a drag gesture. Assigning meanings to these interactions depends on the gesture itself and on the object and program the user is interacting with. Taking HoloLens as an example, if the user wants to rotate an object, the drag gesture can be used, and the displacement of the drag gesture is mapped in the program to the rotation angle of the object; if the user wants to move an object, the drag gesture can also be used, in which case, to avoid conflict, the meaning of the drag gesture is no longer rotating the object, and its displacement is mapped in the program to the displacement of the corresponding object. Other augmented reality glasses environments may support richer basic gestures to avoid such conflicts.
3) The operation data generated by the interaction is converted into the customized message format and sent to the large-screen end, and the page end of the large screen produces the corresponding interaction effect according to the message content. The operation data generated by the interaction is acquired as follows:
The user first performs the interactive operation. According to the type of the interaction object, if it is a change of an attribute or state value, e.g. the user clicks a button or drags a slider, the values generated by these operations are recorded; if it is a movement distance, the three-dimensional offset (x, y, z) of the hand is obtained in the program first, where x, y and z denote the displacement of the hand along the left-right, up-down and front-back axes respectively, and the actual operation result value is then calculated according to the meaning of the interaction.
Taking HoloLens as an example, when programming, the movement vector (x, y, z) needs to be converted into specific values according to the interaction. In HoloLens, both rotating and moving an object can use the drag gesture; the generated vector (x, y, z) is first normalized to obtain (x1, y1, z1). The specific calculation steps of the rotation operation and the moving operation are:
1) The rotation operation needs to calculate the rotation angle of the virtual object and define the mapping between the virtual object and the rotation of the object on the large screen. Taking a virtual sphere rotating in two directions (e.g. around x and y) as an example, the rotation around the y axis corresponds to the x1 value generated by the left-right movement of the hand, and the rotation around the x axis corresponds to the y1 value generated by the up-down movement of the hand. Concretely, an angle constant AngleConstant is defined, and (y1 * AngleConstant, x1 * AngleConstant) is calculated as the result of the user's rotation operation at the glasses end and stored as the information to be sent to the page end; the page end continuously changes the displayed orientation of the sphere according to this information, producing the rotation effect.
2) The moving operation needs to define the mapping between the movement of the virtual object and the displacement of the object on the large screen. A speed constant SpeedConstant is defined first, and (x1 * SpeedConstant, y1 * SpeedConstant, z1 * SpeedConstant) is calculated as the result of the user's translation operation at the glasses end and stored as the information to be sent to the page end; the page end continuously changes the position of the object according to this information, producing the displacement effect.
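The two calculations above can be sketched as follows (written in TypeScript purely for illustration; on HoloLens this logic would live in the glasses application). The concrete values of AngleConstant and SpeedConstant, the normalization helper and the message shapes are assumptions.

```typescript
// Glasses-end calculation sketch: a drag gesture's hand offset is turned into either a
// rotation message or a move message, following the AngleConstant/SpeedConstant mapping above.
const AngleConstant = 90;  // degrees per unit of normalized hand displacement (assumed value)
const SpeedConstant = 2;   // page-end units per unit of normalized hand displacement (assumed value)

type Vec3 = [number, number, number];

function normalize([x, y, z]: Vec3): Vec3 {
  const len = Math.hypot(x, y, z) || 1;
  return [x / len, y / len, z / len];
}

// Drag gesture interpreted as rotation: left-right movement rotates around y, up-down around x.
function rotationMessage(handOffset: Vec3) {
  const [x1, y1] = normalize(handOffset);
  return { from: "ARGlass", to: "web", name: "ChangeRotation",
           data: { deltaX: y1 * AngleConstant, deltaY: x1 * AngleConstant } };
}

// Drag gesture interpreted as translation: the normalized offset is scaled by SpeedConstant.
function moveMessage(handOffset: Vec3) {
  const [x1, y1, z1] = normalize(handOffset);
  return { from: "ARGlass", to: "web", name: "movePosition",
           data: { move: [x1 * SpeedConstant, y1 * SpeedConstant, z1 * SpeedConstant] } };
}
```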
(4) The specific steps of data communication between the glasses end and the page end are:
1) The glasses end sends a message: a message object is created that contains the identifiers of the two communicating parties, the interface name and the specific message data, where the information generated by the interaction in the previous step is carried as attributes of the specific message data (described in the network communication part). For example, the message object contains from, to, name and data attributes, and data in turn contains payload1 and payload2 attributes. The message object is serialized to JSON, producing {"from":xxx,"to":xxx,"name":xxx,"data":{"payload1":xxx,"payload2":[{xxx},{xxx},...]}}, which is passed to the page end as the customized message format.
2) The page end handles the received message: after receiving the message, the page end parses the content, extracts the interaction data, and produces the corresponding interaction effect according to the function. For example, when the user performs a moving operation on the virtual proxy object of object A on the large screen, the page end receives the message {"from":"ARGlass","to":"web","name":"movePosition","data":{"move":[xDistance,yDistance,zDistance]}}. After parsing, the page end finds the corresponding handler function according to the interface name movePosition and passes data to it; the handler then recalculates the position of object A according to the move parameters [xDistance, yDistance, zDistance]. In the end, by operating the virtual object (the virtual proxy object), the user causes object A on the large screen to be displaced, completing the moving operation.
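A hedged sketch of such a movePosition handler at the page end is given below; the element id #object-A and the pixels-per-unit scale are assumptions used only to make the example concrete.

```typescript
// Page-end handler sketch for the movePosition message described above.
const PIXELS_PER_UNIT = 50;  // assumed scale between glasses-end units and page pixels

function onMovePosition(data: { move: [number, number, number] }): void {
  const [xDistance, yDistance] = data.move;       // the z component is ignored on a 2D page
  const objectA = document.querySelector<HTMLElement>("#object-A");
  if (!objectA) return;
  const x = parseFloat(objectA.dataset.x ?? "0") + xDistance * PIXELS_PER_UNIT;
  const y = parseFloat(objectA.dataset.y ?? "0") - yDistance * PIXELS_PER_UNIT; // page y grows downward
  objectA.dataset.x = String(x);
  objectA.dataset.y = String(y);
  objectA.style.transform = `translate(${x}px, ${y}px)`;  // object A is displaced on the page
}
```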
3) In addition, because the page end and the glasses end communicate in both directions, the page end can also send information to the glasses end. This allows the glasses end to present other three-dimensional views of certain data from the large screen (the glasses end draws visualization views from the data received from the page end; views such as 3D prism maps or three-dimensional trajectories may not have enough space, or may look poor, in an ordinary desktop visualization, but the augmented reality glasses can draw them in real space in 3D, so that the user can observe and explore them in a relatively immersive way and deepen the understanding of the data). Changes made from other interaction devices to certain elements on the large screen (elements that have a corresponding three-dimensional view or related information in the glasses) can also synchronously update the views in the glasses (the source is unrestricted: it can be an operation previously sent from the glasses end, or the interaction of other devices such as keyboard and mouse). For example, when the large screen displays a graph visualization and the glasses show an enhanced three-dimensional topology consistent with the one on the large screen, if a part of the subgraph is selected and filtered on the large screen through another interaction device such as a mouse, only this part of the subgraph will correspondingly be presented or highlighted in the augmented reality glasses.
(5) Direct positioning interaction with the large screen (large-screen positioning technique):
The motivation for direct positioning interaction with the large screen is as follows. Interaction through virtual control components (the same applies to virtual proxy objects, which are not explained separately) requires the user to be clear about the required interaction functions and the objects they point to, and to manipulate them through external corresponding control units; this means the developer must know the displayed content and the interaction requirements thoroughly, and different visualization works also need different application programs with separately customized virtual control components. However, this has two drawbacks:
1) When the interaction requirements of a visualization work are complex, it is impossible to reproduce all interactive components at the glasses end.
2) When the objects that can be interacted with are numerous and complex, the user prefers to interact in a "what you see is what you get" way, e.g. pointing like a laser pointer or simply looking at an element on the screen to trigger an interaction event. The proposed direct positioning interaction with the large screen allows the user to interact with specific positions on the large screen using gaze and gesture operations, just as with a mouse: gaze replaces the cursor and gestures replace the left and right mouse buttons, and the large screen becomes the equivalent of a larger desktop. The developer therefore no longer needs to customize a large number of virtual controls for each visualization requirement.
The objective of large-screen positioning (direct positioning) is: from the real spatial position of the large screen and the position the user clicks, the relative position of the interaction object on the page at the page end is calculated, so that the object at that position can be obtained programmatically and its events triggered. In our system, if the user wants to click element A on the large screen, the user only needs to wear the glasses, aim the cursor at the center of the line of sight at A, and confirm with a click gesture; the glasses-end program transforms the position data into coordinates and sends them to the page end. The page-end program obtains the page-end coordinates (xA, yA) of A, finds object A at position (xA, yA) of the HTML page, and triggers the interaction events bound to A. This is the process by which the user interacts with the large screen through positioning.
The technical foundation of the large-screen positioning proposed in the present invention relies on three characteristics of augmented reality glasses:
1) The position of a virtual object can be adjusted by dragging it in space.
2) Once the user gazes at a virtual object, the coordinates of the point where the gaze ray meets the surface of the virtual object (hereinafter the gaze point) can be obtained.
3) The glasses have scene-construction capability for the surrounding space, so the positions of virtual objects in space can be stored. Taking HoloLens as an example, the built-in simultaneous localization and mapping (SLAM) system can reconstruct the three-dimensional space from the depth data of the images captured by the cameras; during use, a function call can enable the detection function to scan the surrounding space. The system can scan and save a three-dimensional mesh model of the floor, walls and other facilities of the room, establish a coordinate system of the user's space based on this model, and store the relative position of a virtual object in the form of a spatial anchor (World Anchor), so that once the saved room model is recognized when the application is restarted, all virtual objects reappear at their previous positions. This characteristic plays an important role in determining and keeping the spatial position of the floating large screen.
The large-screen positioning technique is divided into two steps: first, positioning the large screen itself; second, positioning the page elements on the large screen.
1) Positioning the large screen itself:
Positioning the large screen itself means determining the position (center coordinate), size (length and width) and shape (curved or flat) of the large screen. The program uses this information to generate a "virtual surface" fitting the large-screen surface, with the same size and shape as the large screen.
The "virtual surface (Virtual Surface)" concept proposed in the present invention is a thin, filled rectangular sheet shaped like the large screen, with a colored border and a transparent interior (the border indicates size and position, the transparent interior avoids occluding the large screen, and the solid geometry is used for collision detection between the gaze ray and the object). It serves as the object that can be interacted with in the glasses, simulating the large-screen desktop that actually exists in space. The user's gaze and gesture operations on the virtual surface are equivalent to operations on the same position of the large screen; the virtual surface can therefore be used to obtain the spatial position of the user's gaze point, and the coordinates of the gaze point relative to the surface can be calculated from the length and width of the virtual surface, thereby obtaining the relative coordinates of the gaze point at the page end.
Two positioning methods for a flat large screen are enumerated in the present invention:
a) Using the ImageTarget method of the augmented reality tool Vuforia, four picture markers are placed at the four corners of the large-screen edge (or of the edge of an opened web page). When the program starts it performs image detection on the markers; a virtual vertex is generated at each detected image and its position recorded; once the four vertices are detected, a planar object with these four points as vertices is created in the program as the virtual surface. The position of the virtual-surface object is then saved with the spatial anchor mechanism described above. This method does not require additionally storing the size of the virtual-surface object.
b) As shown in Figs. 4a and 4b, a planar object is first placed in space as the virtual surface, and its size, position and direction are then adjusted manually. This method relies on the spatial model built by SLAM and also requires additionally storing the size of the virtual-surface object. By enabling the scene-scanning function, the three-dimensional mesh model of the region where the large screen is located can be scanned and stored as a spatial surface model; the virtual surface is then fitted onto the scanned large-screen surface through direction correction, and the size and position of the virtual surface are manually adjusted until the virtual edge coincides with the edge of the large screen. Coarse position adjustment can use the Tap and Place function provided in the HoloLens development tools, whose purpose is to place objects: after the object to be placed is clicked, the scanning mode is enabled and the object can be placed directly at the position where the gaze ray intersects the scanned spatial model, which greatly saves the time of adjusting the orientation. Fine position adjustment can use the drag function.
The virtual-surface coordinate parameters obtained from the above operations are: the center coordinate, denoted (positionX, positionY, positionZ); the length and width, denoted height and width; and the local coordinate-system vectors (x, y, z).
2) Positioning the page elements on the large screen:
The user's gaze at a position on the large screen corresponds to gazing at the virtual surface. After the user aims at a position and clicks with a gesture, the obtained three-dimensional gaze point is mapped to a two-dimensional coordinate percentage on the page and sent to the page end for processing; after receiving the message, the page end triggers the response events of the element at the corresponding position.
Taking the most common vertically mounted flat large screen as an example, the intersection of the gaze ray with the virtual surface is the gaze position (gazeX, gazeY, gazeZ). Given the virtual-surface center coordinate (positionX, positionY, positionZ) and its local coordinate-system vectors (x, y, z), let v denote the vector from the center point (positionX, positionY, positionZ) to the gaze point (gazeX, gazeY, gazeZ). Since the thickness of the virtual surface can be neglected, the three-dimensional to two-dimensional mapping proceeds as follows:
a) The dot products v·x and v·y are used to represent, respectively, the horizontal offset dx and the vertical offset dy of the gaze position relative to the center of the surface.
b) Using the height and width of the virtual surface, the coordinates of the gaze point relative to the upper-left corner of the virtual surface are calculated; dividing these relative coordinates by the length and width gives the relative-position percentages (0.5 + dx/width, 0.5 - dy/height), denoted (xRatio, yRatio), which represent the coordinate percentages (relative to the upper-left corner) of the page elements on the page in the browser (in full-screen state).
c) The gaze-position percentage parameters (xRatio, yRatio) are packed into the message format as data and sent to the page end through the network communication; the interface can be named ClickPosition. An example message is: {"from":"ARGlass","to":"web","name":"ClickPosition","data":{"xRatio":0.5,"yRatio":0.5}}, meaning the gaze-position percentage is (0.5, 0.5).
d) After parsing, the page end obtains xRatio and yRatio and, using the height H and width W of the web page (whose measurement unit differs from that of the large screen: the web page is measured in pixels, px), calculates the page-end pixel position (xRatio·W, yRatio·H) of this coordinate. The HTML element at this position is then obtained in code: if there is no element, the click is treated as clicking a blank region; if there is an element, whether the element is bound to an event responding to the operation is checked, and the event is triggered if so, e.g. the element is highlighted after being clicked.
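The mapping in steps a) to d) can be sketched end to end as follows: the first function computes (xRatio, yRatio) from the gaze point and the virtual-surface parameters, and the second resolves and triggers the page element at that percentage position. It is written in TypeScript for illustration; on HoloLens the first part would run in the glasses application, and the event-dispatch details are assumptions.

```typescript
// Sketch of the 3D-to-2D mapping and the page-end click resolution described above.
type Vec3 = [number, number, number];
const dot = (a: Vec3, b: Vec3) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Glasses side: gaze point and virtual-surface parameters -> (xRatio, yRatio).
function gazeToRatio(
  gaze: Vec3, center: Vec3,
  xAxis: Vec3, yAxis: Vec3,              // the surface's local coordinate-system vectors
  width: number, height: number,
) {
  const v: Vec3 = [gaze[0] - center[0], gaze[1] - center[1], gaze[2] - center[2]];
  const dx = dot(v, xAxis);              // horizontal offset from the surface center
  const dy = dot(v, yAxis);              // vertical offset from the surface center
  return { xRatio: 0.5 + dx / width, yRatio: 0.5 - dy / height };
}

// Page end: ClickPosition handler resolving the element at the percentage position.
function onClickPosition(data: { xRatio: number; yRatio: number }): void {
  const x = data.xRatio * window.innerWidth;   // page width W in pixels (full-screen browser)
  const y = data.yRatio * window.innerHeight;  // page height H in pixels
  const element = document.elementFromPoint(x, y);
  if (element) {
    element.dispatchEvent(new MouseEvent("click", { bubbles: true, clientX: x, clientY: y }));
  }                                            // no element: treated as clicking a blank region
}
```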
An example of direct positioning interaction with the large screen: a circle that is highlighted when selected is located at the (10%, 10%) position (relative to the upper-left corner) of the large-screen page. When the user gazes at this object through the glasses and performs a click operation, the positioning calculation is carried out and the glasses end sends {"from":"ARGlass","to":"web","name":"ClickPosition","data":{"xRatio":0.105,"yRatio":0.098}} to the web end for processing; within a reasonable error tolerance and the click-response region of the object, this object is selected and highlighted.
A specific embodiment is described below:
Fig. 1 shows the flow chart of a specific implementation of the interaction technique between augmented reality glasses and a large screen. First, the augmented reality glasses need to support network communication and have a certain computing capability, so that they can connect to the page end through the server end; the messaging protocol format is defined to complete the mutual communication. Then, for the required interactions, control components or three-dimensional display components are customized in the glasses-end application. Given the data, the user can control the display effect on the large screen either by interacting directly with the components in the glasses, or by interacting with the large screen directly through gaze and gestures; at the same time, changes made from other interaction devices to certain elements on the large screen are also synchronously updated in the glasses.
Fig. 2 shows the gesture used for interaction in the augmented reality glasses HoloLens in the present invention: the air-tap gesture, made by extending the index finger and thumb, first apart and then brought together. This gesture can be used to interact with holographic objects and simulates a mouse click; based on it, various semantics such as long-press, move, drag and scale can be formed. The concrete operation methods of these gesture semantics have been described in the main text above.
Figs. 3a to 3c show the three steps of the interaction between the augmented reality glasses and the large screen. (Fig. 3a) Step 1: virtual control components that can be interacted with are built into the augmented reality glasses and can manipulate the display on the large screen. (Fig. 3b) Step 2: gaze and gesture detection combined with spatial information directly position elements on the large screen. (Fig. 3c) Step 3: according to the data from the page end, various visualization views can be arranged at the glasses end to enhance the data display.
Figs. 4a and 4b show one way in which the augmented reality glasses create and use the virtual surface for direct positioning interaction with the large screen in the present invention. (Fig. 4a) The augmented reality glasses first position the large screen and determine its position, size and orientation; a virtual surface is then created that is aligned and fitted to the large screen with the same size, used in place of the large screen as the object of the interactive operations. (Fig. 4b) Facing the large screen, the user selects the seen elements through gaze and gestures; the position the user aims at is mapped from three dimensions to the two-dimensional plane and sent to the page end.
Fig. 5 shows the flow chart of the direct positioning interaction with the large screen using the augmented reality glasses in the present invention. Taking the Microsoft HoloLens environment as an example, the edge positions of the large screen are located through the spatial mapping and simultaneous localization and mapping (SLAM) functions built into the HoloLens. Once the position of the large screen has been determined, a virtual screen panel with a certain thickness and regular shape is generated and placed at the corresponding position, simulating the real screen at the glasses end; its purpose is to provide an object carrier for the interaction, so that positioning-based operations can be integrated on this object. Using a ray cast along the line of sight, the x, y coordinates of the gaze position on the virtual panel can be obtained, and these coordinates simultaneously represent the coordinates of the page elements (in full-screen state) in the large-screen browser. The elements thus selected on the large screen can then be interacted with accordingly.
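The coordinate mapping at the core of this step can be sketched as follows, assuming the virtual panel corners and the gaze hit point are available as plain 3D vectors (on the HoloLens the hit point would come from the gaze ray cast against the virtual panel); the function and vector names are hypothetical.

```typescript
// Map a gaze hit point on the virtual screen panel to page-relative ratios.
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

// Project the hit point onto the panel's width and height edges to obtain
// ratio coordinates relative to the upper left corner of the panel.
function gazeToPageRatio(hit: Vec3, topLeft: Vec3, topRight: Vec3, bottomLeft: Vec3) {
  const right = sub(topRight, topLeft);    // edge along the panel width
  const down = sub(bottomLeft, topLeft);   // edge along the panel height
  const rel = sub(hit, topLeft);
  return {
    xRatio: dot(rel, right) / dot(right, right),  // 0..1 across the width
    yRatio: dot(rel, down) / dot(down, down),     // 0..1 down the height
  };
}
// In full-screen state these ratios coincide with the page-element ratios,
// so they can be sent to the page end as the ClickPosition payload.
```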
Fig. 6 is a schematic diagram of a graph visualization in the interaction technique between the augmented reality glasses and the large screen of the present invention. The figure shows the force-directed graph at the page end and the display effect of a virtual three-dimensional force-directed subgraph in the augmented reality glasses. The large screen can be manipulated through the buttons on the sand table below and the control panel on the right, or through direct positioning interaction with the large screen. The interactions include selecting a node or edge of the graph, highlighting it, changing the size of a node or the length of an edge, dragging, and so on.
Corresponding to the method shown in Fig. 1, an embodiment of the present invention further provides a data interaction system between a large screen and augmented reality glasses. As shown in Fig. 7, the system comprises:
the augmented reality glasses, serving as the glasses end, which have a network communication module and built-in CPU and GPU units, and an initialization module that initializes the glasses end as one client;
the large screen, used to display a visualization web page work produced by conventional methods, the work serving as the page end displayed on the large screen and having an initialization module that initializes the page end as another client; after an overall evaluation of the visualization work and its augmented reality requirements, the functions that need to be performed through the glasses are designed, and each message interface is designed according to the type of interaction and the data;
a message transmission module and a callback processing module, arranged at the page end and the glasses end respectively, for the two-way communication between the two, wherein: the message transmission module is used to construct a message object comprising the identifiers of the two communicating parties, the interface name and the specific message data as attributes, the specific message data being used to carry the operation data produced by the interaction; the callback processing module is used to parse the content after a message is received, extract the interaction data and produce the corresponding interaction effect according to the function (a sketch of these two modules is given after this list);
a server side with a communication processing module, which forwards the communication messages between the page end and the glasses end in the page end-server side-glasses end communication model; when the amount of communication data is small, the server side only needs to forward messages between the two clients, and when the data volume is large, considering the storage and computing capability of the glasses device, the server side can process the data from the page end before relaying it to the glasses end;
a server-side initialization module, used to obtain and set the server-side IP address and port number and to complete the forwarding function or other data processing functions;
and a back-end module, distinct from the page end and the glasses end, for forwarding data between the multiple clients.
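The message transmission and callback processing modules can be sketched minimally as follows; the handler names ("HighlightNode", "UpdateView") are hypothetical interface names used only to illustrate dispatch by interface name, not interfaces defined by the patent.

```typescript
// Hypothetical message transmission and callback processing modules.
interface Message {
  from: string;                        // identifier of the sending party
  to: string;                          // identifier of the receiving party
  name: string;                        // interface name
  data: Record<string, unknown>;       // operation data produced by the interaction
}

// Message transmission module: build a message object and send it.
function sendMessage(socket: WebSocket, msg: Message): void {
  socket.send(JSON.stringify(msg));
}

// Callback processing module: parse the content, take out the interaction
// data, and produce the corresponding interaction effect per function.
const handlers: Record<string, (data: Record<string, unknown>) => void> = {
  HighlightNode: (data) => { /* highlight the node identified in data */ },
  UpdateView:    (data) => { /* redraw the view with the values in data */ },
};

function onMessage(raw: string): void {
  const msg: Message = JSON.parse(raw);
  handlers[msg.name]?.(msg.data);
}
```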
Further, the functions that need to be performed through the glasses include:
allowing the user to see, through the augmented reality glasses, virtual objects placed in space, and then to interact with the virtual objects;
converting the control components in the visualization web page work into virtual control components, presenting the virtual control components through the augmented reality glasses, and then interacting with the virtual control components;
realizing the interaction with the virtual control components in the augmented reality glasses through gaze and/or gestures, and passing the data produced by these interactions to the page end via network communication, so as to control the content displayed in the page on the large screen.
For example: first, a virtual control component is selected and activated through gaze; then the interactive operations corresponding to the user's various gestures are defined, a gesture comprising a basic gesture together with the change information in duration, position and angle of that basic gesture during actual use; the operation data produced by the interaction is converted into the customized message format and sent to the large-screen end; and the page end of the large screen produces the corresponding interaction effect according to the message content.
During an interactive operation, depending on the type of the interacted object, if the operation changes an attribute or state value, the numerical values produced by these operations are recorded; if it is a movement distance, the actual operation result value is calculated from the interactive meaning and the spatial displacement of the gesture, as sketched below.
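A minimal sketch of these two cases follows, assuming a hypothetical "GestureOperation" interface name and a metersPerUnit scale that converts the gesture's spatial displacement into an operation value; both names are illustrative, not defined by the patent.

```typescript
// Turn a gesture operation into the customized message format, distinguishing
// attribute/state changes from movement distances.
type GestureResult =
  | { kind: "attribute"; property: string; value: number }            // changed attribute or state value
  | { kind: "displacement"; meters: number; metersPerUnit: number };  // spatial movement distance

function toMessage(result: GestureResult) {
  const data =
    result.kind === "attribute"
      ? { property: result.property, value: result.value }   // record the value the operation produced
      : { delta: result.meters / result.metersPerUnit };     // compute the actual operation result value
  return { from: "ARGlass", to: "web", name: "GestureOperation", data };
}
```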
In another alternative, the functions that need to be performed through the glasses include:
setting up a virtual object in the augmented reality glasses as a virtual substitute object representing a visual graphic or object in the visualization web page work at the page end;
the user performs interactive operations on the virtual substitute object in the glasses, and the result data produced by the operations is passed to the page end, so that the result of the user's interaction through the glasses is reflected in the page content displayed on the large screen;
the interactive operations include clicking, dragging, rotating and scaling.
In yet another alternative, the functions that need to be performed through the glasses include:
direct positioning interaction with the large screen, in which the user interacts with a specific position on the large screen through gaze and gesture operations, wherein:
gaze replaces the cursor and gestures replace the left and right mouse buttons, so that mouse operations are simulated;
a "virtual surface" fitted to the large screen surface, with the same size and shape as the large screen, is generated from the position, size and shape of the large screen; the virtual surface serves as the operated object that can be interacted with in the glasses, simulating the desktop of the large screen that actually exists in space, and is also used as the desktop on which the cursor is displayed;
the user's gaze and gesture operations on the virtual surface are equivalent to operations on the same position of the large screen;
the spatial position of the user's gaze point is obtained through the virtual surface, the coordinates of the gaze point relative to the surface are calculated from the length and width of the virtual surface, and the relative coordinates of the gaze point at the page end are thereby obtained.
In conclusion the present invention makes full use of the respective advantage of augmented reality glasses and large screen, by network increasing
Strong Reality glasses and large screen are combined together, and create the interactive environment of immersion.By creating virtual controlling component and right
The direct positioning interaction of large screen cleverly solves the problems, such as to be difficult to and large-screen interactive.In use, can ensure to use
The real scene at large screen end, and it can be seen that the holographic object of glasses end virtually are observed simultaneously in family.
Augmented reality glasses are independent, light, are integrated with the mobile device of detection function.Showed based on its hologram
Function, augmented reality glasses complete with large-screen interactive during, some information can also be enhanced in reality, we
Interested part can be extracted from the visualization on screen to be rendered to holographic object and be placed in reality so that real space
In can fill more rich visual element, to enhance original visualization;Second is that it can be used as a kind of natural interactive mode,
It while controlling the behavior of large screen top view, can also interact, be provided to user good with the object placed in space
Good immersion interactive experience contributes to user to deepen the impression to data.Therefore the present invention is one novel, and is had
The work of constructive meaning.
Obviously, those skilled in the art can make various changes and modifications to the invention without departing from the spirit and scope of the invention. Accordingly, if these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to include them as well.
Claims (18)
1. A data interaction method between a large screen and augmented reality glasses, comprising the following steps:
establishing network communication between the augmented reality glasses and the large screen, the augmented reality glasses being required to support network communication and to have built-in CPU and GPU units with a certain computing capability;
the network communication carrying out data communication using a page end-server side-glasses end communication model, with two-way communication between the page end and the glasses end;
first initializing the server side, obtaining and setting the server-side IP address and port number, and completing the forwarding function or other data processing functions;
then initializing the large screen and the augmented reality glasses respectively as two clients;
setting the url that the network communication needs to access to the above address and port number, and completing the basic communication processing functions;
defining, at the page end and the glasses end respectively, message sending functions and callback functions for each function, so that one of the clients sends a communication message when an interaction event is triggered, the message is forwarded via the server side, and the other end receives the message instruction and performs the corresponding processing.
2. The data interaction method between a large screen and augmented reality glasses according to claim 1, characterized in that: during data communication, when the amount of communication data is small, the server side only needs to forward messages between the two clients; when the data volume is large, considering the storage and computing capability of the glasses device, the server side can process the data from the page end before relaying it to the glasses end.
3. The data interaction method between a large screen and augmented reality glasses according to claim 1, characterized in that: a visualization web page work produced by conventional methods is used as the page end displayed on the large screen; and a back-end program, distinct from the page end and the glasses end, is created and given the function of forwarding data between the multiple clients.
4. The data interaction method between a large screen and augmented reality glasses according to claim 3, characterized in that: when establishing the network communication, an overall evaluation is made of the visualization work and its augmented reality requirements, the functions that need to be performed through the glasses are designed, and each message interface is designed according to the type of interaction and the data.
5. The data interaction method between a large screen and augmented reality glasses according to claim 4, characterized in that the functions that need to be performed through the glasses include:
allowing the user to see, through the augmented reality glasses, virtual objects placed in space, and then to interact with the virtual objects;
converting the control components in the visualization web page work into virtual control components, presenting the virtual control components through the augmented reality glasses, and then interacting with the virtual control components;
realizing the interaction with the virtual control components in the augmented reality glasses through gaze and/or gestures, and passing the data produced by these interactions to the page end via network communication, so as to control the content displayed in the page on the large screen.
6. The data interaction method between a large screen and augmented reality glasses according to claim 4, characterized in that the functions that need to be performed through the glasses include:
setting up a virtual object in the augmented reality glasses as a virtual substitute object representing a visual graphic or object in the visualization web page work at the page end;
the user performing interactive operations on the virtual substitute object in the glasses, the result data produced by the operations being passed to the page end, so that the result of the user's interaction through the glasses is reflected in the page content displayed on the large screen;
the interactive operations including clicking, dragging, rotating and scaling.
7. The data interaction method between a large screen and augmented reality glasses according to claim 5, characterized in that:
first, a virtual control component is selected and activated through gaze;
then the interactive operations corresponding to the user's various gestures are defined, a gesture comprising a basic gesture together with the change information in duration, position and angle of that basic gesture during actual use;
the operation data produced by the interaction is converted into the customized message format and sent to the large-screen end;
and the page end of the large screen produces the corresponding interaction effect according to the message content.
8. The data interaction method between a large screen and augmented reality glasses according to claim 7, characterized in that: during an interactive operation, depending on the type of the interacted object, if the operation changes an attribute or state value, the numerical values produced by these operations are recorded; if it is a movement distance, the actual operation result value is calculated from the interactive meaning and the spatial displacement of the gesture.
9. The data interaction method between a large screen and augmented reality glasses according to claim 1, characterized in that: when the glasses end sends a message, a message object is constructed comprising the identifiers of the two communicating parties, the interface name and the specific message data as attributes, the specific message data being used to carry the operation data produced by the interaction; after the page end receives and processes the message, it parses the content, extracts the interaction data and produces the corresponding interaction effect according to the function.
10. The data interaction method between a large screen and augmented reality glasses according to claim 4, characterized in that the functions that need to be performed through the glasses include:
direct positioning interaction with the large screen, in which the user interacts with a specific position on the large screen through gaze and gesture operations, wherein:
gaze replaces the cursor and gestures replace the left and right mouse buttons, so that mouse operations are simulated;
a "virtual surface" fitted to the large screen surface, with the same size and shape as the large screen, is generated from the position, size and shape of the large screen; the virtual surface serves as the operated object that can be interacted with in the glasses, simulating the desktop of the large screen that actually exists in space, and is also used as the desktop on which the cursor is displayed;
the user's gaze and gesture operations on the virtual surface are equivalent to operations on the same position of the large screen;
and the spatial position of the user's gaze point is obtained through the virtual surface, the coordinates of the gaze point relative to the surface are calculated from the length and width of the virtual surface, and the relative coordinates of the gaze point at the page end are thereby obtained.
11. A data interaction system between a large screen and augmented reality glasses, comprising:
the augmented reality glasses, serving as the glasses end, which have a network communication module and built-in CPU and GPU units, and an initialization module that initializes the glasses end as one client;
the large screen, used to display a visualization web page work produced by conventional methods, the work serving as the page end displayed on the large screen and having an initialization module that initializes the page end as another client;
a message transmission module and a callback processing module, arranged at the page end and the glasses end respectively, for the two-way communication between the two, wherein:
the message transmission module is used to construct a message object comprising the identifiers of the two communicating parties, the interface name and the specific message data as attributes, the specific message data being used to carry the operation data produced by the interaction,
and the callback processing module is used to parse the content after a message is received, extract the interaction data and produce the corresponding interaction effect according to the function;
a server side with a communication processing module, which forwards the communication messages between the page end and the glasses end in the page end-server side-glasses end communication model;
a server-side initialization module, used to obtain and set the server-side IP address and port number and to complete the forwarding function or other data processing functions;
and a back-end module, distinct from the page end and the glasses end, for forwarding data between the multiple clients.
12. The data interaction system between a large screen and augmented reality glasses according to claim 11, characterized in that: when the amount of communication data is small, the server side only needs to forward messages between the two clients; when the data volume is large, considering the storage and computing capability of the glasses device, the server side can process the data from the page end before relaying it to the glasses end.
13. The data interaction system between a large screen and augmented reality glasses according to claim 11, characterized in that: after an overall evaluation of the visualization work and its augmented reality requirements, the functions that need to be performed through the glasses are designed, and each message interface is designed according to the type of interaction and the data.
14. The data interaction system between a large screen and augmented reality glasses according to claim 13, characterized in that the functions that need to be performed through the glasses include:
allowing the user to see, through the augmented reality glasses, virtual objects placed in space, and then to interact with the virtual objects;
converting the control components in the visualization web page work into virtual control components, presenting the virtual control components through the augmented reality glasses, and then interacting with the virtual control components;
realizing the interaction with the virtual control components in the augmented reality glasses through gaze and/or gestures, and passing the data produced by these interactions to the page end via network communication, so as to control the content displayed in the page on the large screen.
15. The data interaction system between a large screen and augmented reality glasses according to claim 13, characterized in that the functions that need to be performed through the glasses include:
setting up a virtual object in the augmented reality glasses as a virtual substitute object representing a visual graphic or object in the visualization web page work at the page end;
the user performing interactive operations on the virtual substitute object in the glasses, the result data produced by the operations being passed to the page end, so that the result of the user's interaction through the glasses is reflected in the page content displayed on the large screen;
the interactive operations including clicking, dragging, rotating and scaling.
16. The data interaction system between a large screen and augmented reality glasses according to claim 14, characterized in that:
first, a virtual control component is selected and activated through gaze;
then the interactive operations corresponding to the user's various gestures are defined, a gesture comprising a basic gesture together with the change information in duration, position and angle of that basic gesture during actual use;
the operation data produced by the interaction is converted into the customized message format and sent to the large-screen end;
and the page end of the large screen produces the corresponding interaction effect according to the message content.
17. The data interaction system between a large screen and augmented reality glasses according to claim 16, characterized in that: during an interactive operation, depending on the type of the interacted object, if the operation changes an attribute or state value, the numerical values produced by these operations are recorded; if it is a movement distance, the actual operation result value is calculated from the interactive meaning and the spatial displacement of the gesture.
18. The data interaction system between a large screen and augmented reality glasses according to claim 13, characterized in that the functions that need to be performed through the glasses include:
direct positioning interaction with the large screen, in which the user interacts with a specific position on the large screen through gaze and gesture operations, wherein:
gaze replaces the cursor and gestures replace the left and right mouse buttons, so that mouse operations are simulated;
a "virtual surface" fitted to the large screen surface, with the same size and shape as the large screen, is generated from the position, size and shape of the large screen; the virtual surface serves as the operated object that can be interacted with in the glasses, simulating the desktop of the large screen that actually exists in space, and is also used as the desktop on which the cursor is displayed;
the user's gaze and gesture operations on the virtual surface are equivalent to operations on the same position of the large screen;
and the spatial position of the user's gaze point is obtained through the virtual surface, the coordinates of the gaze point relative to the surface are calculated from the length and width of the virtual surface, and the relative coordinates of the gaze point at the page end are thereby obtained.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810338583.7A CN108762482B (en) | 2018-04-16 | 2018-04-16 | Data interaction method and system between large screen and augmented reality glasses |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108762482A true CN108762482A (en) | 2018-11-06 |
CN108762482B CN108762482B (en) | 2021-05-28 |
Family
ID=64010612
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810338583.7A Active CN108762482B (en) | 2018-04-16 | 2018-04-16 | Data interaction method and system between large screen and augmented reality glasses |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108762482B (en) |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009054619A2 (en) * | 2007-10-22 | 2009-04-30 | Moon Key Lee | Augmented reality computer device |
CN106997235A (en) * | 2016-01-25 | 2017-08-01 | 亮风台(上海)信息科技有限公司 | Method, equipment for realizing augmented reality interaction and displaying |
CN106210909A (en) * | 2016-08-15 | 2016-12-07 | 深圳Tcl数字技术有限公司 | TV the display processing method of content, Apparatus and system |
CN106354316A (en) * | 2016-08-31 | 2017-01-25 | 广东格兰仕集团有限公司 | Operation panel based on AR technology and image recognition technology |
CN107027015A (en) * | 2017-04-28 | 2017-08-08 | 广景视睿科技(深圳)有限公司 | 3D trends optical projection system based on augmented reality and the projecting method for the system |
CN107205034A (en) * | 2017-06-05 | 2017-09-26 | 上海联影医疗科技有限公司 | A kind of data sharing device and method |
CN207148561U (en) * | 2017-07-23 | 2018-03-27 | 供求世界科技有限公司 | A kind of AR systems applied to entity products information |
CN107465910A (en) * | 2017-08-17 | 2017-12-12 | 康佳集团股份有限公司 | A kind of combination AR glasses carry out the method and system of AR information real time propelling movements |
CN107680165A (en) * | 2017-09-25 | 2018-02-09 | 中国电子科技集团公司第二十八研究所 | Computer operation table holography based on HoloLens shows and natural interaction application process |
Non-Patent Citations (2)
Title |
---|
MURAT KURT: "Improving Visual Perception of Augmented Reality on Mobile Devices with 3D Red-Cyan Glasses", 《SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE》 * |
ZHANG JINZHAO ET AL.: "Implementation of VR/AR/Smart Wearable Interactive Devices (Industrialization)", 《COMPUTER PROGRAMMING SKILLS & MAINTENANCE》 * |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109901938B (en) * | 2019-02-26 | 2021-11-19 | 北京华夏电通科技股份有限公司 | Interactive large-screen system based on WebSocket communication and visual display method |
CN109901938A (en) * | 2019-02-26 | 2019-06-18 | 北京华夏电通科技有限公司 | Big screen system and visual presentation method are interacted based on WebSocket communication |
CN109920065A (en) * | 2019-03-18 | 2019-06-21 | 腾讯科技(深圳)有限公司 | Methods of exhibiting, device, equipment and the storage medium of information |
CN109920065B (en) * | 2019-03-18 | 2023-05-30 | 腾讯科技(深圳)有限公司 | Information display method, device, equipment and storage medium |
CN110264818A (en) * | 2019-06-18 | 2019-09-20 | 国家电网有限公司 | A kind of unit inlet valve dismounting training method based on augmented reality |
CN114026526A (en) * | 2019-06-26 | 2022-02-08 | 日商可乐普拉股份有限公司 | Program, information processing method, and information processing apparatus |
CN110597442A (en) * | 2019-09-20 | 2019-12-20 | 北京华捷艾米科技有限公司 | Mobile phone AR drawing method and device |
CN111309203A (en) * | 2020-01-22 | 2020-06-19 | 深圳市格上视点科技有限公司 | Method and device for acquiring positioning information of mouse cursor |
CN115398879A (en) * | 2020-04-10 | 2022-11-25 | 三星电子株式会社 | Electronic device for communication with augmented reality and method thereof |
CN111818016B (en) * | 2020-06-11 | 2022-03-22 | 广州恒沙数字科技有限公司 | Method and system for realizing accurate positioning of three-dimensional space based on interface technology |
CN111818016A (en) * | 2020-06-11 | 2020-10-23 | 广州恒沙数字科技有限公司 | Method and system for realizing accurate positioning of three-dimensional space based on interface technology |
CN112560642A (en) * | 2020-12-08 | 2021-03-26 | 维沃移动通信(杭州)有限公司 | Display method and device and electronic equipment |
CN112527112A (en) * | 2020-12-08 | 2021-03-19 | 中国空气动力研究与发展中心计算空气动力研究所 | Multi-channel immersive flow field visualization man-machine interaction method |
CN112506348A (en) * | 2020-12-15 | 2021-03-16 | 中国空气动力研究与发展中心计算空气动力研究所 | Configuration method and device of visual parameters of immersive flow field |
CN112698721A (en) * | 2020-12-24 | 2021-04-23 | 上海科技大学 | Virtual reality object interaction system based on gestures |
CN113961107A (en) * | 2021-09-30 | 2022-01-21 | 西安交通大学 | Screen-oriented augmented reality interaction method and device and storage medium |
CN113961107B (en) * | 2021-09-30 | 2024-04-16 | 西安交通大学 | Screen-oriented augmented reality interaction method, device and storage medium |
CN115268757A (en) * | 2022-07-19 | 2022-11-01 | 武汉乐庭软件技术有限公司 | Gesture interaction recognition system on picture system based on touch screen |
WO2024040430A1 (en) * | 2022-08-23 | 2024-02-29 | Qualcomm Incorporated | Method and apparatus to extend field of view of an augmented reality device |
Also Published As
Publication number | Publication date |
---|---|
CN108762482B (en) | 2021-05-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108762482A (en) | Data interactive method and system between a kind of large screen and augmented reality glasses | |
US12106416B2 (en) | Artificial reality system architecture for concurrent application execution and collaborative 3D scene rendering | |
US10657716B2 (en) | Collaborative augmented reality system | |
Grossman et al. | Multi-finger gestural interaction with 3d volumetric displays | |
Szalavári et al. | “Studierstube”: An environment for collaboration in augmented reality | |
US6091410A (en) | Avatar pointing mode | |
KR20220030294A (en) | Virtual user interface using peripheral devices in artificial reality environments | |
Li et al. | Cognitive issues in mobile augmented reality: an embodied perspective | |
Telkenaroglu et al. | Dual-finger 3d interaction techniques for mobile devices | |
CN111467803A (en) | In-game display control method and device, storage medium, and electronic device | |
de Haan et al. | Hybrid Interfaces in VEs: Intent and Interaction. | |
Hill et al. | Withindows: A framework for transitional desktop and immersive user interfaces | |
Garcia et al. | Modifying a game interface to take advantage of advanced I/O devices | |
Phillips | Jack user's guide | |
Belmonte et al. | Federate resource management in a distributed virtual environment | |
Wenjun | The three-dimensional user interface | |
Li | Design and Implementation of 3D Virtual Campus Online Interaction Based on Untiy3D | |
Zhang et al. | A naked-eye 3D anatomy teaching system based on Unity | |
De Gennaro | Virtual prototyping at CERN | |
Yura et al. | Design and implementation of the browser for the multimedia multi-user dungeon of the digital museum | |
Hill | Withindows: A unified framework for the development of desktop and immersive user interfaces | |
CN118840511A (en) | Visual inertial navigation hall interaction system based on Unity3D and establishment method | |
SEABRA et al. | Designing the 3D GUI of a virtual reality tool for teaching descriptive geometry | |
Fischbach et al. | A Mixed Reality Space for Tangible User Interaction. | |
Dörner et al. | Mixed Reality Techniques for Visualizations in a 3D Information Space. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||