US20190156377A1 - Rendering virtual content based on items recognized in a real-world environment - Google Patents
Rendering virtual content based on items recognized in a real-world environment
- Publication number
- US20190156377A1 (Application No. US 16/189,776)
- Authority
- US
- United States
- Prior art keywords
- item
- user
- wearable device
- estimated current
- current value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06Q30/0643—Graphical representation of items or shoppers
- G06F1/163—Wearable computers, e.g. on a belt
- G06F16/532—Query formulation, e.g. graphical querying
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G06K9/00201
- G06N20/00—Machine learning
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0267—Targeted advertisements; Wireless devices
- G06Q30/0278—Product appraisal
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0603—Catalogue ordering
- G06Q30/0611—Request for offers or quotes
- G06Q30/0621—Item configuration or customization
- G06Q30/0627—Item investigation, directed with specific intent or strategy using item specifications
- G06Q30/0631—Item recommendations
- G06Q30/0633—Lists, e.g. purchase orders, compilation or processing
- G06Q30/08—Auctions
- G06T15/04—Texture mapping
- G06T15/10—Geometric effects
- G06T19/006—Mixed reality
- G06V20/20—Scene-specific elements in augmented reality scenes
- G06V20/64—Three-dimensional objects
- G06V40/172—Classification of human faces, e.g. identification
- G06T2215/16—Using real world measurements to influence rendering
- G06V2201/09—Recognition of logos
Definitions
- Users often own a large number of items. Some of these items may be valuable, marketable items that are capable of providing a return to the users (e.g., other users may have an interest in purchasing them). In many cases, users may be unaware of the value of their items and may even forget that they own certain items. Consequently, users often miss out on opportunities to receive a return (e.g., a monetary return) on an item with which they are willing to part, such as an item that they no longer use or rarely use.
- a wearable device has the ability to display virtual content to a user, in an augmented reality (“AR”) environment, as the user goes about a day's activities within a household or another type of environment in which items owned by the user are present.
- the techniques described herein identify, render, and display relevant virtual content as a user casually wears a wearable device while performing a day's activities within a household or another type of environment in which items owned by the user are present.
- the relevant content is content in which the user is interested.
- a wearable device avoids the aimless display of various content in which the user is not interested.
- the disclosed technologies tangibly improve computing efficiencies with respect to a wide variety of computing resources that would otherwise be consumed and/or utilized by improving human-computer interaction and by reducing the amount of processing cycles and storage required.
- the virtual content can be associated with items that are present in a real-world environment, and the virtual content rendered for display is relevant in the sense that the virtual content is likely to be of interest to the user.
- the virtual content can include an estimated current value of an item that, for various reasons, has become a popular item. That is, the item is in higher demand than it was previously, and there are more potential buyers of the item than there are potential sellers.
- a wearable device such as an augmented reality (“AR”) device.
- a user of such a device might provide input indicating an interest to enter or activate a mode enabling the techniques described herein to be implemented.
- the wearable device may communicate with a system, over a network, to implement the techniques described herein.
- a wearable device described herein can recognize items owned by the user.
- the items may be items found in a kitchen of a user's house, an office space at the user's place of work, a workshop in a user's garage, etc.
- the wearable device is configured to create and maintain an inventory of items owned by a user. Accordingly, when the wearable device recognizes an item, the wearable device can check the inventory of items to determine if the recognized item has already been inventoried. If not, the wearable device adds the recognized item to the inventory of items. This process can be implemented over a period of time in order to continuously maintain and update the inventory of items.
- the inventory of items can include some or all items owned by a user. Moreover, the inventory of items can be organized so that the items owned by the user are sorted based on different environments in which they are present (e.g., a home inventory, a work or office inventory, a vacation place inventory, a kitchen inventory, a garage inventory, a master bedroom inventory, a secondary bedroom, etc.).
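- As a rough illustration of the check-and-add inventory flow described above, the following Python sketch keeps per-environment inventories and only records an item that has not already been inventoried. All class and field names here are hypothetical and are not taken from the disclosure.

```python
from dataclasses import dataclass, field


@dataclass
class InventoryItem:
    item_id: str
    name: str
    environment: str  # e.g. "kitchen", "garage", "office"


@dataclass
class ItemInventory:
    """Per-user inventory, grouped by the real-world environment the item is in."""
    items_by_environment: dict = field(default_factory=dict)

    def contains(self, item_id: str) -> bool:
        return any(item_id in env for env in self.items_by_environment.values())

    def add_if_new(self, item: InventoryItem) -> bool:
        """Add a recognized item only if it has not already been inventoried."""
        if self.contains(item.item_id):
            return False
        self.items_by_environment.setdefault(item.environment, {})[item.item_id] = item
        return True


inventory = ItemInventory()
recognized = InventoryItem("record-player-01", "Record player", "living room")
print(inventory.add_if_new(recognized))  # True on first recognition
print(inventory.add_if_new(recognized))  # False once already inventoried
```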
- a wearable device described herein is further configured to access an item catalog to determine an estimated current value of an item.
- the item catalog includes current listings of items for sale, and also information associated with transactions during which items are exchanged between a seller and a buyer. This information can include a number of sales of an item, a price for each sale, a buyer identification, a seller identification, item characteristics, etc.
- the estimated current value of an item owned by a user can be the most recent price at which the same item or a similar item is sold via the item catalog. In another example, the estimated current value of an item owned by a user can be an average price at which the same item and/or similar items are sold using multiple recent sales (e.g., the average price of a predetermined number of sales such as the last five item sales or the last ten item sales).
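- A minimal sketch of the two valuation strategies mentioned above: the most recent sale price, or the average of the last N sales. The sales figures and the function name are illustrative only.

```python
def estimated_current_value(recent_sale_prices, use_average=True, window=5):
    """Estimate an item's current value from recent sales of the same or similar items.

    recent_sale_prices: prices ordered from newest to oldest.
    """
    if not recent_sale_prices:
        return None
    if not use_average:
        return recent_sale_prices[0]             # most recent sale price
    window_prices = recent_sale_prices[:window]  # e.g. the last five or ten sales
    return sum(window_prices) / len(window_prices)


sales = [160.0, 150.0, 145.0, 120.0, 110.0]      # newest first (illustrative data)
print(estimated_current_value(sales, use_average=False))  # 160.0
print(estimated_current_value(sales, window=5))           # 137.0
```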
- the wearable device can generate and display the estimated current value of the item so the user is made aware of this information. This information can be useful to a user if the user is thinking about and/or willing to sell the item.
- the wearable device is configured to compare the estimated current value to a threshold value.
- the wearable device may use a policy to establish the threshold value for an item.
- the policy may calculate the threshold value as a predetermined percentage of an average sales price of an item.
- the average sales price may be calculated for a predefined period of time (e.g., the last week, the last month, the last three months, etc.).
- For example, if the average sales price of the item is $100 and the predetermined percentage is 140%, then the threshold value is $140.
- the wearable device is configured to track or check the estimated current value of the item over a period of time as the price fluctuates based on market conditions. If the estimated current value meets or exceeds the threshold value, the wearable device can generate and display a notification informing the user of the recent price increase. The notification can be displayed while the user is located in an environment in which the item is present.
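- One way the policy above could be expressed is sketched below: the threshold is a predetermined percentage of the average sales price over a trailing period, and a notification is generated only when the estimated current value meets or exceeds that threshold. The 140% figure matches the $100/$140 example and is illustrative, not prescribed.

```python
def threshold_value(average_sales_price, predetermined_percentage=1.40):
    """E.g. a $100 average sales price and a 140% percentage give a $140 threshold."""
    return average_sales_price * predetermined_percentage


def should_notify(estimated_value, average_sales_price, percentage=1.40):
    """True when the estimated current value meets or exceeds the threshold."""
    return estimated_value >= threshold_value(average_sales_price, percentage)


print(threshold_value(100.0))          # 140.0
print(should_notify(160.0, 100.0))     # True  -> display the price-increase notification
print(should_notify(120.0, 100.0))     # False -> keep the display uncluttered
```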
- the techniques described herein can inform the user of ownership of a “hot”, or popular, item that is in demand on an electronic commerce (e-commerce) site.
- Demand for an item can spike in response to an occurrence of a particular event.
- a recently announced change in a law that will go into effect in the near future may spike demand for an item (e.g., the law may limit sales of the item).
- the estimated current value of the item may increase dramatically.
- the death of a celebrity may spike demand for an item associated with the late celebrity (e.g., old musical records that are no longer produced).
- the estimated current value of the item may increase dramatically.
- a recent accolade or achievement by a celebrity such as a sports star (e.g., MVP of a big game) may spike demand for an item associated with the celebrity (e.g., signed memorabilia such as a football, a jersey, a helmet, etc.).
- this type of event may cause the estimated current value of the item to increase dramatically.
- the techniques described herein can help people take advantage of a recent event that causes the value of an item they own to increase dramatically. If a user is willing to sell such an item, the user may realize a monetary return that they may not have realized prior to the event occurring.
- the threshold value can be established based on contextual data.
- the contextual data can comprise a condition of the item.
- the wearable device can recognize item characteristics that are indicative of the condition of the item (e.g., excellent, good, fair, poor, etc.) and adjust a threshold value based on the condition of the item. More specifically, the threshold value may be increased if the item is in excellent condition (e.g., shows little wear and tear, is not chipped or broken, etc.) and the threshold value may be decreased if the item is in poor condition (e.g., shows a large amount of wear and tear, is chipped or broken, etc.).
- the contextual data can comprise a sales history for the user. For instance, if a number of previous sales associated with a user account indicate that the user sells items at higher prices and rarely adjusts the prices lower, then the threshold value can be adjusted higher. In contrast, if a number of previous sales associated with a user account indicate that the user sells items at lower prices (e.g., commonly discounts the sales prices to move an item quickly), then the threshold value can be adjusted lower.
- the contextual data can comprise a total number of sales for the user over a period of time. That is, the user may have previously indicated that he or she would like to sell a predetermined number of items for a defined period of time (e.g., two items per month). Provided that the user has not sold the predetermined number of items and the predefined time period is about to expire or end, then the threshold value can be adjusted lower so that the user is presented with more selling opportunities as he or she looks around an environment filled with items he or she owns. Furthermore, if the user is close to meeting or has already met the predetermined number of items to sell for the defined period of time, then the threshold value can be adjusted higher.
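- A sketch of how the contextual signals above (item condition, seller history, and sales quota) could adjust a base threshold. The multipliers are placeholders; the disclosure does not specify particular values.

```python
def adjust_threshold(base_threshold, condition, seller_discounts_often,
                     quota_remaining, period_nearly_over):
    """Adjust a base threshold value using hypothetical contextual multipliers."""
    threshold = base_threshold

    # Item condition: raise for excellent condition, lower for poor condition.
    condition_factor = {"excellent": 1.10, "good": 1.00, "fair": 0.95, "poor": 0.85}
    threshold *= condition_factor.get(condition, 1.00)

    # Sales history: a seller who rarely discounts gets a higher threshold.
    threshold *= 0.95 if seller_discounts_often else 1.05

    # Sales quota: surface more selling opportunities if the target is unmet
    # and the period is about to end; fewer once the target has been met.
    if quota_remaining > 0 and period_nearly_over:
        threshold *= 0.90
    elif quota_remaining <= 0:
        threshold *= 1.10

    return threshold


print(round(adjust_threshold(140.0, "excellent", False, 1, True), 2))
```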
- the estimated current value of an item can be determined based on item characteristics shared between different items. That is, an item the user owns may not be the exact same as one or more items used to determine the estimated current value, but they may be similar in that they share one or more item characteristics that contribute to an increase in a value.
- a shared item characteristic can include a particular manufacturer or producer that is going out of business, and thus, not making items anymore. Consequently, different types of items manufactured or produced by the same company may increase in value.
- a shared item characteristic can include an authenticated signature of a celebrity (e.g., an estimated current price for a player's signed football helmet can be determined based on a recent sales price of the player's signed jersey). Therefore, the technologies described herein can recognize characteristics of a particular item owned by the user and determine an estimated current value for the particular item based on sales of the same item and/or sales of similar items, where an item is similar if it contains a shared item characteristic that contributes to an increase in value.
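- A minimal sketch of the similarity test described above: two items are treated as similar when they share a characteristic that drives value, such as the same discontinued manufacturer or the same authenticated signature. The tag strings are hypothetical.

```python
def is_similar(item_characteristics, candidate_characteristics, value_driving_traits):
    """True when the two items share at least one value-driving characteristic."""
    shared = set(item_characteristics) & set(candidate_characteristics)
    return bool(shared & set(value_driving_traits))


helmet = {"signed:player-x", "category:football-helmet"}
jersey = {"signed:player-x", "category:jersey"}
# The signed jersey's recent sale price could then inform the helmet's estimated value.
print(is_similar(helmet, jersey, value_driving_traits={"signed:player-x"}))  # True
```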
- the wearable device can be configured to determine and/or track use characteristics of an item in the inventory. For example, as part of maintaining an inventory of items, the wearable device can determine a total number of uses of an item, a frequency of use of an item, etc. The total number of uses of an item and/or the frequency of use may be indicative of whether the user is still interested in owning the item. For example, if the user never or rarely uses the item and the item is not a collectible item, then the user likely has little interest in owning the item. Consequently, the user is likely more willing to part with, or sell, the item.
- the wearable device can analyze the total number of uses and/or the frequency of use to determine that an item is rarely used and/or likely no longer of interest to the user.
- the wearable device can display the estimated current value and recommend that the user sell the item that is rarely used.
- the use of an item may be determined based on whether the item has been moved from one location in an environment to another location in the same environment or in a different environment. Over time, the wearable device can track a number of times an item has been moved. For instance, if the wearable device determines that a user stores or places golf clubs in different locations, then the wearable device can store information indicating that the user still plays golf.
- the threshold value described above can be established based on use characteristics (e.g., use characteristics can be part of the contextual data). For instance, if a user never or rarely uses an item, then the threshold value can be adjusted lower. If the user often uses the item (e.g., the user enjoys the item), then the threshold value can be adjusted higher.
- the wearable device can be used as an effective mechanism to efficiently list the item for sale. That is, a user can look at an item he or she owns while wearing the wearable device, see an estimated current value, and provide input for the wearable device to capture a photo of the item. In response to the input, the wearable device can create a listing for the item in an e-commerce site.
- the listing may be posted to a “secondary” marketplace of the e-commerce site that includes items that a user is willing to sell without having to spend a considerable amount of time to complete a full listing for a “primary” marketplace.
- an item listing in the secondary marketplace is a limited listing that may only include a photo of an item, a seller's identification, and/or a piece of contact information (e.g., an email, a phone number, etc.).
- an e-commerce system may require a seller to submit much more information such as an item title, an item category, an item description, item specifications, a price, multiple images, etc. before the full listing is posted in an item catalog.
- Some potential buyers may prefer to browse through the secondary marketplace to find a photo that captures an item of interest.
- a potential buyer can contact the seller and make an offer to purchase the item. Consequently, by using the wearable device and the secondary marketplace, a minimal amount of work is required by the seller to submit an item listing to an e-commerce site, where the item listing corresponds to an item the user is willing to sell (e.g., due to a recent spike in demand and price) but does not necessarily need to sell.
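- The partial listing described above can be assembled and posted with very little seller effort. The sketch below builds the minimal payload and prepares an HTTP POST; the endpoint URL and field names are assumptions for illustration, not part of the disclosure.

```python
import json
import urllib.request


def submit_partial_listing(photo_path, seller_id, contact, suggested_price,
                           endpoint="https://example.invalid/secondary-marketplace/listings"):
    """Build the minimal secondary-marketplace listing and prepare a POST request.

    A full primary-marketplace listing would require many more fields
    (title, category, description, specifications, price, multiple images).
    """
    listing = {
        "photo": photo_path,
        "seller_id": seller_id,
        "contact": contact,
        "suggested_price": suggested_price,
    }
    return urllib.request.Request(
        endpoint,
        data=json.dumps(listing).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )  # the caller would pass this to urllib.request.urlopen(...)


req = submit_partial_listing("art-306.jpg", "user-106", "user@example.com", 599.0)
print(req.full_url)
```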
- the disclosed technologies improve a user experience by identifying relevant opportunities to display virtual content that is of interest to a user in a three-dimensional immersive environment as the user goes about a day's activities. That is, using the techniques described herein, a user can look at an item he or she owns, see an estimated current value, and provide input for the wearable device to submit a listing with a limited amount of information to a marketplace. This can all be done in an efficient manner without interrupting the user's activities. Moreover, the wearable device avoids the aimless display of various content in which the user is not interested.
- the disclosed technologies tangibly improve computing efficiencies with respect to a wide variety of computing resources that would otherwise be consumed and/or utilized by improving human-computer interaction and by reducing the amount of processing cycles and storage required by previous solutions.
- Technical benefits other than those specifically identified herein might also be realized through implementations of the disclosed technologies.
- FIG. 1 illustrates aspects of an exemplary computing environment in which a wearable device and/or a system can create and maintain an inventory of items owned by a user and access an item catalog to determine an estimated current value of an item owned by the user.
- FIG. 2 illustrates example marketplaces of an item catalog to which a listing for an item can be submitted.
- FIG. 3 illustrates an example where a user is looking at a portion of a real-world environment (e.g., a living room) while wearing a wearable device configured to recognize existing items, add the existing items to an item inventory, determine estimated current values for the existing items, and/or display the estimated current values in association with the existing items.
- FIG. 4 illustrates an example where a user can provide input to submit a listing to a secondary marketplace.
- FIG. 5 is a flow diagram that illustrates an example process describing aspects of the technologies disclosed herein for displaying an estimated current value of an item in a real-world environment via a wearable device.
- FIG. 6 is a flow diagram that illustrates an example process describing aspects of the technologies disclosed herein for listing an item for sale via an item catalog (e.g., an e-commerce site).
- FIG. 7 shows an illustrative configuration of a wearable device capable of implementing aspects of the technologies disclosed herein.
- FIG. 8 illustrates additional details of an example computer architecture for a computer capable of implementing aspects of the technologies described herein.
- This Detailed Description describes identifying, rendering, and displaying relevant virtual content as a user casually wears a wearable device while performing a day's activities within a household or another type of environment in which items owned by the user are present.
- the virtual content can be associated with the items that are present in a real-world environment, and the virtual content rendered for display is relevant in the sense that the virtual content is likely to be of interest to the user.
- a wearable device described herein can recognize items owned by the user. For instance, the items may be items found in a kitchen of a user's house, an office space at the user's place of work, a workshop in a user's garage, etc. Based on the recognition, the wearable device is configured to create and maintain an inventory of items owned by a user. The wearable device can also access an item catalog to determine an estimated current value of an item. Once the estimated current value is determined, the wearable device can generate and display the estimated current value of the item so the user is made aware of this information. This information can be useful to a user if the user is thinking about and/or willing to sell the item.
- FIG. 1 illustrates aspects of an exemplary computing environment 100 in which a wearable device and/or a system can create and maintain an inventory of items owned by a user and access an item catalog to determine an estimated current value of an item owned by the user.
- the exemplary system may comprise an electronic commerce (“e-commerce”) system 102 that includes an item catalog 104 where users and/or merchants can list real-world items for sale.
- a real-world item can be any type of item including, but not limited to, electronics, home goods, automobiles or automotive parts, clothing, musical instruments, art, jewelry, and so forth.
- the e-commerce system 102 can be implemented on one or more server computers operating in conjunction with an e-commerce site.
- a user 106 can utilize a wearable device 108 , such as that described in further detail below with respect to FIG. 7 , to obtain image data 110 of the real-world environment in which the user 106 is currently located.
- the wearable device 108 can include an optical device configured to scan the real-world environment of the user 106 to obtain the image data 110 (e.g., recognize objects in the real-world environment).
- the image data 110 of the real-world environment includes recognizable existing items 112 that are physically present in the real-world environment.
- the wearable device 108 can send the image data 110 to the e-commerce system 102 and the e-commerce system 102 can recognize the existing items 112 .
- the wearable device 108 and/or the e-commerce system 102 are configured to create and maintain an inventory of items owned and/or possessed by the user (i.e., the item inventory 114 ).
- the item inventory 114 can be maintained in association with a user account.
- Upon recognizing an item 112, the wearable device 108 is configured to check the item inventory 114 to determine whether the recognized item has already been added. If not, the wearable device 108 adds the recognized item to the item inventory 114.
- the user 106 can activate an “item inventory” operation mode for the wearable device 108 , and based on the activation, the inventorying process can continually be implemented over a period of time in order to ensure the item inventory 114 is accurate.
- the item inventory 114 can include all items owned and/or possessed by the user 106 and/or a group of people associated with the user 106 (e.g., other family members).
- the item inventory 114 can be organized so that the items owned and/or possessed by the user 106 are sorted based on different real-world environments in which they are present (e.g., a home inventory, a work or office inventory, a vacation place inventory, a kitchen inventory, a garage inventory, a master bedroom inventory, a secondary bedroom, etc.).
- the wearable device 108 via one or more network(s) 116 and the e-commerce system 102 , is further configured to access the item catalog 104 to determine an estimated current value 118 of an item in the item inventory 114 .
- the item may be an item 112 located in the real-world environment in which the user 106 is currently located.
- the item catalog 104 includes current listings 120 of items for sale via an e-commerce site operated by the e-commerce system 102 , for example.
- the item catalog 104 may also store transaction information 122 associated with exchanges of items between sellers and buyers.
- the transaction information 122 can include a number of sales of an item, a price for each sale, a buyer identification, a seller identification, item characteristics, etc.
- the estimated current value 118 of an item 112 can be the most recent price at which the same item or a similar item is sold via the item catalog 104 .
- the estimated current value 118 of an item 112 can be an average price at which the same item and/or similar items are sold using multiple recent sales (e.g., the average price of a predetermined number of sales such as the last five item sales or the last ten item sales).
- an inventory value notification tool 124 (e.g., a software component or module) of the wearable device 108 can compare the estimated current value 118 to a threshold value 126 . If the estimated current value 118 (e.g., $160) is greater than the threshold value 126 (e.g., $140), the inventory value notification tool 124 can display the estimated current value 118 of the item 112 in the user's view of the real-world environment so the user 106 is made aware of this information. This information can be useful to a user 106 if the user 106 is thinking about and/or willing to sell the item 112 .
- the user 106 can specifically define the threshold value 126 at which he or she is willing to sell an item 112 . Based on market conditions that cause the price of the item 112 to fluctuate (e.g., increase and/or decrease over a period of time), the user 106 can be notified when the estimated current value 118 of the item 112 is more than the threshold value 126 defined by the user 106 .
- the threshold value 126 can be established by the e-commerce system 102 and/or the wearable device 108 .
- the threshold value 126 can be calculated as a predetermined percentage of an average sales price of an item.
- the average sales price may be calculated for a predefined period of time (e.g., the last week, the last month, the last three months, etc.).
- For example, if the average sales price of the item is $100 and the predetermined percentage is 140%, then the threshold value is $140.
- the predetermined percentage is typically more than one hundred percent (e.g., 120%, 150%, 200%, etc.) and can be established by the e-commerce system 102 and/or the wearable device 108 to determine when a price increase is substantial. Further, the e-commerce system 102 and/or the wearable device 108 can define different predetermined percentages for different categories of items, and use the appropriate predetermined percentage to determine a threshold value 126 based on a category to which the item 112 belongs. This accounts for the possibility that a substantial price increase for one category of items may be different than a substantial price increase for another category of items. In some instances, the user 106 can define the predetermined percentage used to calculate the threshold value 126 for the items 112 the user owns and/or possesses.
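- The category-specific percentages described above could be represented with a simple lookup, as sketched below. The specific categories and percentages are hypothetical; the disclosure only states that the percentages can differ by category and are typically above one hundred percent.

```python
# Hypothetical per-category percentages used to compute the threshold value 126.
CATEGORY_PERCENTAGES = {
    "electronics": 1.20,
    "collectibles": 1.50,
    "memorabilia": 2.00,
}
DEFAULT_PERCENTAGE = 1.40


def category_threshold(average_sales_price, category):
    """Threshold value = average sales price * the percentage for the item's category."""
    percentage = CATEGORY_PERCENTAGES.get(category, DEFAULT_PERCENTAGE)
    return average_sales_price * percentage


print(category_threshold(100.0, "collectibles"))  # 150.0
print(category_threshold(100.0, "furniture"))     # 140.0 (falls back to the default)
```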
- an estimated current value 118 exceeding the threshold value 126 may be a condition for displaying relevant virtual content.
- the user 106 may not want to know the estimated current prices of all the items in an immersive real-world environment, as this may be a distraction. Moreover, this can cause unnecessary consumption of computer resources. Instead, the user 106 may only desire to see estimated current prices that have recently increased dramatically.
- the item 112 for which the estimated current value 118 is determined and/or displayed may be a popular item 128 .
- a popular item 128 can be an item that is in high demand due to a recent event. For example, a recently announced change in a law that will go into effect in the near future may spike demand for an item (e.g., the law may limit sales of the item). In another example, the death of a celebrity (e.g., a musician) may spike demand for an item associated with the late celebrity (e.g., old musical records that are no longer produced).
- a recent accolade or achievement by a celebrity such as a sports star (e.g., MVP of a big game) may spike demand for an item associated with the celebrity (e.g., signed memorabilia such as a football, a jersey, a helmet, etc.).
- the price of the popular item 128 increases because there are many more potential buyers than there are potential sellers. Moreover, the popular item 128 may have experienced a recent increase in the number of sales, as referenced by 130 .
- the threshold value 126 can be established based on contextual data.
- the contextual data can comprise a condition of the item.
- the wearable device can recognize item characteristics that are indicative of the condition of the item (e.g., excellent, good, fair, poor, etc.) and adjust a threshold value 126 based on the condition of the item. More specifically, the threshold value 126 may be increased if the item is in excellent condition (e.g., shows little wear and tear, is not chipped or broken, etc.) and the threshold value 126 may be decreased if the item is in poor condition (e.g., shows a large amount of wear and tear, is chipped or broken, etc.).
- the contextual data can comprise a sales history for the user. For instance, if a number of previous sales associated with a user account indicate that the user sells items at higher prices and rarely adjusts the prices lower, then the threshold value 126 can be adjusted higher. In contrast, if a number of previous sales associated with a user account indicate that the user sells items at lower prices (e.g., commonly discounts the sales prices to move an item quickly), then the threshold value 126 can be adjusted lower.
- the contextual data can comprise a total number of sales for the user over a period of time. That is, the user may have previously indicated that he or she would like to sell a predetermined number of items for a defined period of time (e.g., two items per month). Provided that the user has not sold the predetermined number of items and the predefined time period is about to expire or end, then the threshold value 126 can be adjusted lower so that the user is presented with more selling opportunities as he or she looks around an environment filled with items he or she owns. Furthermore, if the user is close to meeting or has already met the predetermined number of items to sell for the defined period of time, then the threshold value 126 can be adjusted higher.
- the transaction information 122 in the item catalog 104 can be accessed to determine that a number of sales of the item during a recent predefined period of time (e.g., the last three days, the last week, the last month) is greater than a threshold number of sales of the item.
- the threshold number of sales is established as a predetermined percentage of a number of sales of the item during a predefined period of time that precedes the recent predefined period of time (e.g., the three days before the last three days, the week before the last week, the month before the last month).
- the predetermined percentage associated with a number of sales is typically more than one hundred percent (e.g., 120%, 150%, 200%, etc.) so that the e-commerce system 102 and/or the wearable device 108 can determine when a number of sales dramatically increases from one time period to the next.
- the e-commerce system 102 and/or the wearable device 108 can define different predetermined percentages for different categories of items, and use the appropriate predetermined percentage to determine when sales of an item have greatly increased.
- the inventory value notification tool 124 can display the estimated current value 118 of the item 112 in the user's view of the real-world environment based on an increased number of sales of the item.
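- The sales-spike test described above compares the number of sales in a recent period against a threshold set as a percentage of sales in the preceding period. A minimal sketch, with an assumed 150% percentage:

```python
def sales_spiked(recent_period_sales, preceding_period_sales, percentage=1.50):
    """True when recent-period sales (e.g. the last week) exceed a threshold number of
    sales set as a percentage of the preceding period's sales (e.g. the week before)."""
    threshold_number_of_sales = preceding_period_sales * percentage
    return recent_period_sales > threshold_number_of_sales


print(sales_spiked(recent_period_sales=45, preceding_period_sales=20))  # True: demand spike
print(sales_spiked(recent_period_sales=22, preceding_period_sales=20))  # False
```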
- the estimated current value 118 of an item 112 can be determined based on item characteristics shared between different items. That is, item 112 may not be the exact same as one or more items used to determine the estimated current value 118 , but they may be similar in that they share one or more item characteristics that contribute to an increase in a value.
- a shared item characteristic can include a particular manufacturer or producer that is going out of business, and thus, not making items anymore. Consequently, different types of items manufactured or produced by the same company may increase in value.
- a shared item characteristic can include an authenticated signature of a celebrity (e.g., an estimated current price for a player's signed football helmet can be determined based on a recent sales price of the player's signed jersey). Therefore, the technologies described herein can recognize characteristics of a particular item owned by the user and determine an estimated current value for the particular item based on sales of the same item and/or sales of similar items, where an item is similar if it contains a shared item characteristic that contributes to an increase in value.
- the wearable device 108 can be configured to determine and/or track use characteristics of an item 112 and store the use characteristics as item use information 132 in the item inventory 114.
- the wearable device 108 can determine a total number of uses of an item 112 , a frequency of use of an item 112 , etc.
- the use of an item may be determined based on whether the item has been moved from one location in an environment to another location in the same environment or in a different environment.
- the total number of uses of an item 112 and/or the frequency of use of the item 112 may be indicative of whether the user is still interested in owning the item 112 .
- the determination and display of the estimated current value 118 can be based on a determination that the item 112 has not been used in a predefined period of time (e.g., the last month, the last three months, the last year).
- the determination and display of the estimated current value 118 can be based on a determination that the item 112 has been used a number of times in a predefined period of time (e.g., once in a year), the number of times being less than a threshold number of times established for infrequent use (e.g., five times in a year).
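- A sketch of the usage-based gating just described: the estimated current value is surfaced when the item has not been used within a predefined idle window, or has been used fewer times than an infrequent-use threshold. The window and limit values are assumptions.

```python
from datetime import datetime, timedelta


def should_surface_value(last_used, use_count_in_period, now=None,
                         idle_window=timedelta(days=90), infrequent_use_limit=5):
    """True when the item appears unused or rarely used, per the item use information."""
    now = now or datetime.now()
    never_or_long_unused = last_used is None or (now - last_used) > idle_window
    rarely_used = use_count_in_period < infrequent_use_limit
    return never_or_long_unused or rarely_used


print(should_surface_value(last_used=datetime(2019, 1, 1), use_count_in_period=1,
                           now=datetime(2019, 11, 1)))  # True
```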
- the threshold value 126 described above can be established based on use characteristics (e.g., use characteristics can be part of the contextual data). For instance, if a user never or rarely uses an item, then the threshold value 126 can be adjusted lower. If the user often uses the item (e.g., the user enjoys the item), then the threshold value 126 can be adjusted higher. In some embodiments, the threshold value 126 can be based on a frequency of use of an item, e.g., a frequency that an item is moved from one position to another over the course of a time period. In some embodiments, the threshold value 126 can be increased or decreased based on contextual data indicating a frequency of use or a frequency of movement of an item.
- For example, in some configurations, an increased frequency of use can decrease the threshold value 126 and a decreased frequency of use can increase the threshold value 126. In other configurations, a decreased frequency of use can decrease the threshold value 126 and an increased frequency of use can increase the threshold value 126.
- FIG. 1 further illustrates that the wearable device 108 can include an item listing tool 134 (e.g., a software component or module).
- the item listing tool 134 implements functionality that enables the user 106 to create a listing for the item 112 in the item catalog 104 (e.g., on an e-commerce site).
- the item listing tool 134 can configure a control that enables a user 106 of the wearable device 108 to provide input instructing the wearable device 108 to create the listing for the item.
- the control can be a displayed graphical element configured to receive user input (e.g., a user reaches out to virtually touch a selectable menu option), or the control can be configured to receive an audible command as the user input.
- the item listing tool 134 can locate the item 112, focus on the item 112, and capture a photo of the item using a camera of the wearable device 108. The item listing tool 134 can then submit the listing for the item 112 along with the photo to the e-commerce system 102 and/or the item catalog 104. In some examples, the item listing tool can use the previously obtained image data 110 to locate the item 112, and submit the image data corresponding to the item 112 along with the listing for the item.
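- A skeleton of the item listing tool flow described above: receive a command (touch or voice), capture a photo, and submit the listing. The camera and catalog objects are stand-ins for device and e-commerce APIs; all names are hypothetical.

```python
class FakeCamera:
    def capture(self, item):
        return f"photo-of-{item['id']}.jpg"


class FakeCatalogClient:
    def submit(self, listing, marketplace):
        return {"marketplace": marketplace, **listing}


class ItemListingTool:
    """Hypothetical sketch of the listing flow implemented by the item listing tool."""

    def __init__(self, camera, catalog_client):
        self.camera = camera
        self.catalog_client = catalog_client

    def on_user_command(self, item, marketplace="secondary"):
        photo = self.camera.capture(item)  # locate, focus on, and photograph the item
        listing = {"item_id": item["id"], "photo": photo,
                   "suggested_price": item.get("estimated_value")}
        return self.catalog_client.submit(listing, marketplace)


tool = ItemListingTool(FakeCamera(), FakeCatalogClient())
print(tool.on_user_command({"id": "art-306", "estimated_value": 599.0}))
```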
- FIG. 2 illustrates example marketplaces of the item catalog 104 to which the listing 200 for the item 112 can be submitted.
- the item catalog 104 includes a primary marketplace 202 that includes full item listings 204 and a secondary marketplace 206 that includes partial item listings 208 .
- a partial item listing 208 in the secondary marketplace 206 is a limited listing that may only include a photo of an item 210, a seller's identification along with a piece of contact information 212 (e.g., an email, a phone number, etc.), and/or a suggested sale price 214.
- the e-commerce system 102 may require a seller to submit much more information such as an item title, an item category, an item description, item specifications, a price, multiple images, etc. before the full listing is posted in an item catalog.
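- The contrast between the two listing types could be captured with two simple record types, as sketched below. The field names are illustrative and are not specified by the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class PartialListing:
    """Secondary-marketplace listing: only a photo, seller contact, and optional price."""
    photo: str
    seller_id: str
    contact: str
    suggested_price: Optional[float] = None


@dataclass
class FullListing:
    """Primary-marketplace listing: the catalog requires substantially more detail."""
    title: str
    category: str
    description: str
    specifications: dict
    price: float
    images: List[str]
    seller_id: str


quick = PartialListing(photo="art-306.jpg", seller_id="user-106",
                       contact="user@example.com", suggested_price=599.0)
print(quick)
```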
- a user can look at an item he or she owns, see an estimated current value, and provide input for the wearable device to submit a listing with a limited amount of information to a marketplace. This can all be done in an efficient manner without interrupting the user's activities.
- An example of a wearable device 108 can include an augmented reality (“AR”) device.
- An AR device is a computing device capable of providing a view of the real-world environment within which physical objects are augmented or supplemented by computer-generated (“CG”) sensory input (e.g., sound, video, graphics, etc.).
- an AR device might provide a view of the real-world environment with a rendering of virtual content as an overlay. Additional details regarding the configuration and operation of a wearable device 108 capable of providing this functionality are provided below with regard to FIG. 7.
- the virtual content can be displayed in an AR environment, as well as other types of environments, such as mixed reality (“MR”) environments or virtual reality (“VR”) environments.
- the technologies described herein can be implemented on a variety of different types of wearable devices 108 configured with a variety of different operating systems, hardware components, and/or installed applications.
- the wearable device 108 can be implemented by the following example wearable devices: GOOGLE GLASS, MAGIC LEAP ONE, MICROSOFT HOLOLENS, META 2, SONY SMART EYEGLASS, HTC VIVE, OCULUS GO, PLAYSTATION VR, or WINDOWS mixed reality headsets.
- embodiments of the present disclosure can be implemented in any AR-capable device, as opposed to goggles or glasses that obstruct a user's view of real-world objects (i.e., actual reality).
- the techniques described herein can be device and/or operating system agnostic.
- FIG. 3 illustrates an example 300 where a user 302 (e.g., user 106) is looking at a portion of a real-world environment (e.g., a living room) while wearing a wearable device 304 (e.g., wearable device 108) configured to recognize existing items, add the existing items to an item inventory, determine estimated current values for the existing items, and/or display the estimated current values in association with the existing items.
- the view into the living room provided via the wearable device 304 comprises a real-world view from the perspective of the user 302 .
- the living room includes a piece of art 306 on a wall and a record player 308 sitting on a shelf.
- the wearable device 304 can recognize the piece of art 306 and the record player 308 and add the items to an inventory of items owned and/or possessed by the user 302 (e.g., a home inventory of items, a living room inventory of items, etc.). Furthermore, when the items are in the view of the user 302 , as illustrated, the wearable device 304 can access an item catalog to determine estimated current values for the same or similar items. In some examples, the wearable device 304 compares the estimated current values to threshold values, and if the estimated current values are greater than the threshold values, the wearable device 304 displays the estimated current values close to the items. As shown, the piece of art 306 on the wall has an estimated current value of $599 and the record player 308 sitting on the shelf has an estimated current value of $199.
- the displayed values may aid the user 302 in making a decision to sell an item.
- the wearable device 304 can configure a control that receives input (e.g., an instruction) from the user to submit an item listing to an item catalog 104.
- FIG. 4 illustrates an example 400 where the user 302 provides an audible command 402 to “list the piece of art in the secondary marketplace”.
- the wearable device 304 is configured to capture a photo of the piece of art 306 and display a prompt 404 for the user 302 to confirm the listing of the art 306 in the secondary marketplace 206 .
- the wearable device 304, via a user account, can create a partial item listing that includes the photo of the art 306, an identification of the user, contact information of the user, and/or the estimated current value, and submit the partial item listing to the secondary marketplace 206.
- a user may be required to enable a feature and/or enter a particular operation mode.
- the user 106 may need to provide permission and/or authorization for the wearable device 108 to implement the described techniques.
- FIGS. 5 and 6 are flow diagrams that each illustrate an example process describing aspects of the technologies presented herein with reference to FIGS. 1-4 .
- a process is illustrated as a collection of blocks in a logical flow graph, which represent a sequence of operations that can be implemented in hardware, software, or a combination thereof.
- the blocks represent computer-executable instructions that, when executed by one or more processors, perform the recited operations.
- FIG. 5 is a flow diagram that illustrates an example process 500 describing aspects of the technologies disclosed herein for displaying an estimated current value of an item in a real-world environment via a wearable device.
- the process 500 begins at block 502 , where a real-world environment is scanned by a wearable device to obtain image data.
- the process 500 proceeds to block 504 where the wearable device uses the image data to recognize an item that exists in the real-world environment.
- the item is added to an item inventory associated with the real-world environment.
- the wearable device accesses an item catalog to determine an estimated current value for the item.
- the process 500 proceeds to block 510 where it is determined that the estimated current value is more than a threshold value.
- the estimated current value is displayed in association with the item via a display device of the wearable device.
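- As a rough, non-limiting sketch of this flow, the blocks of process 500 could be wired together as below. The callables passed in (scan, recognize, and so on) are hypothetical stand-ins for the wearable device's camera, recognizer, inventory, and catalog interfaces, which the disclosure does not specify at the code level.

```python
def run_display_process(scan, recognize, add_to_inventory, lookup_value, display, threshold_value):
    """Sketch of process 500: scan, recognize, inventory, look up a value, and display it."""
    image_data = scan()                      # block 502: scan the real-world environment
    item = recognize(image_data)             # block 504: recognize an item in the image data
    add_to_inventory(item)                   # add the item to the item inventory
    value = lookup_value(item)               # access the item catalog for an estimated current value
    if value > threshold_value:              # block 510: compare against the threshold value
        display(item, value)                 # show the estimated current value near the item
    return value

# Toy stand-ins so the sketch runs end to end.
run_display_process(
    scan=lambda: "frame-001",
    recognize=lambda frame: "record player",
    add_to_inventory=lambda item: None,
    lookup_value=lambda item: 199.0,
    display=lambda item, v: print(f"{item}: ${v:.0f}"),
    threshold_value=100.0,
)
```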
- FIG. 6 is a flow diagram that illustrates an example process 600 describing aspects of the technologies disclosed herein for listing an item for sale via an item catalog (e.g., an e-commerce site).
- the process 600 begins at block 602 , where user input that instructs the wearable device to list the item for sale via an item catalog is received.
- the instruction can specify that the listing be submitted to the secondary marketplace so the user does not have to interrupt a current activity to complete a full item listing.
- the process proceeds to block 604 where the wearable device captures a photo of the item.
- the wearable device submits the listing for the item to the item catalog.
- the listing can include the photo of the item, the seller's identification and a piece of contact information, and/or the estimated current value as a suggested sales price.
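- A correspondingly small sketch of process 600 is shown below. The capture_photo and submit_listing callables, and the keys of the user dictionary, are assumptions for illustration; the disclosure only requires that the listing carry the photo, the seller's identification and contact information, and, optionally, the estimated current value as a suggested sales price.

```python
def run_listing_process(capture_photo, submit_listing, user, estimated_value):
    """Sketch of process 600: on user instruction, photograph the item and submit a listing."""
    photo = capture_photo()                  # block 604: capture a photo of the item
    listing = {
        "photo": photo,
        "seller_id": user["user_id"],        # the seller's identification
        "contact": user["email"],            # a piece of contact information
        "suggested_price": estimated_value,  # estimated current value as a suggested sales price
    }
    submit_listing(listing)                  # submit the listing to the item catalog
    return listing

run_listing_process(
    capture_photo=lambda: b"jpeg-bytes",
    submit_listing=lambda listing: print("submitted listing for", listing["seller_id"]),
    user={"user_id": "user-106", "email": "user@example.com"},
    estimated_value=599.0,
)
```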
- FIG. 7 shows an illustrative configuration of a wearable device 700 (e.g., a headset system, a head-mounted display, etc.) capable of implementing aspects of the technologies disclosed herein.
- the wearable device 700 includes an optical system 702 with an illumination engine 704 to generate electro-magnetic (“EM”) radiation that includes both a first bandwidth for generating computer-generated (“CG”) images and a second bandwidth for tracking physical objects.
- the first bandwidth may include some or all of the visible-light portion of the EM spectrum whereas the second bandwidth may include any portion of the EM spectrum that is suitable to deploy a desired tracking protocol.
- the optical system 702 further includes an optical assembly 706 that is positioned to receive the EM radiation from the illumination engine 704 and to direct the EM radiation (or individual bandwidths thereof) along one or more predetermined optical paths.
- the illumination engine 704 may emit the EM radiation into the optical assembly 706 along a common optical path that is shared by both the first bandwidth and the second bandwidth.
- the optical assembly 706 may also include one or more optical components that are configured to separate the first bandwidth from the second bandwidth (e.g., by causing the first and second bandwidths to propagate along different image-generation and object-tracking optical paths, respectively).
- the optical assembly 706 includes components that are configured to direct the EM radiation with respect to one or more components of the optical assembly 706 and, more specifically, to direct the first bandwidth for image-generation purposes and to direct the second bandwidth for object-tracking purposes.
- the optical system 702 further includes a sensor 708 to generate object data in response to a reflected-portion of the second bandwidth, i.e. a portion of the second bandwidth that is reflected off an object that exists within a real-world environment.
- the wearable device 700 may utilize the optical system 702 to generate a composite view (e.g., from a perspective of a user 106 that is wearing the wearable device 700 ) that includes both one or more CG images and a view of at least a portion of the real-world environment that includes the object.
- the optical system 702 may utilize various technologies such as, for example, AR technologies to generate composite views that include CG images superimposed over a real-world view.
- the optical system 702 may be configured to generate CG images via a display panel.
- the display panel can include separate right eye and left eye transparent display panels.
- the display panel can include a single transparent display panel that is viewable with both eyes and/or a single transparent display panel that is viewable by a single eye only. Therefore, it can be appreciated that the technologies described herein may be deployed within a single-eye Near Eye Display (“NED”) system (e.g., GOOGLE GLASS) and/or a dual-eye NED system (e.g., OCULUS RIFT).
- the wearable device 700 is an example device that is used to provide context and illustrate various features and aspects of the user interface display technologies and systems disclosed herein. Other devices and systems may also use the interface display technologies and systems disclosed herein.
- the display panel may be a waveguide display that includes one or more diffractive optical elements (“DOEs”) for in-coupling incident light into the waveguide, expanding the incident light in one or more directions for exit pupil expansion, and/or out-coupling the incident light out of the waveguide (e.g., toward a user's eye).
- the wearable device 700 may further include an additional see-through optical component.
- a controller 710 is operatively coupled to each of the illumination engine 704, the optical assembly 706 (and/or scanning devices thereof), and the sensor 708.
- the controller 710 includes one or more logic devices and one or more computer memory devices storing instructions executable by the logic device(s) to deploy functionalities described herein with relation to the optical system 702 .
- the controller 710 can comprise one or more processing units 712 , one or more computer-readable media 714 for storing an operating system 716 and data such as, for example, image data that defines one or more CG images and/or tracking data that defines one or more object tracking protocols.
- the computer-readable media 714 may further include an image-generation engine 718 that generates output signals to modulate generation of the first bandwidth of EM radiation by the illumination engine 704 and also to control the scanner(s) to direct the first bandwidth within the optical assembly 706 .
- the scanner(s) direct the first bandwidth through a display panel to generate CG images that are perceptible to a user, such as a user interface.
- the computer-readable media 714 may further include an object-tracking engine 720 that generates output signals to modulate generation of the second bandwidth of EM radiation by the illumination engine 704 and also to control the scanner(s) to direct the second bandwidth along an object-tracking optical path to irradiate an object.
- the object tracking engine 720 communicates with the sensor 708 to receive the object data that is generated based on the reflected-portion of the second bandwidth.
- the object tracking engine 720 then analyzes the object data to determine one or more characteristics of the object such as, for example, a depth of the object with respect to the optical system 702 , an orientation of the object with respect to the optical system 702 , a velocity and/or acceleration of the object with respect to the optical system 702 , or any other desired characteristic of the object.
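- The disclosure does not specify how such characteristics are computed; as one hedged example, a speed estimate could be derived from two successive tracked positions of the object relative to the optical system, as sketched below with assumed sample values.

```python
import math

def estimate_speed(p0, t0, p1, t1):
    """Estimate an object's speed (position units per second) from two tracked samples.

    p0 and p1 are (x, y, z) positions relative to the optical system at times t0 and t1 (seconds).
    """
    dt = t1 - t0
    if dt <= 0:
        raise ValueError("timestamps must be strictly increasing")
    return math.dist(p0, p1) / dt   # straight-line displacement divided by elapsed time

# An object that moved 0.5 m between samples taken 100 ms apart -> 5.0 m/s.
print(round(estimate_speed((0.0, 0.0, 2.0), 0.0, (0.0, 0.3, 2.4), 0.1), 3))
```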
- the components of the wearable device 700 are operatively connected, for example, via a bus 722 , which can include one or more of a system bus, a data bus, an address bus, a PCI bus, a Mini-PCI bus, and any variety of local, peripheral, and/or independent buses.
- the wearable device 700 may further include various other components, for example cameras (e.g., camera 724 ), microphones (e.g., microphone 726 ), accelerometers, gyroscopes, magnetometers, temperature sensors, touch sensors, biometric sensors, other image sensors, energy-storage components (e.g. battery), a communication facility, a GPS receiver, etc.
- the wearable device 700 can include one or more eye gaze sensors 728 .
- an eye gaze sensor 728 is user facing and is configured to track the position of at least one eye of a user.
- eye position data (e.g., determined via use of the eye gaze sensor 728), image data (e.g., determined via use of the camera 724), and/or other data can be processed to identify a gaze path of the user. That is, it can be determined that the user is looking at a particular section of a hardware display surface, a particular real-world object or part of a real-world object in the view of the user, and/or a rendered object or part of a rendered object displayed on a hardware display surface.
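- One simple, purely illustrative way to resolve a gaze path against displayed or tracked content is to test the gaze point against 2-D bounding boxes, as in the sketch below; the coordinate convention, box values, and item names are assumptions.

```python
def gazed_object(gaze_point, object_boxes):
    """Return the name of the first object whose bounding box contains the gaze point.

    gaze_point is (x, y) in normalized display coordinates; object_boxes maps an object
    name to (x_min, y_min, x_max, y_max) in the same coordinate space.
    """
    gx, gy = gaze_point
    for name, (x0, y0, x1, y1) in object_boxes.items():
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return name
    return None

boxes = {
    "piece of art": (0.10, 0.20, 0.35, 0.60),
    "record player": (0.55, 0.40, 0.80, 0.70),
}
print(gazed_object((0.25, 0.45), boxes))   # -> 'piece of art'
```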
- the wearable device 700 can include an actuator 729 .
- the processing units 712 can cause the generation of a haptic signal associated with a generated haptic effect to actuator 729 , which in turn outputs haptic effects such as vibrotactile haptic effects, electrostatic friction haptic effects, or deformation haptic effects.
- Actuator 729 includes an actuator drive circuit.
- the actuator 729 may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), a linear resonant actuator (“LRA”), a piezoelectric actuator, a high bandwidth actuator, an electroactive polymer (“EAP”) actuator, an electrostatic friction display, or an ultrasonic vibration generator.
- wearable device 700 can include one or more additional actuators 729 .
- the actuator 729 is an example of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, electrostatic friction haptic effects, or deformation haptic effects, in response to a drive signal.
- the actuator 729 can be replaced by some other type of haptic output device.
- in some configurations, wearable device 700 may not include actuator 729; instead, a separate device includes an actuator, or other haptic output device, that generates the haptic effects, and wearable device 700 sends the generated haptic signals to that device through a communication device.
- the processing unit(s) 712 can represent, for example, a CPU-type processing unit, a GPU-type processing unit, a field-programmable gate array (“FPGA”), another class of digital signal processor (“DSP”), or other hardware logic components that may, in some instances, be driven by a CPU.
- illustrative types of hardware logic components that can be used include Application-Specific Integrated Circuits (“ASICs”), Application-Specific Standard Products (“ASSPs”), System-on-a-Chip Systems (“SOCs”), Complex Programmable Logic Devices (“CPLDs”), etc.
- computer-readable media, such as computer-readable media 714, can also store instructions executable by external processing units such as an external CPU, an external GPU, and/or an external accelerator, such as an FPGA-type accelerator, a DSP-type accelerator, or any other internal or external accelerator.
- at least one CPU, GPU, and/or accelerator is incorporated in a computing device, while in some examples one or more of a CPU, GPU, and/or accelerator is external to a computing device.
- the wearable device 700 is configured to interact, via network communications, with a network device (e.g., a network server or a cloud server) to implement the configurations described herein. For instance, the wearable device 700 may collect data and send the data over network(s) to the network device. The network device may then implement some of the functionality described herein. Subsequently, the network device can cause the wearable device 700 to display an item and/or instruct the wearable device 700 to perform a task.
- Computer-readable media can include computer storage media and/or communication media.
- Computer storage media can include one or more of volatile memory, nonvolatile memory, and/or other persistent and/or auxiliary computer storage media, removable and non-removable computer storage media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data.
- computer storage media includes tangible and/or physical forms of media included in a device and/or hardware component that is part of a device or external to a device, including but not limited to random access memory (“RAM”), static random-access memory (“SRAM”), dynamic random-access memory (“DRAM”), phase change memory (“PCM”), read-only memory (“ROM”), erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), flash memory, rotating media, optical cards or other optical storage media, magnetic storage, magnetic cards or other magnetic storage devices or media, solid-state memory devices, storage arrays, network attached storage, storage area networks, hosted computer storage or any other storage memory, storage device, and/or storage medium that can be used to store and maintain information for access by a computing device.
- communication media can embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transmission mechanism.
- computer storage media does not include communication media. That is, computer storage media does not include communications media consisting solely of a modulated data signal, a carrier wave, or a propagated signal, per se.
- the wearable device 108 can also be configured to use network communications to interact with an e-commerce provider of an electronic marketplace.
- the e-commerce provider creates and maintains catalog(s) of items.
- the items can be bought and/or sold by registered users and/or merchants.
- the e-commerce provider can comprise resources to collect and store information related to an item, to display the information related to the item to a potential buyer, to conduct online auctions of an item, to match a buyer of an item with a seller of the item, to process a transaction, etc.
- FIG. 8 shows additional details of an example computer architecture for a computer capable of executing the functionalities described herein such as, for example, those described with reference to FIGS. 1-7 , or any program components thereof as described herein.
- the computer architecture 800 illustrated in FIG. 8 is an architecture for a server computer, a network of server computers, or any other type of computing device suitable for implementing the functionality described herein.
- the computer architecture 800 may be utilized to execute any aspects of the software components presented herein, such as software components for implementing the e-commerce system 102 .
- the computer architecture 800 illustrated in FIG. 8 includes a central processing unit 802 (“CPU”), a system memory 804 , including a random-access memory 806 (“RAM”) and a read-only memory (“ROM”) 808 , and a system bus 810 that couples the memory 804 to the CPU 802 .
- the computer architecture 800 further includes a mass storage device 812 for storing an operating system 814 , other data, and one or more application programs.
- the mass storage device 812 may store the item catalog 104 and/or the user's item inventory 114 .
- the mass storage device 812 is connected to the CPU 802 through a mass storage controller (not shown) connected to the bus 810 .
- the mass storage device 812 and its associated computer-readable media provide non-volatile storage for the computer architecture 800 .
- computer-readable media can be any available computer storage media or communication media that can be accessed by the computer architecture 800 .
- the computer architecture 800 may operate in a networked environment using logical connections to remote computers through a network 850 .
- the computer architecture 800 may connect to the network 850 through a network interface unit 816 connected to the bus 810 . It should be appreciated that the network interface unit 816 also may be utilized to connect to other types of networks and remote computer systems.
- the computer architecture 800 also may include an input/output controller 818 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus. Similarly, the input/output controller 818 may provide output to a display screen, a printer, or other type of output device. It should also be appreciated that a computing system can be implemented using the disclosed computer architecture 800 to communicate with other computing systems.
- the software components described herein may, when loaded into the CPU 802 and executed, transform the CPU 802 and the overall computer architecture 800 from a general-purpose computing system into a special-purpose computing system customized to facilitate the functionality presented herein.
- the CPU 802 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 802 may operate as a finite-state machine, in response to executable instructions contained within the software modules disclosed herein. These computer-executable instructions may transform the CPU 802 by specifying how the CPU 802 transitions between states, thereby transforming the transistors or other discrete hardware elements constituting the CPU 802 .
- Encoding the software modules presented herein also may transform the physical structure of the computer-readable media presented herein.
- the specific transformation of physical structure may depend on various factors, in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the computer-readable media, whether the computer-readable media is characterized as primary or secondary storage, and the like.
- when the computer-readable media is implemented as semiconductor-based memory, the software disclosed herein may be encoded on the computer-readable media by transforming the physical state of the semiconductor memory.
- the software may transform the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory.
- the software also may transform the physical state of such components in order to store data thereupon.
- the computer-readable media disclosed herein may be implemented using magnetic or optical technology.
- the software presented herein may transform the physical state of magnetic or optical media, when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. These transformations also may include altering the physical features or characteristics of particular locations within given optical media, to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.
- the computer architecture 800 may include other types of computing devices, including smartphones, embedded computer systems, tablet computers, other types of wearable computing devices, and other types of computing devices known to those skilled in the art. It is also contemplated that the computer architecture 800 may not include all of the components shown in FIG. 8 , may include other components that are not explicitly shown in FIG. 8 , or may utilize an architecture completely different than that shown in FIG. 8 .
- Example Clause A a method comprising: obtaining, by a wearable device, image data from an optical device configured to scan a real-world environment; recognizing, based on the image data, an item that exists in the real-world environment; adding the item to an inventory of items associated with the real-world environment; accessing, by one or more processors, an item catalog to determine an estimated current value for the item; determining that the estimated current value is more than a threshold value; in response to determining that the estimated current value is more than the threshold value, displaying the estimated current value in association with the item on a display device of the wearable device; configuring a control that enables a user of the wearable device to list the item for sale in the item catalog; receiving user input that activates the control; in response to receiving the user input that activates the control: capturing a photo of the item; and causing a listing for the item to be submitted to the item catalog, the listing including the photo and contact information for the user of the wearable device.
- Example Clause B the method of Example Clause A, further comprising calculating the threshold value as a predetermined percentage of an average sales price of the item over a predefined period of time.
- Example Clause C the method of Example Clause B, wherein the predetermined percentage is defined by an electronic commerce system that maintains the item catalog.
- Example Clause D the method of Example Clause B, wherein the predetermined percentage is defined by the user of the wearable device.
- Example Clause E the method of Example Clause B, wherein the predetermined percentage is defined for a category of items to which the item belongs.
- Example Clause F the method of any one of Example Clauses A through E, wherein accessing the item catalog to determine the estimated current value is based on a determination that the item has not been used in a predefined period of time.
- Example Clause G the method of any one of Example Clauses A through E, wherein accessing the item catalog to determine the estimated current value is based on a determination that the item has been used a number of times in a predefined period of time, the number of times being less than a threshold number of times established for infrequent use.
- Example Clause H a device comprising: one or more processors; and a memory in communication with the one or more processors, the memory having computer-readable instructions stored thereupon which, when executed by the one or more processors, cause the one or more processors to: scan a real-world environment to obtain image data; recognize, based on the image data, an item that exists in the real-world environment; add the item to an inventory of items associated with the real-world environment; access an item catalog to determine an estimated current value for the item; determine that the estimated current value is more than a threshold value; and in response to determining that the estimated current value is more than the threshold value, display the estimated current value in association with the item on a display device.
- Example Clause I the device of Example Clause H, wherein the computer-readable instructions further cause the one or more processors to calculate the threshold value as a predetermined percentage of an average sales price of the item over a predefined period of time.
- Example Clause J the device of Example Clause I, wherein the predetermined percentage is defined by an electronic commerce system that maintains the item catalog.
- Example Clause K the device of Example Clause I, wherein the predetermined percentage is defined by the user of the wearable device.
- Example Clause L the device of Example Clause I, wherein the predetermined percentage is defined for a category of items to which the item belongs.
- Example Clause M the device of any one of Example Clauses H through L, wherein accessing the item catalog to determine the estimated current value is based on a determination that the item has not been used in a predefined period of time.
- Example Clause N the device of any one of Example Clauses H through M, wherein the threshold value is adjusted based on contextual data, the contextual data comprising one or more of a frequency of use of the item, a condition of the item, or a sales history for a user of the device.
- Example Clause O the device of Example Clause H, wherein the threshold value is defined by a user setting of the device.
- Example Clause P the device of any one of Example Clauses H through O, wherein the computer-readable instructions further cause the one or more processors to: configure a control that enables a user to list the item for sale in the item catalog; receive user input that activates the control; and in response to receiving the user input that activates the control cause a listing for the item to be submitted to the item catalog, the listing including a portion of the image data that captures the item and contact information for the user.
- Example Clause Q a method comprising: recognizing, based on image data obtained by a wearable device, an item that exists in the real-world environment; adding the item to an inventory of items associated with the real-world environment; accessing, by one or more processors, information associated with an item catalog to determine that a number of sales of the item during a recent predefined period of time is greater than a threshold number of sales; based on the number of sales of the item during the recent predefined period of time being greater than the threshold number of sales, determining an estimated current value for the item; and displaying the estimated current value in association with the item on a display device of the wearable device.
- Example Clause R the method of Example Clause Q, wherein the threshold number of sales is established as a predetermined percentage of a number of sales of the item during a predefined period of time that precedes the recent predefined period of time.
- Example Clause S the method of Example Clause Q or Example Clause R, further comprising: receiving user input that instructs the wearable device to list the item for sale in the catalog of items; in response to receiving the user input: capturing a photo of the item; and causing a listing for the item to be submitted to the catalog of items, the listing including the photo and contact information for the user.
- Example Clause T the method of any one of Example Clause Q through S, further comprising receiving user input to enter an operation mode that enables the estimated current value to be displayed.
- any reference to “first,” “second,” etc. users or other elements within the Summary and/or Detailed Description is not intended to and should not be construed to necessarily correspond to any reference of “first,” “second,” etc. elements of the claims. Rather, any use of “first” and “second” within the Summary and/or Detailed Description may be used to distinguish between two different instances of the same element (e.g., two different users, two different items, etc.).
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Accounting & Taxation (AREA)
- Finance (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Development Economics (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- General Business, Economics & Management (AREA)
- Marketing (AREA)
- Entrepreneurship & Innovation (AREA)
- Game Theory and Decision Science (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Data Mining & Analysis (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Mathematical Physics (AREA)
- Quality & Reliability (AREA)
- Human Resources & Organizations (AREA)
- Tourism & Hospitality (AREA)
- Operations Research (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Artificial Intelligence (AREA)
- Geometry (AREA)
- Library & Information Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- This application claims the benefit of and priority to U.S. Provisional Application No. 62/588,189, filed Nov. 17, 2017 and entitled “Augmented Reality, Mixed Reality, and Virtual Reality Experiences,” the entire contents of which are incorporated herein by reference.
- Users often own a large number of items. Some of these items may be valuable items that are marketable and that are capable of providing a return to the users (e.g., other users may have an interest in purchasing the item). In many cases, users may be unaware of the value of their items and may even forget that they actually own certain items. Consequently, users often miss out on opportunities to receive a return (e.g., a monetary return) on an item with which they are willing to part. In some cases, this item may be an item that they no longer use or that they rarely use.
- A wearable device has the ability to display virtual content to a user, in an augmented reality (“AR”) environment, as the user goes about a day's activities within a household or another type of environment in which items owned by the user are present. As use of wearable devices becomes more prevalent, it has become difficult to effectively identify information, associated with a user's items, that is of interest to the user and to display relevant virtual content representative of the information. Consequently, the user can spend a considerable amount of time sorting through and/or understanding virtual content that is of no interest, or little interest, to the user. This may unnecessarily utilize computing resources such as processing cycles, memory, and network bandwidth. Moreover, this might result in inadvertent or incorrect user input to the wearable device rendering virtual content in an immersive real-world environment, which can also unnecessarily utilize computing resources such as processing cycles, memory, and network bandwidth.
- It is with respect to these and other technical challenges that the disclosure made herein is presented.
- The techniques described herein identify, render, and display relevant virtual content as a user casually wears a wearable device while performing a day's activities within a household or another type of environment in which items owned by the user are present. The relevant content is content in which the user is interested. By identifying virtual content in which the user is interested, a wearable device avoids the aimless display of various content in which the user is not interested. In this way, the disclosed technologies tangibly improve computing efficiencies with respect to a wide variety of computing resources that would otherwise be consumed and/or utilized by improving human-computer interaction and by reducing the amount of processing cycles and storage required.
- The virtual content can be associated with items that are present in a real-world environment, and the virtual content rendered for display is relevant in the sense that the virtual content is likely to be of interest to the user. For example, the virtual content can include an estimated current value of an item that, for various reasons, has become a popular item. That is, the item is in higher demand than it was previously, and there are more potential buyers of the item than there are potential sellers.
- Aspects of the technologies disclosed herein can be implemented by a wearable device, such as an augmented reality (“AR”) device. For example, a user of such a device might provide input indicating an interest to enter or activate a mode enabling the techniques described herein to be implemented. Moreover, the wearable device may communicate with a system, over a network, to implement the techniques described herein.
- A wearable device described herein can recognize items owned by the user. For instance, the items may be items found in a kitchen of a user's house, an office space at the user's place of work, a workshop in a user's garage, etc. Based on the recognition, the wearable device is configured to create and maintain an inventory of items owned by a user. Accordingly, when the wearable device recognizes an item, the wearable device can check the inventory of items to determine if the recognized item has already been inventoried. If not, the wearable device adds the recognized item to the inventory of items. This process can be implemented over a period of time in order to continuously maintain and update the inventory of items.
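- A minimal sketch of that check-then-add behavior is shown below, keyed on a per-environment item identifier. The class name, the keying scheme, and the sample values are illustrative assumptions, not the disclosed implementation.

```python
from collections import defaultdict

class ItemInventory:
    """Per-environment inventory of recognized items."""

    def __init__(self):
        self._environments = defaultdict(dict)   # environment name -> {item_id: description}

    def record_sighting(self, environment, item_id, description):
        """Add the item only if it has not already been inventoried for this environment."""
        newly_added = item_id not in self._environments[environment]
        self._environments[environment].setdefault(item_id, description)
        return newly_added

inventory = ItemInventory()
print(inventory.record_sighting("living room", "record-player-01", "vintage record player"))  # True
print(inventory.record_sighting("living room", "record-player-01", "vintage record player"))  # False, already inventoried
```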
- The inventory of items can include some or all items owned by a user. Moreover, the inventory of items can be organized so that the items owned by the user are sorted based on different environments in which they are present (e.g., a home inventory, a work or office inventory, a vacation place inventory, a kitchen inventory, a garage inventory, a master bedroom inventory, a secondary bedroom, etc.).
- A wearable device described herein is further configured to access an item catalog to determine an estimated current value of an item. The item catalog includes current listings of items for sale, and also information associated with transactions during which items are exchanged between a seller and a buyer. This information can include a number of sales of an item, a price for each sale, a buyer identification, a seller identification, item characteristics, etc.
- In one example, the estimated current value of an item owned by a user can be the most recent price at which the same item or a similar item is sold via the item catalog. In another example, the estimated current value of an item owned by a user can be an average price at which the same item and/or similar items are sold using multiple recent sales (e.g., the average price of a predetermined number of sales such as the last five item sales or the last ten item sales). Once the estimated current value is determined, the wearable device can generate and display the estimated current value of the item so the user is made aware of this information. This information can be useful to a user if the user is thinking about and/or willing to sell the item.
- In various embodiments, as a precondition to displaying the estimated current value of the item, the wearable device is configured to compare the estimated current value to a threshold value. In one example, the wearable device may use a policy to establish the threshold value for an item. The policy may calculate the threshold value as a predetermined percentage of an average sales price of an item. The average sales price may be calculated for a predefined period of time (e.g., the last week, the last month, the last three months, etc.). Thus, if the average sales price of the item for the last three months is $100, and the predetermined percentage used to calculate the threshold value is one hundred forty percent (140%), then the threshold value is $140.
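- In code form, the estimate and the threshold described above might look like the sketch below; the function names are illustrative and the sample sale prices are invented, but the $100 at 140% arithmetic mirrors the worked example in the preceding paragraph.

```python
def estimated_current_value(recent_sale_prices, use_last_n=5):
    """Estimate an item's current value as the average of its most recent sale prices."""
    recent = recent_sale_prices[-use_last_n:]
    return sum(recent) / len(recent)

def threshold_value(average_sales_price, predetermined_percentage=140):
    """Threshold = a predetermined percentage of the average sales price."""
    return average_sales_price * predetermined_percentage / 100

print(threshold_value(100.0))                                   # 140.0, as in the example above
print(estimated_current_value([90, 95, 110, 120, 160, 185]))    # 134.0, the mean of the last five sales
```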
- Using the inventory of items, the wearable device is configured to track or check the estimated current value of the item over a period of time as the price fluctuates based on market conditions. If the estimated current value meets or exceeds the threshold value, the wearable device can generate and display a notification informing the user of the recent price increase. The notification can be displayed while the user is located in an environment in which the item is present.
- Consequently, the techniques described herein can inform the user of ownership of a “hot”, or popular, item that is in demand on an electronic commerce (e-commerce) site. Demand for an item can spike in response to an occurrence of a particular event. For example, a recently announced change in a law that will go into effect in the near future may spike demand for an item (e.g., the law may limit sales of the item). Thus, the estimated current value of the item may increase dramatically.
- In another example, the death of a celebrity (e.g., a musician) may spike demand for an item associated with the late celebrity (e.g., old musical records that are no longer produced). Thus, the estimated current value of the item may increase dramatically. In yet another example, a recent accolade or achievement by a celebrity such as a sports star (e.g., MVP of a big game) may spike demand for an item associated with the celebrity (e.g., signed memorabilia such as a football, a jersey, a helmet, etc.). Again, this type of event may cause the estimated current value of the item to increase dramatically. Accordingly, the techniques described herein can help people take advantage of a recent event that causes the value of an item they own to increase dramatically. If a user is willing to sell such an item, the user may realize a monetary return that they may not have realized prior to the event occurring.
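- One hedged way to flag such a spike in code is to compare the number of recent sales of an item against a percentage of its sales in a preceding period; the example clauses later in this disclosure describe a threshold number of sales defined in this manner, and the 200% figure below is an assumed value.

```python
def is_demand_spike(recent_period_sales, preceding_period_sales, percentage=200):
    """Treat an item as spiking if recent sales exceed a percentage of the prior period's sales."""
    threshold_number_of_sales = preceding_period_sales * percentage / 100
    return recent_period_sales > threshold_number_of_sales

# e.g., 30 sales this month versus 10 sales the month before: 30 > 20, so the item is spiking.
print(is_demand_spike(30, 10))   # True
```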
- In some instances, the threshold value can be established based on contextual data. For example, the contextual data can comprise a condition of the item. The wearable device can recognize item characteristics that are indicative of the condition of the item (e.g., excellent, good, fair, poor, etc.) and adjust a threshold value based on the condition of the item. More specifically, the threshold value may be increased if the item is in excellent condition (e.g., shows little wear and tear, is not chipped or broken, etc.) and the threshold value may be decreased if the item is in poor condition (e.g., shows a large amount of wear and tear, is chipped or broken, etc.).
- In another example, the contextual data can comprise a sales history for the user. For instance, if a number of previous sales associated with a user account indicate that the user sells items at higher prices and rarely adjusts the prices lower, then the threshold value can be adjusted higher. In contrast, if a number of previous sales associated with a user account indicate that the user sells items at lower prices (e.g., commonly discounts the sales prices to move an item quickly), then the threshold value can be adjusted lower.
- In a further example, the contextual data can comprise a total number of sales for the user over a period of time. That is, the user may have previously indicated that he or she would like to sell a predetermined number of items for a defined period of time (e.g., two items per month). Provided that the user has not sold the predetermined number of items and the predefined time period is about to expire or end, then the threshold value can be adjusted lower so that the user is presented with more selling opportunities as he or she looks around an environment filled with items he or she owns. Furthermore, if the user is close to meeting or has already met the predetermined number of items to sell for the defined period of time, then the threshold value can be adjusted higher.
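- Pulling the three contextual signals above together, a threshold adjustment could be sketched as below. The multipliers are arbitrary placeholders chosen for illustration; the disclosure only says that the threshold moves up or down in response to these signals.

```python
def adjust_threshold(base_threshold, condition, discounts_often, quota_remaining, period_ending_soon):
    """Nudge a base threshold using item condition, the user's sales history, and the selling goal."""
    factor = 1.0
    if condition == "excellent":
        factor *= 1.1            # pristine items warrant a higher bar
    elif condition == "poor":
        factor *= 0.9            # worn items warrant a lower bar
    factor *= 0.9 if discounts_often else 1.1   # discount-prone sellers get more selling prompts
    if quota_remaining > 0 and period_ending_soon:
        factor *= 0.9            # behind on the selling goal near the end of the period
    elif quota_remaining <= 0:
        factor *= 1.1            # goal already met, so raise the bar
    return base_threshold * factor

print(round(adjust_threshold(140.0, "poor", True, 2, True), 2))   # 140 * 0.9 * 0.9 * 0.9 = 102.06
```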
- In some embodiments, the estimated current value of an item can be determined based on item characteristics shared between different items. That is, an item the user owns may not be the exact same as one or more items used to determine the estimated current value, but they may be similar in that they share one or more item characteristics that contribute to an increase in a value. For example, a shared item characteristic can include a particular manufacturer or producer that is going out of business, and thus, not making items anymore. Consequently, different types of items manufactured or produced by the same company may increase in value.
- In another example, a shared item characteristic can include an authenticated signature of a celebrity (e.g., an estimated current price for a player's signed football helmet can be determined based on a recent sales price of the player's signed jersey). Therefore, the technologies described herein can recognize characteristics of a particular item owned by the user and determine an estimated current value for the particular item based on sales of the same item and/or sales of similar items, where an item is similar if it contains a shared item characteristic that contributes to an increase in value.
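- A toy version of that similarity test is sketched below: an owned item borrows a comparable sale's price as a basis for its estimate when the two share a characteristic known to drive value. The characteristic strings, the "player X" placeholder, and the sale price are invented for illustration.

```python
def shares_value_driver(owned_item, sold_item, value_driving_characteristics):
    """Return True if the two items share at least one characteristic that drives value."""
    shared = set(owned_item["characteristics"]) & set(sold_item["characteristics"])
    return bool(shared & value_driving_characteristics)

value_drivers = {"authenticated signature of player X", "manufacturer out of business"}
helmet = {"name": "signed helmet", "characteristics": {"authenticated signature of player X", "football gear"}}
jersey_sale = {"name": "signed jersey", "characteristics": {"authenticated signature of player X", "apparel"}, "price": 850.0}

if shares_value_driver(helmet, jersey_sale, value_drivers):
    print(f"Use ${jersey_sale['price']:.0f} as a basis for the helmet's estimated current value.")
```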
- In further embodiments, the wearable device can be configured to determine and/or track use characteristics of an item in the inventory. For example, as part of maintaining an inventory of items, the wearable device can determine a total number of uses of an item, a frequency of use of an item, etc. The total number of uses of an item and/or the frequency of use may be indicative of whether the user is still interested in owning the item. For example, if the user never or rarely uses the item and the item is not a collectible item, then the user likely has little interest in owning the item. Consequently, the user is likely more willing to part with, or sell, the item. Accordingly, using the inventory of items, the wearable device can analyze the total number of uses and/or the frequency of use to determine that an item is rarely used and/or likely no longer of interest to the user. The wearable device can display the estimated current value and recommend that the user sell the item that is rarely used.
- The use of an item may be determined based on whether the item has been moved from one location in an environment to another location in the same environment or in a different environment. Over time, the wearable device can track a number of times an item has been moved. For instance, if the wearable device determines that a user stores or places golf clubs in different locations, then the wearable device can store information indicating that the user still plays golf.
- In some examples, the threshold value described above can be established based on use characteristics (e.g., use characteristics can be part of the contextual data). For instance, if a user never or rarely uses an item, then the threshold value can be adjusted lower. If the user often uses the item (e.g., the user enjoys the item), then the threshold value can be adjusted higher.
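- As one more illustrative sketch, the use-based adjustment could be as simple as counting how often the item was observed being used or moved within a recent window; the cutoff and multipliers below are assumptions rather than values taken from the disclosure.

```python
def use_adjusted_threshold(base_threshold, uses_in_window, frequent_use_cutoff=3):
    """Lower the bar for rarely used items; raise it for items the user clearly still uses."""
    if uses_in_window == 0:
        return base_threshold * 0.8    # never used recently: surface selling opportunities sooner
    if uses_in_window < frequent_use_cutoff:
        return base_threshold * 0.9    # rarely used
    return base_threshold * 1.1        # frequently used: only notify on a large price jump

print(round(use_adjusted_threshold(140.0, 0), 2))   # 112.0 with the assumed 0.8 multiplier
```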
- In additional embodiments, the wearable device can be used as an effective mechanism to efficiently list the item for sale. That is, a user can look at an item he or she owns while wearing the wearable device, see an estimated current value, and provide input for the wearable device to capture a photo of the item. In response to the input, the wearable device can create a listing for the item in an e-commerce site.
- In some scenarios, the listing may be posted to a "secondary" marketplace of the e-commerce site that includes items that a user is willing to sell without having to spend a considerable amount of time to complete a full listing for a "primary" marketplace. For example, an item listing in the secondary marketplace is a limited listing that may only include a photo of an item, a seller's identification, and/or a piece of contact information (e.g., an email, a phone number, etc.). In contrast, to complete a full listing, an e-commerce system may require a seller to submit much more information such as an item title, an item category, an item description, item specifications, a price, multiple images, etc. before the full listing is posted in an item catalog. Some potential buyers may prefer to browse through the secondary marketplace to find a photo that captures an item of interest.
- If found, a potential buyer can contact the seller and make an offer to purchase the item. Consequently, by using the wearable device and the secondary marketplace, a minimal amount of work is required by the seller to submit an item listing to an e-commerce site, where the item listing corresponds to an item the user is willing to sell (e.g., due to a recent spike in demand and price) but does not necessarily need to sell.
- The disclosed technologies improve a user experience by identifying relevant opportunities to display virtual content that is of interest to a user in a three-dimensional immersive environment as the user goes about a day's activities. That is, using the techniques described herein, a user can look at an item he or she owns, see an estimated current value, and provide input for the wearable device to submit a listing with a limited amount of information to a marketplace. This can all be done in an efficient manner without interrupting the user's activities. Moreover, the wearable device avoids the aimless display of various content in which the user is not interested. In this way, the disclosed technologies tangibly improve computing efficiencies with respect to a wide variety of computing resources that would otherwise be consumed and/or utilized by improving human-computer interaction and by reducing the amount of processing cycles and storage required by previous solutions. Technical benefits other than those specifically identified herein might also be realized through implementations of the disclosed technologies.
- It should be appreciated that the above-described subject matter can be implemented as a computer-controlled apparatus, a computer-implemented method, a computing device, or as an article of manufacture such as a computer-readable medium. These and various other features will be apparent from a reading of the following Detailed Description and a review of the associated drawings.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended that this Summary be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
- The Detailed Description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same reference numbers in different figures indicate similar or identical items.
- FIG. 1 illustrates aspects of an exemplary computing environment in which a wearable device and/or a system can create and maintain an inventory of items owned by a user and access an item catalog to determine an estimated current value of an item owned by the user.
- FIG. 2 illustrates example marketplaces of an item catalog to which a listing for an item can be submitted.
- FIG. 3 illustrates an example where a user is looking at a portion of a real-world environment (e.g., a living room) while wearing a wearable device configured to recognize existing items, add the existing items to an item inventory, determine estimated current values for the existing items, and/or display the estimated current values in association with the existing items.
- FIG. 4 illustrates an example where a user can provide input to submit a listing to a secondary marketplace.
- FIG. 5 is a flow diagram that illustrates an example process describing aspects of the technologies disclosed herein for displaying an estimated current value of an item in a real-world environment via a wearable device.
- FIG. 6 is a flow diagram that illustrates an example process describing aspects of the technologies disclosed herein for listing an item for sale via an item catalog (e.g., an e-commerce site).
- FIG. 7 shows an illustrative configuration of a wearable device capable of implementing aspects of the technologies disclosed herein.
- FIG. 8 illustrates additional details of an example computer architecture for a computer capable of implementing aspects of the technologies described herein.
- This Detailed Description describes identifying, rendering, and displaying relevant virtual content as a user casually wears a wearable device while performing a day's activities within a household or another type of environment in which items owned by the user are present. The virtual content can be associated with the items that are present in a real-world environment, and the virtual content rendered for display is relevant in the sense that the virtual content is likely to be of interest to the user.
- A wearable device described herein can recognize items owned by the user. For instance, the items may be items found in a kitchen of a user's house, an office space at the user's place of work, a workshop in a user's garage, etc. Based on the recognition, the wearable device is configured to create and maintain an inventory of items owned by a user. The wearable device can also access an item catalog to determine an estimated current value of an item. Once the estimated current value is determined, the wearable device can generate and display the estimated current value of the item so the user is made aware of this information. This information can be useful to a user if the user is thinking about and/or willing to sell the item.
- Referring now to the FIGURES, technologies for determining and displaying an estimated current value of an item to a user will be described.
-
FIG. 1 illustrates aspects of anexemplary computing environment 100 in which a wearable device and/or a system can create and maintain an inventory of items owned by a user and access an item catalog to determine an estimated current value of an item owned by the user. As illustrated, the exemplary system may comprise an electronic commerce (“e-commerce”)system 102 that includes anitem catalog 104 where users and/or merchants can list real-world items for sale. A real-world item can be any type of item including, but not limited to, electronics, home goods, automobiles or automotive parts, clothing, musical instruments, art, jewelry, and so forth. In various examples, thee-commerce system 102 can be implemented on one or more server computers operating in conjunction with of an e-commerce site. - A
user 106 can utilize awearable device 108, such as that described in further detail below with respect toFIG. 7 , to obtainimage data 110 of the real-world environment in which theuser 106 is currently located. For instance, thewearable device 108 can include an optical device configured to scan the real-world environment of theuser 106 to obtain the image data 110 (e.g., recognize objects in the real-world environment). In various examples, theimage data 110 of the real-world environment includes recognizable existingitems 112 that are physically present in the real-world environment. In some embodiments, thewearable device 108 can send theimage data 110 to thee-commerce system 102 and thee-commerce system 102 can recognize the existingitems 112. - The
wearable device 108 and/or thee-commerce system 102 are configured to create and maintain an inventory of items owned and/or possessed by the user (i.e., the item inventory 114). Theitem inventory 114 can be maintained in association with a user account. Upon recognizing anitem 112, thewearable device 108 is configured to check theitem inventory 114 to determine if the recognized item has already been added. If not, thewearable device 108 adds the recognized item to theitem inventory 114. - In one example, the
user 106 can activate an “item inventory” operation mode for thewearable device 108, and based on the activation, the inventorying process can continually be implemented over a period of time in order to ensure theitem inventory 114 is accurate. Theitem inventory 114 can include all items owned and/or possessed by theuser 106 and/or a group of people associated with the user 106 (e.g., other family members). Moreover, theitem inventory 114 can be organized so that the items owned and/or possessed by theuser 106 are sorted based on different real-world environments in which they are present (e.g., a home inventory, a work or office inventory, a vacation place inventory, a kitchen inventory, a garage inventory, a master bedroom inventory, a secondary bedroom, etc.). - The
wearable device 108, via one or more network(s) 116 and thee-commerce system 102, is further configured to access theitem catalog 104 to determine an estimatedcurrent value 118 of an item in theitem inventory 114. The item may be anitem 112 located in the real-world environment in which theuser 106 is currently located. As described above, theitem catalog 104 includescurrent listings 120 of items for sale via an e-commerce site operated by thee-commerce system 102, for example. Theitem catalog 104 may also storetransaction information 122 associated with exchanges of items between sellers and buyers. Thetransaction information 122 can include a number of sales of an item, a price for each sale, a buyer identification, a seller identification, item characteristics, etc. - The estimated
current value 118 of anitem 112 can be the most recent price at which the same item or a similar item is sold via theitem catalog 104. Alternatively, the estimatedcurrent value 118 of anitem 112 can be an average price at which the same item and/or similar items are sold using multiple recent sales (e.g., the average price of a predetermined number of sales such as the last five item sales or the last ten item sales). - Once the estimated
current value 118 is determined, an inventory value notification tool 124 (e.g., a software component or module) of thewearable device 108 can compare the estimatedcurrent value 118 to athreshold value 126. If the estimated current value 118 (e.g., $160) is greater than the threshold value 126 (e.g., $140), the inventoryvalue notification tool 124 can display the estimatedcurrent value 118 of theitem 112 in the user's view of the real-world environment so theuser 106 is made aware of this information. This information can be useful to auser 106 if theuser 106 is thinking about and/or willing to sell theitem 112. - In some examples, the
user 106 can specifically define thethreshold value 126 at which he or she is willing to sell anitem 112. Based on market conditions that cause the price of theitem 112 to fluctuate (e.g., increase and/or decrease over a period of time), theuser 106 can be notified when the estimatedcurrent value 118 of theitem 112 is more than thethreshold value 126 defined by theuser 106. - In alternative examples, the
threshold value 126 can be established by thee-commerce system 102 and/or thewearable device 108. For instance, thethreshold value 126 can be calculated as a predetermined percentage of an average sales price of an item. The average sales price may be calculated for a predefined period of time (e.g., the last week, the last month, the last three months, etc.). Thus, if the average sales price of the item for the last three months is $100, and the predetermined percentage used to calculate the threshold value is one hundred and forty percent (140%), then the threshold value is $140. - The predetermined percentage is typically more than one hundred percent (e.g., 120%, 150%, 200%, etc.) and can be established by the
e-commerce system 102 and/or the wearable device 108 to determine when a price increase is substantial. Further, the e-commerce system 102 and/or the wearable device 108 can define different predetermined percentages for different categories of items, and use the appropriate predetermined percentage to determine a threshold value 126 based on a category to which the item 112 belongs. This accounts for the possibility that a substantial price increase for one category of items may be different than a substantial price increase for another category of items. In some instances, the user 106 can define the predetermined percentage used to calculate the threshold value 126 for the items 112 the user owns and/or possesses.
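- The threshold arithmetic described above can be sketched as follows; only the $100 average and the 140% figure come from the text, while the per-category percentages are hypothetical placeholders.

```python
# Hypothetical per-category percentages; the 140% default mirrors the
# "$100 average sales price -> $140 threshold" example above.
CATEGORY_PERCENTAGES = {"collectibles": 150, "electronics": 120}
DEFAULT_PERCENTAGE = 140


def threshold_value(sale_prices_in_period, category=None):
    """Compute a threshold as a predetermined percentage of the average
    sales price over a predefined period (e.g., the last three months)."""
    average_price = sum(sale_prices_in_period) / len(sale_prices_in_period)
    percentage = CATEGORY_PERCENTAGES.get(category, DEFAULT_PERCENTAGE)
    return average_price * percentage / 100


# An average sales price of $100 at 140% yields a $140 threshold.
print(threshold_value([90, 100, 110]))  # 140.0
```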
- An estimated current value 118 exceeding the threshold value 126 may be a condition for displaying relevant virtual content. For example, while wearing the wearable device 108, the user 106 may not want to know the estimated current prices of all the items in an immersive real-world environment, as this may be a distraction. Moreover, this can cause unnecessary consumption of computer resources. Instead, the user 106 may only desire to see estimated current prices that have recently increased dramatically. - Consequently, the
item 112 for which the estimated current value 118 is determined and/or displayed may be a popular item 128. A popular item 128 can be an item that is in high demand due to a recent event. For example, a recently announced change in a law that will go into effect in the near future may spike demand for an item (e.g., the law may limit sales of the item). In another example, the death of a celebrity (e.g., a musician) may spike demand for an item associated with the late celebrity (e.g., old musical records that are no longer produced). In yet another example, a recent accolade or achievement by a celebrity such as a sports star (e.g., MVP of a big game) may spike demand for an item associated with the celebrity (e.g., signed memorabilia such as a football, a jersey, a helmet, etc.). - Based on an occurrence of a recent unexpected event, the price of the
popular item 128 increases because there are many more potential buyers than there are potential sellers. Moreover, the popular item 128 may have experienced a recent increase in the number of sales, as referenced by 130. - In some instances, the
threshold value 126 can be established based on contextual data. For example, the contextual data can comprise a condition of the item. The wearable device can recognize item characteristics that are indicative of the condition of the item (e.g., excellent, good, fair, poor, etc.) and adjust a threshold value 126 based on the condition of the item. More specifically, the threshold value 126 may be increased if the item is in excellent condition (e.g., shows little wear and tear, is not chipped or broken, etc.) and the threshold value 126 may be decreased if the item is in poor condition (e.g., shows a large amount of wear and tear, is chipped or broken, etc.). - In another example, the contextual data can comprise a sales history for the user. For instance, if a number of previous sales associated with a user account indicate that the user sells items at higher prices and rarely adjusts the prices lower, then the
threshold value 126 can be adjusted higher. In contrast, if a number of previous sales associated with a user account indicate that the user sells items at lower prices (e.g., commonly discounts the sales prices to move an item quickly), then the threshold value 126 can be adjusted lower. - In a further example, the contextual data can comprise a total number of sales for the user over a period of time. That is, the user may have previously indicated that he or she would like to sell a predetermined number of items for a defined period of time (e.g., two items per month). Provided that the user has not sold the predetermined number of items and the predefined time period is about to expire or end, then the
threshold value 126 can be adjusted lower so that the user is presented with more selling opportunities as he or she looks around an environment filled with items he or she owns. Furthermore, if the user is close to meeting or has already met the predetermined number of items to sell for the defined period of time, then the threshold value 126 can be adjusted higher.
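- One way to combine the three contextual adjustments just described (item condition, the seller's pricing history, and progress toward a selling goal) is sketched below; the adjustment factors are hypothetical, since the text specifies only the direction of each adjustment, not its magnitude.

```python
def adjust_threshold(base_threshold, condition=None, sells_high=None,
                     items_sold=None, sales_goal=None, period_ending=False):
    """Adjust a base threshold value using contextual data."""
    threshold = base_threshold
    # Condition of the item: raised for excellent condition, lowered for poor.
    if condition == "excellent":
        threshold *= 1.10
    elif condition == "poor":
        threshold *= 0.90
    # Sales history: a seller who holds out for higher prices gets a higher
    # threshold; one who discounts to move items quickly gets a lower one.
    if sells_high is True:
        threshold *= 1.05
    elif sells_high is False:
        threshold *= 0.95
    # Selling goal: meeting the goal raises the threshold; being behind the
    # goal near the end of the period lowers it (more selling opportunities).
    if items_sold is not None and sales_goal is not None:
        if items_sold >= sales_goal:
            threshold *= 1.10
        elif period_ending:
            threshold *= 0.90
    return threshold


# A poor-condition item from a seller who discounts quickly gets a threshold
# lower than the 140.0 base value.
print(adjust_threshold(140.0, condition="poor", sells_high=False))
```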
- As an alternative to using threshold values to display estimated current values, the transaction information 122 in the item catalog 104 can be accessed to determine that a number of sales of the item during a recent predefined period of time (e.g., the last three days, the last week, the last month) is greater than a threshold number of sales of the item. In one example, the threshold number of sales is established as a predetermined percentage of a number of sales of the item during a predefined period of time that precedes the recent predefined period of time (e.g., the three days before the last three days, the week before the last week, the month before the last month). - Similar to the discussion above, the predetermined percentage associated with a number of sales is typically more than one hundred percent (e.g., 120%, 150%, 200%, etc.) so that the
e-commerce system 102 and/or the wearable device 108 can determine when a number of sales dramatically increases from one time period to the next. The e-commerce system 102 and/or the wearable device 108 can define different predetermined percentages for different categories of items, and use the appropriate predetermined percentage to determine when sales of an item have greatly increased. Accordingly, the inventory value notification tool 124 can display the estimated current value 118 of the item 112 in the user's view of the real-world environment based on an increased number of sales of the item.
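- The sales-count comparison described above reduces to a short check, sketched here with a 150% default drawn from the example percentages; the function name and argument names are illustrative.

```python
def sales_spike_detected(recent_sales_count, preceding_sales_count,
                         percentage=150):
    """Return True when sales during the recent period exceed a threshold
    number of sales derived from the preceding period of the same length.

    The 150% default is one of the example percentages; a per-category
    percentage could be substituted here.
    """
    threshold_sales = preceding_sales_count * percentage / 100
    return recent_sales_count > threshold_sales


# 40 sales last week versus 20 sales the week before: 40 > 20 * 150%, so the
# estimated current value would be displayed.
print(sales_spike_detected(recent_sales_count=40, preceding_sales_count=20))
```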
- In some embodiments, the estimated current value 118 of an item 112 can be determined based on item characteristics shared between different items. That is, the item 112 may not be exactly the same as the one or more items used to determine the estimated current value 118, but they may be similar in that they share one or more item characteristics that contribute to an increase in value. For example, a shared item characteristic can include a particular manufacturer or producer that is going out of business and, thus, is not making items anymore. Consequently, different types of items manufactured or produced by the same company may increase in value. - In another example, a shared item characteristic can include an authenticated signature of a celebrity (e.g., an estimated current price for a player's signed football helmet can be determined based on a recent sales price of the player's signed jersey). Therefore, the technologies described herein can recognize characteristics of a particular item owned by the user and determine an estimated current value for the particular item based on sales of the same item and/or sales of similar items, where an item is similar if it contains a shared item characteristic that contributes to an increase in value.
- In additional embodiments, the
wearable device 108 can be configured to determine and/or track use characteristics of an item 112 and store the use characteristics as item use information 132 in the item inventory 114. For example, the wearable device 108 can determine a total number of uses of an item 112, a frequency of use of an item 112, etc. The use of an item may be determined based on whether the item has been moved from one location in an environment to another location in the same environment or in a different environment. The total number of uses of an item 112 and/or the frequency of use of the item 112 may be indicative of whether the user is still interested in owning the item 112. For example, if the user never or rarely uses the item 112 and the item 112 is not a collectible item, then the user likely has little interest in owning the item 112. Consequently, the user is likely more willing to part with, or sell, the item. It follows that the determination and display of the estimated current value 118 can be based on a determination that the item 112 has not been used in a predefined period of time (e.g., the last month, the last three months, the last year). Alternatively, the determination and display of the estimated current value 118 can be based on a determination that the item 112 has been used a number of times in a predefined period of time (e.g., once in a year), the number of times being less than a threshold number of times established for infrequent use (e.g., five times in a year).
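- A minimal sketch of this infrequent-use check, assuming hypothetical use timestamps recorded by the device, is given below; the one-year window and five-use threshold simply mirror the examples in the text.

```python
from datetime import datetime, timedelta


def should_value_item(use_timestamps, now=None, lookback_days=365,
                      infrequent_use_threshold=5):
    """Decide whether to determine and display an estimated current value
    based on how rarely the item has been used (e.g., moved) recently.

    use_timestamps: datetimes at which the wearable device observed the item
    being used; the window length and threshold count are illustrative.
    """
    now = now or datetime.now()
    window_start = now - timedelta(days=lookback_days)
    recent_uses = [t for t in use_timestamps if t >= window_start]
    # No uses in the period, or fewer uses than the infrequent-use threshold,
    # suggests the user may be willing to sell the item.
    return len(recent_uses) < infrequent_use_threshold


# One observed use in the past year is below the five-use threshold.
print(should_value_item([datetime(2018, 1, 10)], now=datetime(2018, 11, 13)))
```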
- In some examples, the threshold value 126 described above can be established based on use characteristics (e.g., use characteristics can be part of the contextual data). For instance, if a user never or rarely uses an item, then the threshold value 126 can be adjusted lower. If the user often uses the item (e.g., the user enjoys the item), then the threshold value 126 can be adjusted higher. In some embodiments, the threshold value 126 can be based on a frequency of use of an item, e.g., a frequency with which an item is moved from one position to another over the course of a time period. In some embodiments, the threshold value 126 can be increased or decreased based on contextual data indicating a frequency of use or a frequency of movement of an item. For instance, an increased frequency of use can decrease the threshold value 126. Alternatively, a decreased frequency of use can increase the threshold value 126. In some embodiments, a decreased frequency of use can decrease the threshold value 126. In other embodiments, an increased frequency of use can increase the threshold value 126. -
FIG. 1 further illustrates that the wearable device 108 can include an item listing tool 134 (e.g., a software component or module). The item listing tool 134 implements functionality that enables the user 106 to create a listing for the item 112 in the item catalog 104 (e.g., on an e-commerce site). For example, the item listing tool 134 can configure a control that enables a user 106 of the wearable device 108 to provide input instructing the wearable device 108 to create the listing for the item. The control can be a displayed graphical element configured to receive user input (e.g., a user reaches out to virtually touch a selectable menu option), or the control can be configured to receive an audible command as the user input. - Based on the user input, the
item listing tool 134 can locate the item 112, focus on the item 112, and capture a photo of the item using a camera of the wearable device 108. The item listing tool 134 can then submit the listing for the item 112, along with the photo, to the e-commerce system 102 and/or the item catalog 104. In some examples, the item listing tool can use the previously obtained image data 110 to locate the item 112, and submit the image data corresponding to the item 112 along with the listing for the item. -
FIG. 2 illustrates example marketplaces of the item catalog 104 to which the listing 200 for the item 112 can be submitted. In FIG. 2, the item catalog 104 includes a primary marketplace 202 that includes full item listings 204 and a secondary marketplace 206 that includes partial item listings 208. - In various embodiments, a partial item listing 208 in the
secondary marketplace 206 is a limited listing that may only include a photo of an item 210, a seller's identification along with a piece of contact information 212 (e.g., an email, a phone number, etc.), and/or a suggested sale price 214. In contrast, to place a full item listing 204 in the primary marketplace 202, the e-commerce system 102 may require a seller to submit much more information, such as an item title, an item category, an item description, item specifications, a price, multiple images, etc., before the full listing is posted in an item catalog. Consequently, using the techniques described herein, a user can look at an item he or she owns, see an estimated current value, and provide input for the wearable device to submit a listing with a limited amount of information to a marketplace. This can all be done in an efficient manner without interrupting the user's activities.
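- The difference between the two listing types can be sketched with two simple data structures; all field names here are illustrative assumptions, not fields defined by the disclosure or by any particular marketplace.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class PartialItemListing:
    """Limited listing for the secondary marketplace: a photo, the seller's
    identification plus a piece of contact information, and a suggested sale
    price."""
    photo: str
    seller_id: str
    contact_info: str
    suggested_price: Optional[float] = None


@dataclass
class FullItemListing:
    """Fuller listing the primary marketplace may require before posting."""
    title: str
    category: str
    description: str
    specifications: dict
    price: float
    images: List[str] = field(default_factory=list)


# A wearable device could assemble the partial listing from data it already
# holds, without interrupting the user's current activity.
print(PartialItemListing("art_306.jpg", "user_302", "user@example.com", 599.0))
```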
- An example of a wearable device 108 can include an augmented reality (“AR”) device. An AR device is a computing device capable of providing a view of the real-world environment within which physical objects are augmented or supplemented by computer-generated (“CG”) sensory input (e.g., sound, video, graphics, etc.). For instance, an AR device might provide a view of the real-world environment with a rendering of virtual content as an overlay. Additional details regarding the configuration and operation of a wearable device 108 capable of providing this functionality are provided below with regard to FIG. 7. In this regard, it is to be appreciated that the virtual content can be displayed in an AR environment, as well as in other types of environments, such as mixed reality (“MR”) environments or virtual reality (“VR”) environments. It is also to be appreciated that the configurations disclosed herein are not limited to use with an AR device. Rather, the technologies disclosed herein can be utilized with any type of computing device that can provide a view of a real-world environment. - It is to be further appreciated that the technologies described herein can be implemented on a variety of different types of
wearable devices 108 configured with a variety of different operating systems, hardware components, and/or installed applications. In various configurations, for example, the wearable device 108 can be implemented by the following example wearable devices: GOOGLE GLASS, MAGIC LEAP ONE, MICROSOFT HOLOLENS, META 2, SONY SMART EYEGLASS, HTC VIVE, OCULUS GO, PLAYSTATION VR, or WINDOWS mixed reality headsets. Thus, embodiments of the present disclosure can be implemented in any AR-capable device, as distinguished from goggles or glasses that obstruct a user's view of real-world objects, i.e., actual reality. The techniques described herein can be device and/or operating system agnostic. -
FIG. 3 illustrates an example 300 where a user 302 (e.g., user 106) is looking at a portion of a real-world environment (e.g., a living room) while wearing a wearable device 304 (e.g., wearable device 108) configured to recognize existing items, add the existing items to an item inventory, determine estimated current values for the existing items, and/or display the estimated current values in association with the existing items. The view into the living room provided via the wearable device 304 comprises a real-world view from the perspective of the user 302. As shown, the living room includes a piece of art 306 on a wall and a record player 308 sitting on a shelf. - As described above, the
wearable device 304 can recognize the piece of art 306 and the record player 308 and add the items to an inventory of items owned and/or possessed by the user 302 (e.g., a home inventory of items, a living room inventory of items, etc.). Furthermore, when the items are in the view of the user 302, as illustrated, the wearable device 304 can access an item catalog to determine estimated current values for the same or similar items. In some examples, the wearable device 304 compares the estimated current values to threshold values, and if the estimated current values are greater than the threshold values, the wearable device 304 displays the estimated current values close to the items. As shown, the piece of art 306 on the wall has an estimated current value of $599 and the record player 308 sitting on the shelf has an estimated current value of $199. - The displayed values may aid the
user 302 in making a decision to sell an item. In some examples, the wearable device 304 can configure a control that receives input (e.g., an instruction) from the user to submit an item listing to an item catalog 104. FIG. 4 illustrates an example 400 where the user 302 provides an audible command 402 to “list the piece of art in the secondary marketplace”. In response, the wearable device 304 is configured to capture a photo of the piece of art 306 and display a prompt 404 for the user 302 to confirm the listing of the art 306 in the secondary marketplace 206. Accordingly, the wearable device 304, via a user account, can create a partial item listing that includes the photo of the art 306, an identification of the user, contact information of the user, and/or the estimated current value, and submit the partial item listing to the secondary marketplace 206. - To implement some of the described techniques on the
wearable device 108, a user may be required to enable a feature and/or enter a particular operation mode. For example, the user 106 may need to provide permission and/or authorization for the wearable device 108 to implement the described techniques. -
FIGS. 5 and 6 are flow diagrams that each illustrate an example process describing aspects of the technologies presented herein with reference to FIGS. 1-4. A process is illustrated as a collection of blocks in a logical flow graph, which represent a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions that, when executed by one or more processors, perform the recited operations. - The particular implementation of the technologies disclosed herein is a matter of choice dependent on the performance and other requirements of a computing device such as a wearable device. Accordingly, the logical operations described herein may be referred to variously as states, operations, structural devices, acts, or modules. These states, operations, structural devices, acts, and modules can be implemented in hardware, software (i.e., computer-executable instructions), firmware, special-purpose digital logic, or any combination thereof. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform or implement particular functions. It should be appreciated that more or fewer operations can be performed than shown in the figures and described herein. These operations can also be performed in a different order than those described herein. Other processes described throughout this disclosure shall be interpreted accordingly.
-
FIG. 5 is a flow diagram that illustrates an example process 500 describing aspects of the technologies disclosed herein for displaying an estimated current value of an item in a real-world environment via a wearable device. - The
process 500 begins at block 502, where a real-world environment is scanned by a wearable device to obtain image data. The process 500 proceeds to block 504, where the wearable device uses the image data to recognize an item that exists in the real-world environment. At block 506, the item is added to an item inventory associated with the real-world environment. At block 508, the wearable device accesses an item catalog to determine an estimated current value for the item. - The
process 500 proceeds to block 510, where it is determined that the estimated current value is more than a threshold value. At block 512, the estimated current value is displayed in association with the item via a display device of the wearable device.
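- A compact sketch of blocks 502 through 512 is shown below; the wearable-device, catalog, and inventory objects and their method names are assumed interfaces used only to make the control flow concrete, not components defined by the disclosure.

```python
def process_500(wearable_device, item_catalog, item_inventory, threshold_value):
    """Scan, recognize, inventory, value, compare, and display (blocks 502-512)."""
    image_data = wearable_device.scan_environment()        # block 502
    item = wearable_device.recognize_item(image_data)      # block 504
    item_inventory.add(item)                               # block 506
    value = item_catalog.estimated_current_value(item)     # block 508
    if value is not None and value > threshold_value:      # block 510
        wearable_device.display_value(item, value)         # block 512
    return value
```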
- FIG. 6 is a flow diagram that illustrates an example process 600 describing aspects of the technologies disclosed herein for listing an item for sale via an item catalog (e.g., an e-commerce site). - The
process 600 begins at block 602, where user input that instructs the wearable device to list the item for sale via an item catalog is received. As described above, the instruction can specify that the listing be submitted to the secondary marketplace so the user does not have to interrupt a current activity to complete a full item listing. - The process proceeds to block 604, where the wearable device captures a photo of the item. At
block 606, the wearable device submits the listing for the item to the item catalog. The listing can include the photo of the item, the seller's identification and a piece of contact information, and/or the estimated current value as a suggested sales price.
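- The same flow can be sketched as follows, reusing the limited-listing fields described earlier; as with the previous sketch, the object interfaces and the "secondary" marketplace label are assumptions made for illustration.

```python
def process_600(wearable_device, item_catalog, item, seller):
    """Receive the listing instruction, capture a photo, and submit a limited
    listing (blocks 602-606)."""
    command = wearable_device.await_user_input()            # block 602
    if command != "list_item":
        return None
    photo = wearable_device.capture_photo(item)             # block 604
    listing = {                                             # block 606
        "photo": photo,
        "seller_id": seller.identifier,
        "contact": seller.contact_info,
        "suggested_price": item.estimated_current_value,
    }
    return item_catalog.submit_listing(listing, marketplace="secondary")
```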
- FIG. 7 shows an illustrative configuration of a wearable device 700 (e.g., a headset system, a head-mounted display, etc.) capable of implementing aspects of the technologies disclosed herein. The wearable device 700 includes an optical system 702 with an illumination engine 704 to generate electro-magnetic (“EM”) radiation that includes both a first bandwidth for generating computer-generated (“CG”) images and a second bandwidth for tracking physical objects. The first bandwidth may include some or all of the visible-light portion of the EM spectrum, whereas the second bandwidth may include any portion of the EM spectrum that is suitable to deploy a desired tracking protocol. - In the example configuration, the
optical system 702 further includes an optical assembly 706 that is positioned to receive the EM radiation from the illumination engine 704 and to direct the EM radiation (or individual bandwidths thereof) along one or more predetermined optical paths. For example, the illumination engine 704 may emit the EM radiation into the optical assembly 706 along a common optical path that is shared by both the first bandwidth and the second bandwidth. The optical assembly 706 may also include one or more optical components that are configured to separate the first bandwidth from the second bandwidth (e.g., by causing the first and second bandwidths to propagate along different image-generation and object-tracking optical paths, respectively). - The
optical assembly 706 includes components that are configured to direct the EM radiation with respect to one or more components of the optical assembly 706 and, more specifically, to direct the first bandwidth for image-generation purposes and to direct the second bandwidth for object-tracking purposes. In this example, the optical system 702 further includes a sensor 708 to generate object data in response to a reflected portion of the second bandwidth, i.e., a portion of the second bandwidth that is reflected off an object that exists within a real-world environment. - In various configurations, the
wearable device 700 may utilize the optical system 702 to generate a composite view (e.g., from a perspective of a user 106 that is wearing the wearable device 700) that includes both one or more CG images and a view of at least a portion of the real-world environment that includes the object. For example, the optical system 702 may utilize various technologies such as, for example, AR technologies to generate composite views that include CG images superimposed over a real-world view. As such, the optical system 702 may be configured to generate CG images via a display panel. The display panel can include separate right eye and left eye transparent display panels. - Alternatively, the display panel can include a single transparent display panel that is viewable with both eyes and/or a single transparent display panel that is viewable by a single eye only. Therefore, it can be appreciated that the technologies described herein may be deployed within a single-eye Near Eye Display (“NED”) system (e.g., GOOGLE GLASS) and/or a dual-eye NED system (e.g., OCULUS RIFT). The
wearable device 700 is an example device that is used to provide context and illustrate various features and aspects of the user interface display technologies and systems disclosed herein. Other devices and systems may also use the interface display technologies and systems disclosed herein. - The display panel may be a waveguide display that includes one or more diffractive optical elements (“DOEs”) for in-coupling incident light into the waveguide, expanding the incident light in one or more directions for exit pupil expansion, and/or out-coupling the incident light out of the waveguide (e.g., toward a user's eye). In some examples, the wearable device 700 may further include an additional see-through optical component.
- In the illustrated example of
FIG. 7, a controller 710 is operatively coupled to each of the illumination engine 704, the optical assembly 706 (and/or scanning devices thereof), and the sensor 708. The controller 710 includes one or more logic devices and one or more computer memory devices storing instructions executable by the logic device(s) to deploy functionalities described herein with relation to the optical system 702. The controller 710 can comprise one or more processing units 712 and one or more computer-readable media 714 for storing an operating system 716 and data such as, for example, image data that defines one or more CG images and/or tracking data that defines one or more object tracking protocols. - The computer-
readable media 714 may further include an image-generation engine 718 that generates output signals to modulate generation of the first bandwidth of EM radiation by the illumination engine 704 and also to control the scanner(s) to direct the first bandwidth within the optical assembly 706. Ultimately, the scanner(s) direct the first bandwidth through a display panel to generate CG images that are perceptible to a user, such as a user interface. - The computer-
readable media 714 may further include an object-tracking engine 720 that generates output signals to modulate generation of the second bandwidth of EM radiation by the illumination engine 704 and also the scanner(s) to direct the second bandwidth along an object-tracking optical path to irradiate an object. The object-tracking engine 720 communicates with the sensor 708 to receive the object data that is generated based on the reflected portion of the second bandwidth. - The
object-tracking engine 720 then analyzes the object data to determine one or more characteristics of the object such as, for example, a depth of the object with respect to the optical system 702, an orientation of the object with respect to the optical system 702, a velocity and/or acceleration of the object with respect to the optical system 702, or any other desired characteristic of the object. The components of the wearable device 700 are operatively connected, for example, via a bus 722, which can include one or more of a system bus, a data bus, an address bus, a PCI bus, a Mini-PCI bus, and any variety of local, peripheral, and/or independent buses. - The
wearable device 700 may further include various other components, for example cameras (e.g., camera 724), microphones (e.g., microphone 726), accelerometers, gyroscopes, magnetometers, temperature sensors, touch sensors, biometric sensors, other image sensors, energy-storage components (e.g., a battery), a communication facility, a GPS receiver, etc. Furthermore, the wearable device 700 can include one or more eye gaze sensors 728. In at least one example, an eye gaze sensor 728 is user facing and is configured to track the position of at least one eye of a user. Accordingly, eye position data (e.g., determined via use of the eye gaze sensor 728), image data (e.g., determined via use of the camera 724), and other data can be processed to identify a gaze path of the user. That is, it can be determined that the user is looking at a particular section of a hardware display surface, a particular real-world object or part of a real-world object in the view of the user, and/or a rendered object or part of a rendered object displayed on a hardware display surface. - In some configurations, the
wearable device 700 can include an actuator 729. The processing units 712 can cause the generation of a haptic signal associated with a generated haptic effect to be sent to the actuator 729, which in turn outputs haptic effects such as vibrotactile haptic effects, electrostatic friction haptic effects, or deformation haptic effects. The actuator 729 includes an actuator drive circuit. The actuator 729 may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), a linear resonant actuator (“LRA”), a piezoelectric actuator, a high bandwidth actuator, an electroactive polymer (“EAP”) actuator, an electrostatic friction display, or an ultrasonic vibration generator. - In alternate configurations,
wearable device 700 can include one or more additional actuators 729. The actuator 729 is an example of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, electrostatic friction haptic effects, or deformation haptic effects, in response to a drive signal. In alternate configurations, the actuator 729 can be replaced by some other type of haptic output device. Further, in other alternate configurations, the wearable device 700 may not include the actuator 729, and a separate device from the wearable device 700 includes an actuator, or other haptic output device, that generates the haptic effects, and the wearable device 700 sends generated haptic signals to that device through a communication device. - The processing unit(s) 712 can represent, for example, a CPU-type processing unit, a GPU-type processing unit, a field-programmable gate array (“FPGA”), another class of digital signal processor (“DSP”), or other hardware logic components that may, in some instances, be driven by a CPU. For example, and without limitation, illustrative types of hardware logic components that can be used include Application-Specific Integrated Circuits (“ASICs”), Application-Specific Standard Products (“ASSPs”), System-on-a-Chip Systems (“SOCs”), Complex Programmable Logic Devices (“CPLDs”), etc.
- As used herein, computer-readable media, such as computer-
readable media 714, can store instructions executable by the processing unit(s) 712. Computer-readable media can also store instructions executable by external processing units such as by an external CPU, an external GPU, and/or executable by an external accelerator, such as an FPGA type accelerator, a DSP type accelerator, or any other internal or external accelerator. In various examples, at least one CPU, GPU, and/or accelerator is incorporated in a computing device, while in some examples one or more of a CPU, GPU, and/or accelerator is external to a computing device. - In various examples, the
wearable device 700 is configured to interact, via network communications, with a network device (e.g., a network server or a cloud server) to implement the configurations described herein. For instance, the wearable device 700 may collect data and send the data over network(s) to the network device. The network device may then implement some of the functionality described herein. Subsequently, the network device can cause the wearable device 700 to display an item and/or instruct the wearable device 700 to perform a task. - Computer-readable media can include computer storage media and/or communication media. Computer storage media can include one or more of volatile memory, nonvolatile memory, and/or other persistent and/or auxiliary computer storage media, removable and non-removable computer storage media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Thus, computer storage media includes tangible and/or physical forms of media included in a device and/or hardware component that is part of a device or external to a device, including but not limited to random access memory (“RAM”), static random-access memory (“SRAM”), dynamic random-access memory (“DRAM”), phase change memory (“PCM”), read-only memory (“ROM”), erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), flash memory, rotating media, optical cards or other optical storage media, magnetic storage, magnetic cards or other magnetic storage devices or media, solid-state memory devices, storage arrays, network attached storage, storage area networks, hosted computer storage or any other storage memory, storage device, and/or storage medium that can be used to store and maintain information for access by a computing device.
- In contrast to computer storage media, communication media can embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transmission mechanism. As defined herein, computer storage media does not include communication media. That is, computer storage media does not include communications media consisting solely of a modulated data signal, a carrier wave, or a propagated signal, per se.
- In accordance with examples described herein, the
wearable device 108 can also be configured to use network communications to interact with an e-commerce provider of an electronic marketplace. To implement the electronic marketplace, the e-commerce provider creates and maintains catalog(s) of items. The items can be bought and/or sold by registered users and/or merchants. Accordingly, the e-commerce provider can comprise resources to collect and store information related to an item, to display the information related to the item to a potential buyer, to conduct online auctions of an item, to match a buyer of an item with a seller of the item, to process a transaction, etc. -
FIG. 8 shows additional details of an example computer architecture for a computer capable of executing the functionalities described herein such as, for example, those described with reference to FIGS. 1-7, or any program components thereof as described herein. Thus, the computer architecture 800 illustrated in FIG. 8 represents an architecture for a server computer, a network of server computers, or any other type of computing device suitable for implementing the functionality described herein. The computer architecture 800 may be utilized to execute any aspects of the software components presented herein, such as software components for implementing the e-commerce system 102. - The
computer architecture 800 illustrated in FIG. 8 includes a central processing unit 802 (“CPU”), a system memory 804, including a random-access memory 806 (“RAM”) and a read-only memory (“ROM”) 808, and a system bus 810 that couples the memory 804 to the CPU 802. A basic input/output system containing the basic routines that help to transfer information between elements within the computer architecture 800, such as during startup, is stored in the ROM 808. The computer architecture 800 further includes a mass storage device 812 for storing an operating system 814, other data, and one or more application programs. For example, the mass storage device 812 may store the item catalog 104 and/or the user's item inventory 114. - The
mass storage device 812 is connected to the CPU 802 through a mass storage controller (not shown) connected to the bus 810. The mass storage device 812 and its associated computer-readable media provide non-volatile storage for the computer architecture 800. Although the description of computer-readable media contained herein refers to a mass storage device, such as a solid-state drive, a hard disk or CD-ROM drive, it should be appreciated by those skilled in the art that computer-readable media can be any available computer storage media or communication media that can be accessed by the computer architecture 800. - According to various implementations, the
computer architecture 800 may operate in a networked environment using logical connections to remote computers through a network 850. The computer architecture 800 may connect to the network 850 through a network interface unit 816 connected to the bus 810. It should be appreciated that the network interface unit 816 also may be utilized to connect to other types of networks and remote computer systems. The computer architecture 800 also may include an input/output controller 818 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus. Similarly, the input/output controller 818 may provide output to a display screen, a printer, or other type of output device. It should also be appreciated that a computing system can be implemented using the disclosed computer architecture 800 to communicate with other computing systems. - It should be appreciated that the software components described herein may, when loaded into the
CPU 802 and executed, transform the CPU 802 and the overall computer architecture 800 from a general-purpose computing system into a special-purpose computing system customized to facilitate the functionality presented herein. The CPU 802 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 802 may operate as a finite-state machine, in response to executable instructions contained within the software modules disclosed herein. These computer-executable instructions may transform the CPU 802 by specifying how the CPU 802 transitions between states, thereby transforming the transistors or other discrete hardware elements constituting the CPU 802. - Encoding the software modules presented herein also may transform the physical structure of the computer-readable media presented herein. The specific transformation of physical structure may depend on various factors, in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the computer-readable media, whether the computer-readable media is characterized as primary or secondary storage, and the like. For example, if the computer-readable media is implemented as semiconductor-based memory, the software disclosed herein may be encoded on the computer-readable media by transforming the physical state of the semiconductor memory. For example, the software may transform the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory. The software also may transform the physical state of such components in order to store data thereupon.
- As another example, the computer-readable media disclosed herein may be implemented using magnetic or optical technology. In such implementations, the software presented herein may transform the physical state of magnetic or optical media, when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. These transformations also may include altering the physical features or characteristics of particular locations within given optical media, to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.
- In light of the above, it should be appreciated that many types of physical transformations take place in the
computer architecture 800 in order to store and execute the software components presented herein. It also should be appreciated that the computer architecture 800 may include other types of computing devices, including smartphones, embedded computer systems, tablet computers, other types of wearable computing devices, and other types of computing devices known to those skilled in the art. It is also contemplated that the computer architecture 800 may not include all of the components shown in FIG. 8, may include other components that are not explicitly shown in FIG. 8, or may utilize an architecture completely different than that shown in FIG. 8. - The following clauses describe multiple possible configurations for implementing the features described in this disclosure. The various configurations described herein are not limiting, nor is every feature from any given configuration required to be present in another configuration. Any two or more of the configurations may be combined together unless the context clearly indicates otherwise. As used herein in this document, “or” means and/or. For example, “A or B” means A without B, B without A, or A and B. As used herein, “comprising” means including all of the listed features and potentially including the addition of other features that are not listed. “Consisting essentially of” means including the listed features and those additional features that do not materially affect the basic and novel characteristics of the listed features. “Consisting of” means only the listed features to the exclusion of any feature not listed.
- The disclosure presented herein also encompasses the subject matter set forth in the following example clauses.
- Example Clause A, a method comprising: obtaining, by a wearable device, image data from an optical device configured to scan a real-world environment; recognizing, based on the image data, an item that exists in the real-world environment; adding the item to an inventory of items associated with the real-world environment; accessing, by one or more processors, an item catalog to determine an estimated current value for the item; determining that the estimated current value is more than a threshold value; in response to determining that the estimated current value is more than the threshold value, displaying the estimated current value in association with the item on a display device of the wearable device; configuring a control that enables a user of the wearable device to list the item for sale in the item catalog; receiving user input that activates the control; in response to receiving the user input that activates the control: capturing a photo of the item; and causing a listing for the item to be submitted to the item catalog, the listing including the photo and contact information for the user of the wearable device.
- Example Clause B, the method of Example Clause A, further comprising calculating the threshold value as a predetermined percentage of an average sales price of the item over a predefined period of time.
- Example Clause C, the method of Example Clause B, wherein the predetermined percentage is defined by an electronic commerce system that maintains the item catalog.
- Example Clause D, the method of Example Clause B, wherein the predetermined percentage is defined by the user of the wearable device.
- Example Clause E, the method of Example Clause B, wherein the predetermined percentage is defined for a category of items to which the item belongs.
- Example Clause F, the method of any one of Example Clauses A through E, wherein accessing the item catalog to determine the estimated current value is based on a determination that the item has not been used in a predefined period of time.
- Example Clause G, the method of any one of Example Clauses A through E, wherein accessing the item catalog to determine the estimated current value is based on a determination that the item has been used a number of times in a predefined period of time, the number of times being less than a threshold number of times established for infrequent use.
- Example Clause H, a device comprising: one or more processors; and a memory in communication with the one or more processors, the memory having computer-readable instructions stored thereupon which, when executed by the one or more processors, cause the one or more processors to: scan a real-world environment to obtain image data; recognize, based on the image data, an item that exists in the real-world environment; add the item to an inventory of items associated with the real-world environment; access an item catalog to determine an estimated current value for the item; determine that the estimated current value is more than a threshold value; and in response to determining that the estimated current value is more than the threshold value, display the estimated current value in association with the item on a display device.
- Example Clause I, the device of Example Clause H, wherein the computer-readable instructions further cause the one or more processors to calculate the threshold value as a predetermined percentage of an average sales price of the item over a predefined period of time.
- Example Clause J, the device of Example Clause I, wherein the predetermined percentage is defined by an electronic commerce system that maintains the item catalog.
- Example Clause K, the device of Example Clause I, wherein the predetermined percentage is defined by the user of the wearable device.
- Example Clause L, the device of Example Clause I, wherein the predetermined percentage is defined for a category of items to which the item belongs.
- Example Clause M, the device of any one of Example Clauses H through L, wherein accessing the item catalog to determine the estimated current value is based on a determination that the item has not been used in a predefined period of time.
- Example Clause N, the device of any one of Example Clauses H through M, wherein the threshold value is adjusted based on contextual data, the contextual data comprising one or more of a frequency of use of the item, a condition of the item, or a sales history for a user of the device.
- Example Clause O, the device of Example Clause H, wherein the threshold value is defined by a user setting of the device.
- Example Clause P, the device of any one of Example Clauses H through O, wherein the computer-readable instructions further cause the one or more processors to: configure a control that enables a user to list the item for sale in the item catalog; receive user input that activates the control; and in response to receiving the user input that activates the control cause a listing for the item to be submitted to the item catalog, the listing including a portion of the image data that captures the item and contact information for the user.
- Example Clause Q, a method comprising: recognizing, based on image data obtained by a wearable device, an item that exists in the real-world environment; adding the item to an inventory of items associated with the real-world environment; accessing, by one or more processors, information associated with an item catalog to determine that a number of sales of the item during a recent predefined period of time is greater than a threshold number of sales; based on the number of sales of the item during the recent predefined period of time being greater than the threshold number of sales, determining an estimated current value for the item; and displaying the estimated current value in association with the item on a display device of the wearable device.
- Example Clause R, the method of Example Clause Q, wherein the threshold number of sales is established as a predetermined percentage of a number of sales of the item during a predefined period of time that precedes the recent predefined period of time.
- Example Clause S, the method of Example Clause Q or Example Clause R, further comprising: receiving user input that instructs the wearable device to list the item for sale in the catalog of items; in response to receiving the user input: capturing a photo of the item; and causing a listing for the item to be submitted to the catalog of items, the listing including the photo and contact information for the user.
- Example Clause T, the method of any one of Example Clause Q through S, further comprising receiving user input to enter an operation mode that enables the estimated current value to be displayed.
- For ease of understanding, the processes discussed in this disclosure are delineated as separate operations represented as independent blocks. However, these separately delineated operations should not be construed as necessarily order dependent in their performance. The order in which the process is described is not intended to be construed as a limitation, and any number of the described process blocks may be combined in any order to implement the process or an alternate process. Moreover, it is also possible that one or more of the provided operations is modified or omitted.
- Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts are disclosed as example forms of implementing the claims.
- The terms “a,” “an,” “the” and similar referents used in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural unless otherwise indicated herein or clearly contradicted by context. The terms “based on,” “based upon,” and similar referents are to be construed as meaning “based at least in part” which includes being “based in part” and “based in whole” unless otherwise indicated or clearly contradicted by context.
- It should be appreciated that any reference to “first,” “second,” etc. users or other elements within the Summary and/or Detailed Description is not intended to and should not be construed to necessarily correspond to any reference of “first,” “second,” etc. elements of the claims. Rather, any use of “first” and “second” within the Summary and/or Detailed Description may be used to distinguish between two different instances of the same element (e.g., two different users, two different items, etc.).
- Certain configurations are described herein, including the best mode known to the inventors for carrying out the invention. Of course, variations on these described configurations will become apparent to those of ordinary skill in the art upon reading the foregoing description. Skilled artisans will know how to employ such variations as appropriate, and the configurations disclosed herein may be practiced otherwise than specifically described. Accordingly, all modifications and equivalents of the subject matter recited in the claims appended hereto are included within the scope of this disclosure. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the invention unless otherwise indicated herein or otherwise clearly contradicted by context.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/189,776 US20190156377A1 (en) | 2017-11-17 | 2018-11-13 | Rendering virtual content based on items recognized in a real-world environment |
PCT/US2018/061145 WO2019099585A1 (en) | 2017-11-17 | 2018-11-14 | Rendering virtual content based on items recognized in a real-world environment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762588189P | 2017-11-17 | 2017-11-17 | |
US16/189,776 US20190156377A1 (en) | 2017-11-17 | 2018-11-13 | Rendering virtual content based on items recognized in a real-world environment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190156377A1 true US20190156377A1 (en) | 2019-05-23 |
Family
ID=66532446
Family Applications (8)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/189,849 Abandoned US20190156410A1 (en) | 2017-11-17 | 2018-11-13 | Systems and methods for translating user signals into a virtual environment having a visually perceptible competitive landscape |
US16/189,674 Active 2039-03-12 US11080780B2 (en) | 2017-11-17 | 2018-11-13 | Method, system and computer-readable media for rendering of three-dimensional model data based on characteristics of objects in a real-world environment |
US16/189,776 Abandoned US20190156377A1 (en) | 2017-11-17 | 2018-11-13 | Rendering virtual content based on items recognized in a real-world environment |
US16/189,720 Active US10891685B2 (en) | 2017-11-17 | 2018-11-13 | Efficient rendering of 3D models using model placement metadata |
US16/189,817 Active 2039-01-13 US11556980B2 (en) | 2017-11-17 | 2018-11-13 | Method, system, and computer-readable storage media for rendering of object data based on recognition and/or location matching |
US17/102,283 Active US11200617B2 (en) | 2017-11-17 | 2020-11-23 | Efficient rendering of 3D models using model placement metadata |
US17/358,615 Abandoned US20210319502A1 (en) | 2017-11-17 | 2021-06-25 | Method, System and Computer-Readable Media for Rendering of Three-Dimensional Model Data Based on Characteristics of Objects in A Real-World Environment |
US18/064,358 Pending US20230109329A1 (en) | 2017-11-17 | 2022-12-12 | Rendering of Object Data Based on Recognition and/or Location Matching |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/189,849 Abandoned US20190156410A1 (en) | 2017-11-17 | 2018-11-13 | Systems and methods for translating user signals into a virtual environment having a visually perceptible competitive landscape |
US16/189,674 Active 2039-03-12 US11080780B2 (en) | 2017-11-17 | 2018-11-13 | Method, system and computer-readable media for rendering of three-dimensional model data based on characteristics of objects in a real-world environment |
Family Applications After (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/189,720 Active US10891685B2 (en) | 2017-11-17 | 2018-11-13 | Efficient rendering of 3D models using model placement metadata |
US16/189,817 Active 2039-01-13 US11556980B2 (en) | 2017-11-17 | 2018-11-13 | Method, system, and computer-readable storage media for rendering of object data based on recognition and/or location matching |
US17/102,283 Active US11200617B2 (en) | 2017-11-17 | 2020-11-23 | Efficient rendering of 3D models using model placement metadata |
US17/358,615 Abandoned US20210319502A1 (en) | 2017-11-17 | 2021-06-25 | Method, System and Computer-Readable Media for Rendering of Three-Dimensional Model Data Based on Characteristics of Objects in A Real-World Environment |
US18/064,358 Pending US20230109329A1 (en) | 2017-11-17 | 2022-12-12 | Rendering of Object Data Based on Recognition and/or Location Matching |
Country Status (5)
Country | Link |
---|---|
US (8) | US20190156410A1 (en) |
EP (1) | EP3711011B1 (en) |
KR (1) | KR102447411B1 (en) |
CN (1) | CN111357029A (en) |
WO (5) | WO2019099591A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10891685B2 (en) | 2017-11-17 | 2021-01-12 | Ebay Inc. | Efficient rendering of 3D models using model placement metadata |
WO2024036510A1 (en) * | 2022-08-17 | 2024-02-22 | Mak Kwun Yiu | System and method for providing virtual premises |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11348147B2 (en) * | 2020-04-17 | 2022-05-31 | At&T Intellectual Property I, L.P. | Facilitation of value-based sorting of objects |
CN106547769B (en) | 2015-09-21 | 2020-06-02 | 阿里巴巴集团控股有限公司 | DOI display method and device |
US11200692B2 (en) * | 2017-08-07 | 2021-12-14 | Standard Cognition, Corp | Systems and methods to check-in shoppers in a cashier-less store |
KR102418992B1 (en) * | 2017-11-23 | 2022-07-11 | 삼성전자주식회사 | Electronic device and the Method for providing Augmented Reality Service thereof |
US10636253B2 (en) * | 2018-06-15 | 2020-04-28 | Max Lucas | Device to execute a mobile application to allow musicians to perform and compete against each other remotely |
US20200065879A1 (en) * | 2018-08-22 | 2020-02-27 | Midea Group Co., Ltd. | Methods and systems for home device recommendation |
JP6589038B1 (en) * | 2018-12-19 | 2019-10-09 | 株式会社メルカリ | Wearable terminal, information processing terminal, program, and product information display method |
US20220124294A1 (en) * | 2019-02-15 | 2022-04-21 | Xliminal, Inc. | System and method for interactively rendering and displaying 3d objects |
WO2020191101A1 (en) * | 2019-03-18 | 2020-09-24 | Geomagical Labs, Inc. | Virtual interaction with three-dimensional indoor room imagery |
WO2020235057A1 (en) * | 2019-05-22 | 2020-11-26 | 日本電気株式会社 | Model generation device, model generation system, model generation method, and non-transitory computer-readable medium |
US11470017B2 (en) * | 2019-07-30 | 2022-10-11 | At&T Intellectual Property I, L.P. | Immersive reality component management via a reduced competition core network component |
CN110717963B (en) * | 2019-08-30 | 2023-08-11 | 杭州群核信息技术有限公司 | Mixed rendering display method, system and storage medium of replaceable model based on WebGL |
US11341558B2 (en) * | 2019-11-21 | 2022-05-24 | Shopify Inc. | Systems and methods for recommending a product based on an image of a scene |
US11145117B2 (en) * | 2019-12-02 | 2021-10-12 | At&T Intellectual Property I, L.P. | System and method for preserving a configurable augmented reality experience |
EP3872770A1 (en) * | 2020-02-28 | 2021-09-01 | Inter Ikea Systems B.V. | A computer implemented method, a device and a computer program product for augmenting a first image with image data from a second image |
US11810595B2 (en) | 2020-04-16 | 2023-11-07 | At&T Intellectual Property I, L.P. | Identification of life events for virtual reality data and content collection |
US11568987B2 (en) | 2020-04-17 | 2023-01-31 | At&T Intellectual Property I, L.P. | Facilitation of conditional do not resuscitate orders |
CN111580670B (en) * | 2020-05-12 | 2023-06-30 | 黑龙江工程学院 | Garden landscape implementation method based on virtual reality |
US12094243B2 (en) * | 2020-05-19 | 2024-09-17 | Board Of Regents, The University Of Texas System | Method and apparatus for discreet person identification on pocket-size offline mobile platform with augmented reality feedback with real-time training capability for usage by universal users |
US20220108000A1 (en) * | 2020-10-05 | 2022-04-07 | Lenovo (Singapore) Pte. Ltd. | Permitting device use based on location recognized from camera input |
IT202100025055A1 (en) * | 2021-09-30 | 2023-03-30 | Geckoway S R L | Scanning system for virtualizing real objects and related method of use for the digital representation of such objects |
US20230114462A1 (en) * | 2021-10-13 | 2023-04-13 | Capital One Services, Llc | Selective presentation of an augmented reality element in an augmented reality user interface |
CN114513647B (en) * | 2022-01-04 | 2023-08-29 | 聚好看科技股份有限公司 | Method and device for transmitting data in three-dimensional virtual scene |
US20230418430A1 (en) * | 2022-06-24 | 2023-12-28 | Lowe's Companies, Inc. | Simulated environment for presenting virtual objects and virtual resets |
US20240221060A1 (en) * | 2023-01-03 | 2024-07-04 | International Business Machines Corporation | Selective augmented reality object replacement |
Family Cites Families (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5347306A (en) | 1993-12-17 | 1994-09-13 | Mitsubishi Electric Research Laboratories, Inc. | Animated electronic meeting place |
US7937312B1 (en) | 1995-04-26 | 2011-05-03 | Ebay Inc. | Facilitating electronic commerce transactions through binding offers |
US20030126068A1 (en) | 1999-11-18 | 2003-07-03 | Eric Hauk | Virtual trading floor system |
JP2003529851A (en) | 2000-03-31 | 2003-10-07 | プロジェクト・ビー,エルエルシー | Virtual standing room system |
US6813612B1 (en) | 2000-05-25 | 2004-11-02 | Nancy J. Rabenold | Remote bidding supplement for traditional live auctions |
AU2001271864A1 (en) | 2000-07-06 | 2002-01-21 | Raymond Melkomian | Virtual interactive global exchange |
KR20030064244A (en) | 2002-01-24 | 2003-07-31 | 고성민 | Auction method for real-time displaying bid ranking |
US8965460B1 (en) | 2004-01-30 | 2015-02-24 | Ip Holdings, Inc. | Image and augmented reality based networks using mobile devices and intelligent electronic glasses |
US7958047B2 (en) * | 2005-02-04 | 2011-06-07 | The Invention Science Fund I | Virtual credit in simulated environments |
US20080091692A1 (en) | 2006-06-09 | 2008-04-17 | Christopher Keith | Information collection in multi-participant online communities |
US20080147566A1 (en) * | 2006-12-18 | 2008-06-19 | Bellsouth Intellectual Property Corporation | Online auction analysis and recommendation tool |
US20080207329A1 (en) | 2007-02-20 | 2008-08-28 | Andrew Wallace | Method and system of enabling communication activities using bridge between real world and proprietary environments |
US8601386B2 (en) | 2007-04-20 | 2013-12-03 | Ingenio Llc | Methods and systems to facilitate real time communications in virtual reality |
US7945482B2 (en) * | 2007-08-23 | 2011-05-17 | Ebay Inc. | Viewing shopping information on a network-based social platform |
US9111285B2 (en) | 2007-08-27 | 2015-08-18 | Qurio Holdings, Inc. | System and method for representing content, user presence and interaction within virtual world advertising environments |
US8223156B2 (en) | 2008-09-26 | 2012-07-17 | International Business Machines Corporation | Time dependent virtual universe avatar rendering |
US20100125525A1 (en) | 2008-11-18 | 2010-05-20 | Inamdar Anil B | Price alteration through buyer affected aggregation of purchasers |
US20100191770A1 (en) | 2009-01-27 | 2010-07-29 | Apple Inc. | Systems and methods for providing a virtual fashion closet |
US8261158B2 (en) * | 2009-03-13 | 2012-09-04 | Fusion-Io, Inc. | Apparatus, system, and method for using multi-level cell solid-state storage as single level cell solid-state storage |
EP2259225A1 (en) | 2009-06-01 | 2010-12-08 | Alcatel Lucent | Automatic 3D object recommendation device in a personal physical environment |
US20110040645A1 (en) | 2009-08-14 | 2011-02-17 | Rabenold Nancy J | Virtual world integrated auction |
US20110072367A1 (en) | 2009-09-24 | 2011-03-24 | etape Partners, LLC | Three dimensional digitally rendered environments |
US20110270701A1 (en) | 2010-04-30 | 2011-11-03 | Benjamin Joseph Black | Displaying active recent bidders in a bidding fee auction |
US20110295722A1 (en) | 2010-06-09 | 2011-12-01 | Reisman Richard R | Methods, Apparatus, and Systems for Enabling Feedback-Dependent Transactions |
US8749557B2 (en) | 2010-06-11 | 2014-06-10 | Microsoft Corporation | Interacting with user interface via avatar |
US20120084169A1 (en) | 2010-09-30 | 2012-04-05 | Adair Aaron J | Online auction optionally including multiple sellers and multiple auctioneers |
US20120084170A1 (en) | 2010-09-30 | 2012-04-05 | Adair Aaron J | Cumulative point system and scoring of an event based on user participation in the event |
US20120084168A1 (en) | 2010-09-30 | 2012-04-05 | Adair Aaron J | Indication of the remaining duration of an event with a duration recoil feature |
US20120246036A1 (en) | 2011-03-22 | 2012-09-27 | Autonig, LLC | System, method and computer readable medium for conducting a vehicle auction, automatic vehicle condition assessment and automatic vehicle acquisition attractiveness determination |
US10008037B1 (en) | 2011-06-10 | 2018-06-26 | Amazon Technologies, Inc. | User/object interactions in an augmented reality environment |
US9921641B1 (en) | 2011-06-10 | 2018-03-20 | Amazon Technologies, Inc. | User/object interactions in an augmented reality environment |
US9996972B1 (en) | 2011-06-10 | 2018-06-12 | Amazon Technologies, Inc. | User/object interactions in an augmented reality environment |
EP2745236A1 (en) | 2011-08-18 | 2014-06-25 | Layar B.V. | Computer-vision based augmented reality system |
US9449342B2 (en) | 2011-10-27 | 2016-09-20 | Ebay Inc. | System and method for visualization of items in an environment using augmented reality |
US20130159110A1 (en) | 2011-12-14 | 2013-06-20 | Giridhar Rajaram | Targeting users of a social networking system based on interest intensity |
US10223710B2 (en) | 2013-01-04 | 2019-03-05 | Visa International Service Association | Wearable intelligent vision device apparatuses, methods and systems |
US20130257877A1 (en) | 2012-03-30 | 2013-10-03 | Videx, Inc. | Systems and Methods for Generating an Interactive Avatar Model |
US20130297460A1 (en) | 2012-05-01 | 2013-11-07 | Zambala Lllp | System and method for facilitating transactions of a physical product or real life service via an augmented reality environment |
US20130293530A1 (en) | 2012-05-04 | 2013-11-07 | Kathryn Stone Perez | Product augmentation and advertising in see through displays |
US20140040004A1 (en) | 2012-08-02 | 2014-02-06 | Google Inc. | Identifying a deal in shopping results |
US20140058812A1 (en) | 2012-08-17 | 2014-02-27 | Augme Technologies, Inc. | System and method for interactive mobile ads |
US9336541B2 (en) | 2012-09-21 | 2016-05-10 | Paypal, Inc. | Augmented reality product instructions, tutorials and visualizations |
US11397462B2 (en) | 2012-09-28 | 2022-07-26 | Sri International | Real-time human-machine collaboration using big data driven augmented reality technologies |
US9830632B2 (en) | 2012-10-10 | 2017-11-28 | Ebay Inc. | System and methods for personalization and enhancement of a marketplace |
US20140130076A1 (en) | 2012-11-05 | 2014-05-08 | Immersive Labs, Inc. | System and Method of Media Content Selection Using Adaptive Recommendation Engine |
US20140143081A1 (en) | 2012-11-16 | 2014-05-22 | Nextlot, Inc. | Interactive Online Auction System |
US20140164282A1 (en) * | 2012-12-10 | 2014-06-12 | Tibco Software Inc. | Enhanced augmented reality display for use by sales personnel |
US20140172570A1 (en) | 2012-12-14 | 2014-06-19 | Blaise Aguera y Arcas | Mobile and augmented-reality advertisements using device imaging |
US20140214547A1 (en) | 2013-01-25 | 2014-07-31 | R4 Technologies, Llc | Systems and methods for augmented retail reality |
US9870716B1 (en) * | 2013-01-26 | 2018-01-16 | Ip Holdings, Inc. | Smart glasses and smart watches for real time connectivity and health |
US20140279263A1 (en) | 2013-03-13 | 2014-09-18 | Truecar, Inc. | Systems and methods for providing product recommendations |
US20140267228A1 (en) | 2013-03-14 | 2014-09-18 | Microsoft Corporation | Mapping augmented reality experience to various environments |
US20140282220A1 (en) | 2013-03-14 | 2014-09-18 | Tim Wantland | Presenting object models in augmented reality images |
US9953365B2 (en) * | 2013-03-15 | 2018-04-24 | Ten-X, Llc | Virtual online auction forum |
US9286727B2 (en) | 2013-03-25 | 2016-03-15 | Qualcomm Incorporated | System and method for presenting true product dimensions within an augmented real-world setting |
US9390561B2 (en) | 2013-04-12 | 2016-07-12 | Microsoft Technology Licensing, Llc | Personal holographic billboard |
US9904946B2 (en) * | 2013-07-18 | 2018-02-27 | Paypal, Inc. | Reverse showrooming and merchant-customer engagement system |
US20150294385A1 (en) | 2014-04-10 | 2015-10-15 | Bank Of America Corporation | Display of the budget impact of items viewable within an augmented reality display |
US9588342B2 (en) | 2014-04-11 | 2017-03-07 | Bank Of America Corporation | Customer recognition through use of an optical head-mounted display in a wearable computing device |
CN105320931B (en) * | 2014-05-26 | 2019-09-20 | 京瓷办公信息系统株式会社 | Item information providing device and item information providing system |
US9959675B2 (en) | 2014-06-09 | 2018-05-01 | Microsoft Technology Licensing, Llc | Layout design using locally satisfiable proposals |
US20150379460A1 (en) | 2014-06-27 | 2015-12-31 | Kamal Zamer | Recognizing neglected items |
US10438229B1 (en) * | 2014-06-30 | 2019-10-08 | Groupon, Inc. | Systems and methods for providing dimensional promotional offers |
US20160012475A1 (en) | 2014-07-10 | 2016-01-14 | Google Inc. | Methods, systems, and media for presenting advertisements related to displayed content upon detection of user attention |
US9728010B2 (en) | 2014-12-30 | 2017-08-08 | Microsoft Technology Licensing, Llc | Virtual representations of real-world objects |
US20160217157A1 (en) | 2015-01-23 | 2016-07-28 | Ebay Inc. | Recognition of items depicted in images |
US20170061700A1 (en) * | 2015-02-13 | 2017-03-02 | Julian Michael Urbach | Intercommunication between a head mounted display and a real world object |
US20160275723A1 (en) | 2015-03-20 | 2016-09-22 | Deepkaran Singh | System and method for generating three dimensional representation using contextual information |
JP6901412B2 (en) | 2015-06-24 | 2021-07-14 | Magic Leap, Inc. | Augmented reality devices, systems and methods for purchase |
US20170083954A1 (en) * | 2015-08-10 | 2017-03-23 | Reviews From Friends, Inc. | Obtaining Referral Using Customer Database |
US10311493B2 (en) | 2015-09-15 | 2019-06-04 | Facebook, Inc. | Managing commerce-related communications within a social networking system |
US10049500B2 (en) * | 2015-09-22 | 2018-08-14 | 3D Product Imaging Inc. | Augmented reality e-commerce for home improvement |
US10497043B2 (en) * | 2015-09-24 | 2019-12-03 | Intel Corporation | Online clothing e-commerce systems and methods with machine-learning based sizing recommendation |
US20170256096A1 (en) | 2016-03-07 | 2017-09-07 | Google Inc. | Intelligent object sizing and placement in a augmented / virtual reality environment |
US10395435B2 (en) * | 2016-04-04 | 2019-08-27 | Occipital, Inc. | System for multimedia spatial annotation, visualization, and recommendation |
US10163271B1 (en) | 2016-04-04 | 2018-12-25 | Occipital, Inc. | System for multimedia spatial annotation, visualization, and recommendation |
US10356028B2 (en) | 2016-05-25 | 2019-07-16 | Alphabet Communications, Inc. | Methods, systems, and devices for generating a unique electronic communications account based on a physical address and applications thereof |
US10134190B2 (en) | 2016-06-14 | 2018-11-20 | Microsoft Technology Licensing, Llc | User-height-based rendering system for augmented reality objects |
US20180006990A1 (en) * | 2016-06-30 | 2018-01-04 | Jean Alexandera Munemann | Exclusive social network based on consumption of luxury goods |
US10068379B2 (en) | 2016-09-30 | 2018-09-04 | Intel Corporation | Automatic placement of augmented reality models |
US10332317B2 (en) | 2016-10-25 | 2019-06-25 | Microsoft Technology Licensing, Llc | Virtual reality and cross-device experiences |
EP3532987A2 (en) | 2016-10-26 | 2019-09-04 | Orcam Technologies Ltd. | Wearable device and methods for analyzing images and providing feedback |
US10600111B2 (en) * | 2016-11-30 | 2020-03-24 | Bank Of America Corporation | Geolocation notifications using augmented reality user devices |
EP3336805A1 (en) * | 2016-12-15 | 2018-06-20 | Thomson Licensing | Method and device for a placement of a virtual object of an augmented or mixed reality application in a real-world 3d environment |
US11037202B2 (en) | 2016-12-27 | 2021-06-15 | Paypal, Inc. | Contextual data in augmented reality processing for item recommendations |
US20180204173A1 (en) | 2017-01-17 | 2018-07-19 | Fair Ip, Llc | Data Processing System and Method for Rules/Machine Learning Model-Based Screening of Inventory |
CA3005051A1 (en) * | 2017-05-16 | 2018-11-16 | Michael J. Schuster | Augmented reality task identification and assistance in construction, remodeling, and manufacturing |
US10509962B2 (en) | 2017-09-14 | 2019-12-17 | Ebay Inc. | Camera platform incorporating schedule and stature |
US20190156410A1 (en) | 2017-11-17 | 2019-05-23 | Ebay Inc. | Systems and methods for translating user signals into a virtual environment having a visually perceptible competitive landscape |
2018
- 2018-11-13 US US16/189,849 patent/US20190156410A1/en not_active Abandoned
- 2018-11-13 US US16/189,674 patent/US11080780B2/en active Active
- 2018-11-13 US US16/189,776 patent/US20190156377A1/en not_active Abandoned
- 2018-11-13 US US16/189,720 patent/US10891685B2/en active Active
- 2018-11-13 US US16/189,817 patent/US11556980B2/en active Active
- 2018-11-14 WO PCT/US2018/061152 patent/WO2019099591A1/en active Application Filing
- 2018-11-14 EP EP18814749.0A patent/EP3711011B1/en active Active
- 2018-11-14 WO PCT/US2018/061151 patent/WO2019099590A1/en active Application Filing
- 2018-11-14 WO PCT/US2018/061145 patent/WO2019099585A1/en active Application Filing
- 2018-11-14 WO PCT/US2018/061154 patent/WO2019099593A1/en active Application Filing
- 2018-11-14 WO PCT/US2018/061139 patent/WO2019099581A1/en unknown
- 2018-11-14 CN CN201880074250.6A patent/CN111357029A/en active Pending
- 2018-11-14 KR KR1020207013835A patent/KR102447411B1/en active IP Right Grant
2020
- 2020-11-23 US US17/102,283 patent/US11200617B2/en active Active
2021
- 2021-06-25 US US17/358,615 patent/US20210319502A1/en not_active Abandoned
2022
- 2022-12-12 US US18/064,358 patent/US20230109329A1/en active Pending
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10891685B2 (en) | 2017-11-17 | 2021-01-12 | Ebay Inc. | Efficient rendering of 3D models using model placement metadata |
US11080780B2 (en) | 2017-11-17 | 2021-08-03 | Ebay Inc. | Method, system and computer-readable media for rendering of three-dimensional model data based on characteristics of objects in a real-world environment |
US11200617B2 (en) | 2017-11-17 | 2021-12-14 | Ebay Inc. | Efficient rendering of 3D models using model placement metadata |
US11556980B2 (en) | 2017-11-17 | 2023-01-17 | Ebay Inc. | Method, system, and computer-readable storage media for rendering of object data based on recognition and/or location matching |
WO2024036510A1 (en) * | 2022-08-17 | 2024-02-22 | Mak Kwun Yiu | System and method for providing virtual premises |
Also Published As
Publication number | Publication date |
---|---|
EP3711011B1 (en) | 2024-08-28 |
US20190156403A1 (en) | 2019-05-23 |
US20190156582A1 (en) | 2019-05-23 |
US20230109329A1 (en) | 2023-04-06 |
WO2019099581A1 (en) | 2019-05-23 |
WO2019099591A1 (en) | 2019-05-23 |
WO2019099593A1 (en) | 2019-05-23 |
US11080780B2 (en) | 2021-08-03 |
KR102447411B1 (en) | 2022-09-28 |
WO2019099585A1 (en) | 2019-05-23 |
CN111357029A (en) | 2020-06-30 |
US20190156393A1 (en) | 2019-05-23 |
US20190156410A1 (en) | 2019-05-23 |
US10891685B2 (en) | 2021-01-12 |
US20210319502A1 (en) | 2021-10-14 |
US20210073901A1 (en) | 2021-03-11 |
KR20200073256A (en) | 2020-06-23 |
EP3711011A1 (en) | 2020-09-23 |
US11200617B2 (en) | 2021-12-14 |
WO2019099590A1 (en) | 2019-05-23 |
US11556980B2 (en) | 2023-01-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190156377A1 (en) | Rendering virtual content based on items recognized in a real-world environment | |
US11915288B2 (en) | Useful and novel shopping application | |
US11282120B2 (en) | Digital rights management in three-dimensional (3D) printing | |
JP6770034B2 (en) | Visualization of articles using augmented reality | |
US20150095228A1 (en) | Capturing images for financial transactions | |
US20220366689A1 (en) | Insurance inventory and claim generation | |
TW202205173A (en) | Systems and methods for representing user interactions in multi-user augmented reality | |
KR101922713B1 (en) | User terminal, intermediation server, system and method for intermediating optical shop | |
US11195214B1 (en) | Augmented reality value advisor | |
US20160189268A1 (en) | Wearable device for interacting with media-integrated vendors | |
KR20180006446A (en) | System and method for customizable task notification | |
US12008616B2 (en) | Systems and methods for providing an enhanced analytical engine | |
US11270367B2 (en) | Product comparison techniques using augmented reality | |
US20240320736A1 (en) | Systems and methods for exchanging user data | |
CN112352257B (en) | Instant quotation distribution system | |
US11393198B1 (en) | Interactive insurance inventory and claim generation | |
US12142039B1 (en) | Interactive insurance inventory and claim generation | |
US11995232B1 (en) | Systems and methods for responsive user interface based on gaze depth | |
US20240249442A1 (en) | Systems and methods for overlay of virtual object on proxy object | |
WO2023197041A1 (en) | Computer-implemented system and method for providing a digital asset marketplace |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: EBAY INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YANKOVICH, STEVE;BEACH, DAVID;SIGNING DATES FROM 20181112 TO 20181113;REEL/FRAME:047490/0517 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCV | Information on status: appeal procedure | Free format text: NOTICE OF APPEAL FILED |
| STCV | Information on status: appeal procedure | Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
| STCV | Information on status: appeal procedure | Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
| STCV | Information on status: appeal procedure | Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |
| STCV | Information on status: appeal procedure | Free format text: BOARD OF APPEALS DECISION RENDERED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |