US20150124106A1 - Terminal apparatus, additional information managing apparatus, additional information managing method, and program - Google Patents
Terminal apparatus, additional information managing apparatus, additional information managing method, and program Download PDFInfo
- Publication number
- US20150124106A1 (application US14/514,558 / US201414514558A)
- Authority
- US
- United States
- Prior art keywords
- additional information
- information
- image
- photographing
- photographed image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Images
Classifications
- G06K9/3241—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06K9/2054—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
Definitions
- the present disclosure relates to a technique for providing additional information associated with a photographed object. More particularly, the present disclosure relates to a terminal apparatus, an additional information managing apparatus, an additional information managing method used in the same, and a program used in the same.
- an Augmented Reality (AR) technique is used in which additional information absent from the reality space is displayed superimposed on the photographed image, thereby providing a sense of augmented reality.
- augmented reality has the effect of augmenting the real environment a human being perceives by virtually adding the additional information to the real world.
- the display performance of a display panel of the mobile apparatus such as the smartphone or the tablet terminal has been improved.
- a virtual object is displayed on the photographed image by using augmented reality, causing a user to interact with the virtual object and creating a new way for the user to enjoy the experience.
- the present disclosure has been made in order to solve the problem described above, and it is therefore desirable to provide a technique with which additional information associated with a photographed object can be effectively provided, and more particularly to provide a terminal apparatus, an additional information managing apparatus, an additional information managing method used in the same, and a program used in the same.
- a terminal apparatus including: a photographing portion configured to photograph a subject which is present in a reality space; an additional information acquiring portion configured to acquire additional information which is made to correspond to an object an image of which is recognized within a photographed image; and an additional information storing portion configured to store therein the acquired additional information in relation to either user identification information or photographing apparatus identification information.
- an additional information managing apparatus including: a photographed image acquiring portion configured to acquire a photographed image obtained by photographing a subject which is present in a reality space; an image recognizing portion configured to recognize an image of an object within the photographed image based on feature information on the photographed image; an additional information acquiring portion configured to acquire additional information which is associated with the object thus image-recognized by referring to a table in which the object and the additional information are associated with each other; and a transmitting portion configured to transmit the additional information thus acquired to a terminal apparatus with which the photographed image has been photographed.
- an additional information managing method including: acquiring a photographed image obtained by photographing a subject which is present in a reality space; recognizing an image of an object within the photographed image based on feature information on the photographed image; acquiring additional information which is associated with the object thus image-recognized by referring to a table in which the object and the additional information are associated with each other; and transmitting the additional information thus acquired to a terminal apparatus with which the photographed image has been photographed.
- a program for a computer including: acquiring a photographed image obtained by photographing a subject which is present in a reality space; recognizing an image of an object within the photographed image based on feature information on the photographed image; acquiring additional information which is associated with the object thus image-recognized by referring to a table in which the object and the additional information are associated with each other; and transmitting the additional information thus acquired to a terminal apparatus with which the photographed image has been photographed.
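The managing method summarized above (acquire a photographed image, recognize an object from feature information, look up the associated additional information in a table, and return it to the terminal) can be sketched server-side as follows. This is a minimal illustration, not the patent's implementation: feature matching is reduced to a simple vector distance, and all table contents and names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ObjectRecord:
    object_id: str
    features: tuple  # feature information registered in advance for the object

# hypothetical object table and additional-information tables
OBJECTS = [ObjectRecord("obj-tower", (0.9, 0.1)), ObjectRecord("obj-sign", (0.2, 0.8))]
OBJECT_TO_INFO = {"obj-tower": "info-coupon-1"}
ADDITIONAL_INFO = {"info-coupon-1": {"text": "20% off observatory ticket"}}

def recognize(features, threshold=0.2):
    """Match extracted features against registered objects (L2-distance sketch)."""
    best = min(OBJECTS,
               key=lambda o: sum((a - b) ** 2 for a, b in zip(o.features, features)))
    dist = sum((a - b) ** 2 for a, b in zip(best.features, features)) ** 0.5
    return best.object_id if dist <= threshold else None

def acquire_additional_info(photographed_features):
    """Recognize an object, then look up its additional information for transmission."""
    object_id = recognize(photographed_features)
    if object_id is None:
        return None
    info_id = OBJECT_TO_INFO.get(object_id)
    return ADDITIONAL_INFO.get(info_id)
```

In the patent's flow the returned record would then be transmitted to the terminal that photographed the image.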
- the additional information which is associated with the photographed object can be effectively provided.
- FIG. 1 is a block diagram showing a configuration of a terminal as a terminal apparatus according to a first embodiment of the present disclosure
- FIG. 2 is a block diagram showing a configuration of a cloud server as an additional information managing apparatus according to a second embodiment of the present disclosure
- FIG. 3 is a diagram explaining a record of an object which is stored in an object database shown in FIG. 2 ;
- FIG. 4A is a diagram explaining a record of additional information which is stored in an additional information database shown in FIG. 2 ;
- FIG. 4B is a diagram explaining a record in which an object ID and an additional information ID are made to correspond to each other;
- FIG. 5 is a diagram explaining a record in which a user ID/apparatus ID, and an additional information ID which are stored in a user database shown in FIG. 2 are made to correspond to each other;
- FIG. 6 is a sequence diagram explaining a flow of provision and registration of the additional information by the terminal shown in FIG. 1 , and the cloud server shown in FIG. 2 ;
- FIG. 7A is a view explaining a photographed image displayed on a touch panel of a tablet terminal
- FIG. 7B is a view explaining an object specified within the photographed image
- FIG. 7C is a view explaining additional information superimposed on the object within the photographed image
- FIG. 7D is a view explaining an example in which a coupon is displayed as the additional information in a position of the object within the photographed image
- FIG. 8A is a view explaining an object specified within a photographed image
- FIG. 8B is a view explaining an icon representing that a coupon is provided for an object within the photographed image
- FIG. 8C is a view explaining a coupon which is displayed when the icon shown in FIG. 8B is selected.
- FIG. 8D is a view explaining an example in which a plurality of coupons are displayed for the object within the photographed image.
- FIG. 1 is a block diagram showing a configuration of a terminal 100 as a terminal apparatus according to a first embodiment of the present disclosure.
- the terminal 100 is an electronic apparatus having a photographing function and a communication function.
- examples of the terminal 100 include a mobile phone, a smartphone, a tablet terminal, and a camera.
- a photographing portion 10 is a camera having a Charge Coupled Device (CCD) image sensor, and photographs a subject which is present in the reality space, storing information on the photographed image in an image storing portion 20 .
- a display portion 28 displays the photographed image the information on which is stored in the image storing portion 20 on a display device.
- a photographing position acquiring portion 12 has a position sensor such as a Global Positioning System (GPS) receiver.
- the photographing position acquiring portion 12 acquires information on a photographing position and supplies the information thus acquired to an additional information requesting portion 18 .
- a photographing date-and-time acquiring portion 14 is a built-in clock.
- the photographing date-and-time acquiring portion 14 acquires information on the date and time of the photographing and supplies the information thus acquired to the additional information requesting portion 18 .
- the photographing date-and-time acquiring portion 14 may acquire the information on the date and time of the photographing from the GPS receiver.
- a photographing direction acquiring portion 16 has any one of a terrestrial magnetism sensor, a gyro sensor, an acceleration sensor, and an angular acceleration sensor, or a combination thereof.
- the photographing direction acquiring portion 16 acquires information on a photographing direction of the camera built in the terminal 100 by detecting a direction or an inclination of the terminal 100 , and supplies the information thus acquired to the additional information requesting portion 18 .
- the additional information requesting portion 18 transmits the information on the photographed image which is stored in the image storing portion 20 together with an additional information request instruction to a cloud server 200 through a communication portion 32 .
- the additional information requesting portion 18 may transmit at least one of the information on the photographing position, the information on the date and time of the photographing, and the information on the photographing direction together with the additional information request instruction to the cloud server 200 through the communication portion 32 .
- the communication portion 32 is connected to the network through the wireless communication, and transmits the data to the cloud server 200 and receives the data from the cloud server 200 .
- the cloud server 200 recognizes an image of a predetermined object from the photographed image.
- the cloud server 200 may use at least one of the information on the photographing position, the information on the date and time of the photographing, and the information on the photographing direction in combination with the photographed image.
- the cloud server 200 may specify an object which will be seen in a point-of-view position and a line-of-sight direction by combining the information on the photographing position with the information on the photographing direction without using the photographed image.
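The position-plus-direction mode described above (specifying an object that would be visible from the point-of-view position and line-of-sight direction, without using the photographed image) can be sketched as a bearing check. This is a hypothetical illustration using a small-area flat-earth approximation; field-of-view width and all coordinates are assumptions.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Approximate compass bearing from point 1 to point 2 (small-area approximation)."""
    dy = lat2 - lat1
    dx = (lon2 - lon1) * math.cos(math.radians(lat1))
    return math.degrees(math.atan2(dx, dy)) % 360

def object_in_view(cam_lat, cam_lon, cam_heading_deg, obj_lat, obj_lon, fov_deg=60):
    """True if the object lies within the camera's horizontal field of view."""
    b = bearing_deg(cam_lat, cam_lon, obj_lat, obj_lon)
    # smallest angular difference between the object bearing and the camera heading
    diff = abs((b - cam_heading_deg + 180) % 360 - 180)
    return diff <= fov_deg / 2
```

A server could run this test against each registered object's position information to decide which objects the camera is pointed at.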
- Examples of the object include a specific scene, a building, a signboard, and a person.
- the information on the object is previously registered together with the feature information on the image of the object in a database.
- the cloud server 200 acquires the additional information associated with the object the image of which is recognized, and transmits that additional information to the terminal 100 .
- Examples of the additional information include a coupon that can be exchanged for a predetermined item, a digital item such as a character or a weapon usable in a game, and electronic money.
- An additional information acquiring portion 22 receives the additional information associated with the object the image of which is recognized within the photographed image from the cloud server 200 through the communication portion 32 .
- the additional information acquiring portion 22 supplies the additional information which has been acquired from the cloud server 200 to an augmented reality creating portion 26 .
- the augmented reality creating portion 26 executes processing for superimposing the additional information as the augmented reality on the photographed image the information on which is stored in the image storing portion 20 .
- the display portion 28 reads out the information on the photographed image on which the augmented reality is superimposed by the augmented reality creating portion 26 from the image storing portion 20 , and displays the photographed image concerned on the display device of the display portion 28 .
- a user interface portion 24 receives a user manipulation for a manipulation button of the terminal 100 , or a user manipulation carried out by directly contacting a touch panel, and supplies manipulation contents thereof to the augmented reality creating portion 26 .
- the augmented reality creating portion 26 gives the augmented reality a change in accordance with the user manipulation. The user can select the additional information superimposed on the photographed image through the user interface portion 24 .
- the additional information acquiring portion 22 stores the additional information thus selected in an additional information storing portion 30 .
- the additional information storing portion 30 may be an Integrated Circuit (IC) chip in which an IC for carrying out recording and arithmetic operation of data is incorporated.
- the additional information thus selected is associated with the user identifying information or the photographing apparatus identifying information and is encrypted with an encryption key so as to be safely recorded in a security area within the IC chip. Recording the additional information in the IC chip prevents forgery.
- the user interface portion 24 transmits the instruction to select the additional information issued from the user to the cloud server 200 through the communication portion 32 .
- the cloud server 200 associates the additional information thus selected with either the user identifying information or the photographing apparatus identifying information to thereby record the resulting additional information in a user database.
- the user database is a secure database that can be accessed only when either the user authentication or the apparatus authentication succeeds by using an encryption key or the like. Thus, in the user database, the additional information is safely managed for each user or photographing apparatus.
- when the additional information storing portion 30 is the IC chip, the IC chip is held over an IC card reader, and the additional information safely kept in the IC chip is read out to the IC card reader. Since either the user authentication or the apparatus authentication is carried out by using the encryption key when the IC card reader reads out the additional information, the additional information is safely taken out of the IC chip.
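The idea of binding stored additional information to a user and checking it cryptographically on readout can be sketched with an authentication tag. This is not the IC-chip protocol itself but a hypothetical illustration using an HMAC: a copied or tampered record fails verification before the coupon is released.

```python
import hashlib
import hmac
import json

def seal(additional_info: dict, user_id: str, key: bytes) -> dict:
    """Bind additional information to a user ID with an authentication tag,
    so an altered record fails verification on readout."""
    payload = json.dumps({"user_id": user_id, "info": additional_info}, sort_keys=True)
    tag = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def unseal(record: dict, key: bytes) -> dict:
    """Verify the tag before releasing the stored additional information."""
    expected = hmac.new(key, record["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, record["tag"]):
        raise ValueError("authentication failed")
    return json.loads(record["payload"])["info"]
```

A real design would also encrypt the payload, as the patent describes; the tag alone only demonstrates the forgery check.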
- when the additional information is a coupon that can be exchanged for some item, the user can read out the coupon by using the IC card reader and exchange it for the item.
- the display portion 28 can display the additional information stored in the additional information storing portion 30 on the display device separately from the information on the photographed image in accordance with a request made from the user interface portion 24 .
- the user can utilize the additional information by, for example, displaying it not as augmented reality but as a coupon or ticket on the display device of the display portion 28 , presenting the coupon or ticket at a store, or having a reader read the additional information.
- FIG. 2 is a block diagram showing a configuration of the cloud server 200 as an additional information managing apparatus according to a second embodiment of the present disclosure.
- a photographed image acquiring portion 50 , a photographing position acquiring portion 52 , a photographing date-and-time acquiring portion 54 , and a photographing direction acquiring portion 56 acquire the information on the photographed image, the information on the photographing position, the information on the date and time of the photographing, and the information on the photographing direction, respectively, from the terminal 100 through a communication portion 70 and supply these pieces of information to an image recognizing portion 58 .
- the image recognizing portion 58 extracts feature information of the subject in the photographed image, and takes matching of the feature information thus extracted with the feature information of the object which is registered in an object database 64 to thereby specify the object.
- FIG. 3 is a diagram explaining a record 80 of the object which is stored in the object database 64 .
- in this record 80 , the feature information on the object, a thumbnail image of the object, the position information, the date-and-time information, and the direction information are made to correspond to an object ID.
- the position information is latitude and longitude information on a position where the object is present.
- the date-and-time information is used when a specific date and specific time are specified for the object.
- objects that differ from one another in season or time zone are assigned different object IDs and handled as distinct objects.
- the direction information is used when the line-of-sight direction in which the object is looked at is specified.
- objects that differ from one another in the line-of-sight direction are assigned different object IDs and handled as distinct objects.
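The scheme above, where the same landmark photographed in a different season or from a different direction is registered under a distinct object ID, amounts to a composite lookup key. A minimal sketch with hypothetical IDs:

```python
# Illustrative: the same landmark registered under distinct object IDs
# per (season, line-of-sight direction) combination.
OBJECT_IDS = {
    ("tokyo-tower", "spring", "north"): "obj-101",
    ("tokyo-tower", "winter", "north"): "obj-102",
    ("tokyo-tower", "spring", "south"): "obj-103",
}

def resolve_object_id(landmark, season, direction):
    """Return the object ID for this landmark under these conditions, if registered."""
    return OBJECT_IDS.get((landmark, season, direction))
```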
- the image recognizing portion 58 may combine at least one of the information on the photographing position, the information on the date and time of the photographing, and the information on the photographing direction with the feature information of the subject in the photographed image. Then, the image recognizing portion 58 may take the matching of the at least one thus combined with the record of the object which is registered in the object database 64 to thereby specify the object. For example, the image recognizing portion 58 may specify the object for which the feature information of the subject in the photographed image coincides with the information on the photographing position from the record of the object database 64 . Alternatively, the image recognizing portion 58 may specify the object for which the information on the photographing position coincides with the information on the photographing direction without using the feature information of the subject in the photographed image.
- conditions under which the image recognizing portion 58 specifies the object include:
- condition (1): the image of the object is recognized within the photographed image. Although this condition is simple, even a third party who merely copies the photographed image can get the coupon.
- condition (2): the information on the photographing position is used in combination with the photographed image, so the coupon can be prevented from being gotten by such a third party.
- condition (3): the coupon is provided only when the object is photographed from a specific photographing direction.
- condition (4): the coupon is applied if the camera is pointed at the specified photographing place in the specified line-of-sight direction, without using the photographed image itself. Since the photographed image itself is not a condition, the transmission and image processing of the image data can be omitted, increasing the efficiency of the processing.
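The four condition types can be sketched as a small dispatcher. This is an illustrative reading of the list above, not claim language; the dict keys (`object_matched`, `near_object`, `direction_ok`) stand in for the image-match, position, and direction checks performed elsewhere.

```python
def coupon_applies(condition: dict, request: dict) -> bool:
    """Decide whether a coupon's condition is fulfilled by the evidence in `request`.

    Kinds correspond to conditions (1)-(4): image match alone, image + position,
    image + direction, and position + direction with no image at all.
    """
    kind = condition["kind"]
    matched = request.get("object_matched", False)
    near = request.get("near_object", False)
    direction = request.get("direction_ok", False)
    if kind == "image_only":              # condition (1)
        return matched
    if kind == "image_and_position":      # condition (2)
        return matched and near
    if kind == "image_and_direction":     # condition (3)
        return matched and direction
    if kind == "position_and_direction":  # condition (4): no image processing needed
        return near and direction
    return False
```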
- the image recognizing portion 58 gives an additional information acquiring portion 60 the specified object ID.
- the additional information acquiring portion 60 acquires the additional information ID associated with the object ID by referring to an additional information database 66 .
- the additional information acquiring portion 60 acquires the additional information associated with the additional information ID from the additional information database 66 , and gives an additional information registering portion 62 the additional information thus acquired.
- FIG. 4A is a diagram explaining a record 82 of the additional information stored in the additional information database 66 .
- FIG. 4B is a diagram explaining a record 84 in which the object ID and the additional information ID are made to correspond to each other.
- the image, the text, and the sound which represent the contents of the additional information are associated with the additional information ID.
- the additional information ID is made to correspond to the object ID.
- the ID of the additional information such as the coupon is made to correspond to the ID of the object such as the specific building or signboard.
- the additional information acquiring portion 60 acquires the additional information ID made to correspond to the object ID specified by the image recognizing portion 58 by referring to the additional information database 66 .
- the additional information acquiring portion 60 further acquires the additional information such as the image, the text, and the sound associated with the additional information ID, and gives the additional information registering portion 62 the additional information thus acquired.
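The two-step lookup described above (record 84 maps an object ID to an additional information ID; record 82 holds the image, text, and sound for that ID) is a straightforward table join. A minimal sketch with an in-memory SQLite database and invented table contents:

```python
import sqlite3

# Hypothetical schemas mirroring record 82 (additional_info) and record 84 (object_info).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE additional_info (info_id TEXT PRIMARY KEY, image TEXT, text TEXT, sound TEXT);
CREATE TABLE object_info (object_id TEXT, info_id TEXT);
INSERT INTO additional_info VALUES ('info-1', 'coupon.png', '20% OFF observatory ticket', NULL);
INSERT INTO object_info VALUES ('obj-tokyo-tower', 'info-1');
""")

def lookup_additional_info(object_id):
    """Join object_info with additional_info to fetch the contents for one object."""
    return conn.execute(
        "SELECT a.info_id, a.image, a.text, a.sound "
        "FROM object_info o JOIN additional_info a ON o.info_id = a.info_id "
        "WHERE o.object_id = ?", (object_id,)).fetchone()
```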
- the additional information acquiring portion 60 transmits the additional information thus acquired to the terminal 100 through the communication portion 70 .
- the image recognizing portion 58 transmits the thumbnail image of the object associated with the specified object ID to the terminal 100 through the communication portion 70 .
- the terminal 100 displays the additional information acquired from the cloud server 200 on the display device of the display portion 28 with the additional information being superimposed on the photographed image. If necessary, the terminal 100 displays the thumbnail image of the specified object as well together with the additional information on the display device of the display portion 28 .
- the additional information registering portion 62 registers the additional information ID acquired from the additional information acquiring portion 60 in a user database 68 with the additional information ID thus acquired being associated with either the user ID or the apparatus ID.
- FIG. 5 is a diagram explaining a record 86 in which the user ID/apparatus ID stored in the user database 68 , and the additional information ID are made to correspond to each other.
- the record 86 is used to manage the acquired additional information ID for each user or apparatus.
- registering the record in the user database 68 in this manner prevents the same additional information from being acquired twice by the same user or on the same apparatus.
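The duplicate-acquisition check can be sketched as a registry keyed on the (user or apparatus ID, additional information ID) pair, as record 86 suggests. Names are illustrative:

```python
class UserRegistry:
    """Registry that rejects a second acquisition of the same additional
    information by the same user or on the same apparatus."""

    def __init__(self):
        self._acquired = set()  # {(owner_id, info_id)}

    def register(self, owner_id: str, info_id: str) -> bool:
        """Record the acquisition; return False if this owner already holds it."""
        key = (owner_id, info_id)
        if key in self._acquired:
            return False
        self._acquired.add(key)
        return True
```

A database-backed version would enforce the same rule with a unique constraint on the two columns.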
- FIG. 6 is a sequence diagram explaining the flow of provision and registration of the additional information by the terminal 100 of the first embodiment and the cloud server 200 of the second embodiment.
- the photographing portion 10 photographs the reality space (S 10 ).
- the additional information requesting portion 18 transmits the additional information request together with the information on the photographed image to the cloud server 200 (S 12 ).
- the image recognizing portion 58 specifies the object from the photographed image based on the image recognition (S 14 ).
- the additional information acquiring portion 60 acquires the additional information associated with the object the image of which is recognized (S 16 ).
- the additional information acquiring portion 60 transmits the additional information thus acquired to the terminal 100 (S 18 ).
- the additional information acquiring portion 22 receives the additional information from the cloud server 200 .
- the augmented reality creating portion 26 renders, as augmented reality, an indication that additional information is present for the object in the photographed image, and displays the resulting image on the display device of the display portion 28 (S 20 ).
- the user interface portion 24 receives the instruction to select the additional information which is displayed in the photographed image from the user (S 22 ).
- the additional information selected by the user interface portion 24 is registered in a security area of the IC chip (S 26 ).
- the user interface portion 24 transmits the request to register the additional information selected by the user interface portion 24 to the cloud server 200 (S 28 ).
- the additional information registering portion 62 registers the additional information selected by the user in the user database 68 with the additional information being associated with either the user ID or the apparatus ID (S 30 ).
- FIGS. 7A to 7D are respectively views explaining examples in each of which the additional information is displayed as the augmented reality in the object in the photographed image.
- FIG. 7A is a view explaining a photographed image which is displayed on a touch panel 510 of the tablet terminal 500 .
- the image captured with the built-in camera of the tablet terminal 500 is displayed on the touch panel 510 .
- the user manipulates a manipulation button 520 or directly touches the touch panel 510 , which enables him or her to interact with the augmented reality being displayed on the touch panel 510 .
- the user carries out the photographing by pointing the built-in camera of the tablet terminal 500 at a building having a signboard of “ABC beer” mounted thereto.
- the information on the photographed image is transmitted to the cloud server 200 by using the communication function of the tablet terminal 500 .
- the cloud server 200 recognizes the image of the object from the image which has been captured with the tablet terminal 500 .
- the information on the signboard 552 of the ABC beer is registered in the object database 64 .
- the image recognizing portion 58 extracts the signboard 552 as the object from the photographed image.
- the cloud server 200 transmits the information which specifies the image area of the extracted object to the tablet terminal 500 .
- FIG. 7B is a view explaining the object which is specified within the photographed image.
- the tablet terminal 500 carries out highlighting, for example, surrounding the signboard 552 with a frame 550 indicated by a dotted line, by using the information that specifies the image area of the extracted object. Thus, the tablet terminal 500 informs the user that the signboard 552 has been extracted as the object.
- the additional information acquiring portion 60 acquires the additional information associated with the extracted object from the additional information database 66 , and transmits the additional information thus acquired to the tablet terminal 500 .
- FIG. 7C is a view explaining the additional information which is superimposed on the object within the photographed image.
- the augmented reality creating portion 26 creates an image showing a situation 554 in which a page is turned, so that the user understands that additional information is provided on the signboard 552 as augmented reality.
- FIG. 7D is a view explaining an example in which the coupon is displayed as the additional information in the position of the object within the photographed image.
- when the user interface portion 24 receives a tap on the signboard 552 from the user, the page is turned, and a coupon 555 is displayed as the additional information in the position of the signboard 552 .
- the coupon 555 displayed here can be exchanged for an all-you-can-drink beer service at a beer garden on the roof of this building.
- the user can save the information on the coupon 555 in the security area of the IC chip built into the tablet terminal 500 , or can transmit it to the cloud server 200 to register it there in association with either the user ID or the apparatus ID.
- FIGS. 8A to 8D are respectively views explaining different examples in each of which the additional information is displayed as the augmented reality in the object in the photographed image.
- a photographed image of a scene containing therein an image of the Tokyo Tower 532 is displayed on the touch panel 510 of the tablet terminal 500 .
- the image recognizing portion 58 of the cloud server 200 extracts the image of the Tokyo Tower 532 as the object based on the feature information from the photographed image. Also, a frame 530 representing that the image of the Tokyo Tower 532 is extracted as the object is displayed on the touch panel 510 .
- the additional information acquiring portion 60 of the cloud server 200 acquires the additional information (the coupon in this case) associated with the extracted object.
- in FIG. 8B , on the touch panel 510 of the tablet terminal 500 , an icon 534 representing that there is a coupon for the extracted object is displayed as augmented reality superimposed on the image of the Tokyo Tower 532 .
- when the icon 534 is selected, a coupon 540 is displayed within the photographed image.
- the coupon 540 is a premium ticket that provides a hotel accommodation plan, with a Tokyo Tower observatory ticket, at 20% off.
- the information on the coupon 540 is stored in the security area within the IC chip.
- when the image recognizing portion 58 performs matching between the objects registered in the object database 64 and the subject of the photographed image based on the feature information, a plurality of objects may be found as candidates. For example, when the Eiffel Tower, whose features are similar to those of the Tokyo Tower, is found as a candidate object, the image recognizing portion 58 gives the additional information acquiring portion 60 both the image of the Tokyo Tower and the image of the Eiffel Tower as candidate objects. The additional information acquiring portion 60 acquires the additional information associated with the image of the Tokyo Tower and the additional information associated with the image of the Eiffel Tower from the additional information database 66 , and transmits both pieces of additional information to the tablet terminal 500 . In addition, when a plurality of candidate objects are present, the image recognizing portion 58 acquires thumbnail images of the candidate objects from the object database 64 and transmits the information on these thumbnail images to the tablet terminal 500 .
- the coupon 540 associated with the image of the Tokyo Tower, and a coupon 542 associated with the Eiffel Tower are both displayed on the touch panel 510 of the tablet terminal 500 .
- the coupon 542 associated with the image of the Eiffel Tower is a coupon with which shopping in Paris can be enjoyed at 10% off.
- the thumbnail images of the objects are displayed in the coupons, respectively.
- in the coupon 540 associated with the image of the Tokyo Tower, the contents of the coupon are displayed together with a thumbnail image 541 of the Tokyo Tower.
- in the coupon 542 associated with the image of the Eiffel Tower, the contents of the coupon are displayed together with a thumbnail image 543 of the Eiffel Tower.
- the user can judge whether or not the objects are properly determined by looking at the thumbnail images displayed in the coupons, respectively.
- in this case, the coupon in which the thumbnail image of the Tokyo Tower is displayed is proper, while the coupon in which the thumbnail image of the Eiffel Tower is displayed results from a misjudgment.
- the photographing position acquiring portion 52 of the cloud server 200 may acquire the information on the photographing position from the tablet terminal 500 , and the image recognizing portion 58 may narrow down the candidates of the object based on the photographing position.
- the Eiffel Tower can be weeded out from the candidates of the object.
- a configuration can be adopted such that the probability determined based on the image matching or the position information is displayed as a numerical value in each coupon, the coupons are sorted in descending order of probability, or a coupon having a low probability is displayed at a smaller size, whereby the user can easily select the proper coupon based on the probability.
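One way to compute such a probability is to combine the image-matching similarity with how close the photographing position is to the registered position of the object. The 50/50 weighting and the 50 km distance scale below are assumptions for illustration only.

```python
# Hypothetical ranking of candidate coupons by a combined probability of
# image similarity and photographing-position proximity.
def haversine_km(p, q):
    from math import radians, sin, cos, asin, sqrt
    lat1, lon1, lat2, lon2 = map(radians, (p[0], p[1], q[0], q[1]))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))  # great-circle distance in km

def coupon_probability(image_similarity, photo_pos, object_pos, scale_km=50.0):
    distance_km = haversine_km(photo_pos, object_pos)
    position_score = max(0.0, 1.0 - distance_km / scale_km)
    return 0.5 * image_similarity + 0.5 * position_score

photo_pos = (35.6586, 139.7454)       # photographing position near the Tokyo Tower
tokyo_tower = (35.6586, 139.7454)
eiffel_tower = (48.8584, 2.2945)
coupons = [
    ("tokyo_coupon", coupon_probability(0.95, photo_pos, tokyo_tower)),
    ("paris_coupon", coupon_probability(0.96, photo_pos, eiffel_tower)),
]
coupons.sort(key=lambda c: c[1], reverse=True)  # most probable coupon first
```

Even though the Eiffel Tower image matches slightly better in this fabricated example, the position term pushes the Tokyo coupon to the top, which matches the narrowing-down behavior described above.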
- the different coupons may be brought up based on the date and time of the photographing.
- coupons which differ depending on the time zone may be brought up for the same object in such a way that a coupon for bus sightseeing in Tokyo is brought up for the Tokyo Tower photographed in the early morning, and a dinner coupon is brought up for the Tokyo Tower photographed in the evening.
- a coupon introducing a place famous for cherry blossoms in the suburbs of Tokyo may be brought up for the Tokyo Tower photographed in the spring.
- a coupon introducing a place famous for beautiful autumn leaves in the suburbs of Tokyo may be brought up for the Tokyo Tower photographed in the autumn.
- the coupons which are different from one another depending on the color or pattern of the lighting-up may also be brought up.
- the kind or the provider of the coupon may also be made to differ depending on the illumination in such a way that when the bridge which is lighted up in blue is photographed, a coupon provided by a certain company is provided, but when the bridge which is lighted up in rainbow colors is photographed, a coupon provided by another company is provided.
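The date-and-time-dependent selection described over the last few paragraphs can be sketched as a simple dispatch on the photographing timestamp. The hour and month boundaries, and the default coupon, are assumptions for illustration.

```python
# Hypothetical sketch of choosing a different coupon for the same object
# depending on the date and time of the photographing.
from datetime import datetime

def select_coupon(object_id, shot_at):
    if object_id != "tokyo_tower":
        return None
    if 5 <= shot_at.hour < 9:
        return "bus_sightseeing_coupon"      # early-morning photograph
    if 17 <= shot_at.hour < 22:
        return "dinner_coupon"               # evening photograph
    if shot_at.month in (3, 4, 5):
        return "cherry_blossom_spot_coupon"  # spring photograph
    if shot_at.month in (9, 10, 11):
        return "autumn_leaves_spot_coupon"   # autumn photograph
    return "observatory_coupon"              # assumed default coupon

morning = select_coupon("tokyo_tower", datetime(2014, 10, 14, 6, 30))
evening = select_coupon("tokyo_tower", datetime(2014, 10, 14, 19, 0))
```

The same dispatch could branch on illumination color to change the coupon's provider, as the preceding paragraph suggests.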
- the subject which shall be photographed may also be the television picture or the game picture.
- when a specific scene or a commercial message of a television program is photographed with the terminal 100 and information on an image thereof is transmitted to the cloud server 200, a coupon is provided as the additional information from the cloud server 200 to the terminal 100.
- when a picture of a specific stage or a specific character in a game is photographed with the terminal 100 and information on an image thereof is transmitted to the cloud server 200, a digital item which can be utilized in the game is provided as the additional information from the cloud server 200 to the terminal 100.
Abstract
Description
- The present disclosure relates to a technique for providing additional information associated with a photographed object. More particularly, the present disclosure relates to a terminal apparatus, an additional information managing apparatus, an additional information managing method used in the same, and a program used in the same.
- When a reality space is photographed with a camera built in a smartphone, a tablet terminal, or the like, and the photographed image is displayed on a touch panel, an Augmented Reality (AR) technique is used with which additional information which is absent in the reality space is displayed superimposed on the photographed image to thereby provide a sense of augmented reality. Augmented reality offers the effect of augmenting the real environment which a human being perceives by virtually providing additional information to the real world. The display performance of the display panel of a mobile apparatus such as a smartphone or a tablet terminal has been improved. Thus, a virtual object is displayed on the photographed image by using augmented reality so as to cause the user to interact with the virtual object, whereby a new way for the user to enjoy content is created.
- When a marker such as a two-dimensional bar code is read with a built-in camera of a mobile apparatus, related information such as a virtual object made to correspond to the marker is displayed as augmented reality. However, a safe system with which, when the user photographs the outside world in the real world, additional information is displayed as augmented reality and can thus be practically utilized has not been proposed to date.
- The present disclosure has been made in order to solve the problem described above, and it is therefore desirable to provide a technique with which additional information associated with a photographed object can be effectively provided, and more particularly to provide a terminal apparatus, an additional information managing apparatus, an additional information managing method used in the same, and a program used in the same.
- In order to attain the desire described above, according to an embodiment of the present disclosure, there is provided a terminal apparatus including: a photographing portion configured to photograph a subject which is present in a reality space; an additional information acquiring portion configured to acquire additional information which is made to correspond to an object an image of which is recognized within a photographed image; and an additional information storing portion configured to store therein the acquired additional information in relation to either user identification information or photographing apparatus identification information.
- According to another embodiment of the present disclosure, there is provided an additional information managing apparatus including: a photographed image acquiring portion configured to acquire a photographed image obtained by photographing a subject which is present in a reality space; an image recognizing portion configured to recognize an image of an object within the photographed image based on feature information on the photographed image; an additional information acquiring portion configured to acquire additional information which is associated with the object thus image-recognized by referring to a table in which the object and the additional information are associated with each other; and a transmitting portion configured to transmit the additional information thus acquired to a terminal apparatus with which the photographed image has been photographed.
- According to still another embodiment of the present disclosure, there is provided an additional information managing method including: acquiring a photographed image obtained by photographing a subject which is present in a reality space; recognizing an image of an object within the photographed image based on feature information on the photographed image; acquiring additional information which is associated with the object thus image-recognized by referring to a table in which the object and the additional information are associated with each other; and transmitting the additional information thus acquired to a terminal apparatus with which the photographed image has been photographed.
- According to yet another embodiment of the present disclosure, there is provided a program for a computer, including: acquiring a photographed image obtained by photographing a subject which is present in a reality space; recognizing an image of an object within the photographed image based on feature information on the photographed image; acquiring additional information which is associated with the object thus image-recognized by referring to a table in which the object and the additional information are associated with each other; and transmitting the additional information thus acquired to a terminal apparatus with which the photographed image has been photographed.
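The additional information managing method summarized above can be sketched as a minimal server-side pipeline: acquire a photographed image, recognize an object, look it up in an object-to-information table, and return the result to the photographing terminal. The table contents and the exact-match "recognition" below are placeholders, not the embodiment's actual recognition.

```python
# Minimal sketch of the claimed method. All data here is illustrative.
OBJECT_TABLE = {"signboard_abc": "beer_garden_coupon"}  # object -> additional info

def recognize_object(feature_info):
    # Placeholder recognition: an exact feature-tuple match stands in for
    # real image recognition against registered feature information.
    registered = {("red", "rect"): "signboard_abc"}
    return registered.get(feature_info)

def manage_additional_information(photographed_image_features):
    object_id = recognize_object(photographed_image_features)
    if object_id is None:
        return None                       # nothing recognized, nothing sent
    additional_info = OBJECT_TABLE[object_id]
    # In the embodiment this reply is transmitted back to the terminal.
    return {"object": object_id, "additional_information": additional_info}

reply = manage_additional_information(("red", "rect"))
```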
- It should be noted that what are obtained by translating arbitrary combinations of the above constituent elements and expressions of the present disclosure among a method, an apparatus, a system, a computer program, a data structure, a recording medium, and so forth are also effective as embodiments of the present disclosure.
- As set forth hereinabove, according to the present disclosure, the additional information which is associated with the photographed object can be effectively provided.
- FIG. 1 is a block diagram showing a configuration of a terminal as a terminal apparatus according to a first embodiment of the present disclosure;
- FIG. 2 is a block diagram showing a configuration of a cloud server as an additional information managing apparatus according to a second embodiment of the present disclosure;
- FIG. 3 is a diagram explaining a record of an object which is stored in an object database shown in FIG. 2;
- FIG. 4A is a diagram explaining a record of additional information which is stored in an additional information database shown in FIG. 2;
- FIG. 4B is a diagram explaining a record in which an object ID and an additional information ID are made to correspond to each other;
- FIG. 5 is a diagram explaining a record in which a user ID/apparatus ID and an additional information ID, which are stored in a user database shown in FIG. 2, are made to correspond to each other;
- FIG. 6 is a sequence diagram explaining a flow of provision and registration of the additional information by the terminal shown in FIG. 1 and the cloud server shown in FIG. 2;
- FIG. 7A is a view explaining a photographed image displayed on a touch panel of a tablet terminal;
- FIG. 7B is a view explaining an object specified within the photographed image;
- FIG. 7C is a view explaining additional information superimposed on the object within the photographed image;
- FIG. 7D is a view explaining an example in which a coupon is displayed as the additional information in a position of the object within the photographed image;
- FIG. 8A is a view explaining an object specified within a photographed image;
- FIG. 8B is a view explaining an icon representing that a coupon is provided for an object within the photographed image;
- FIG. 8C is a view explaining a coupon which is displayed when the icon shown in FIG. 8B is selected; and
- FIG. 8D is a view explaining an example in which a plurality of coupons are displayed for the object within the photographed image.
- FIG. 1 is a block diagram showing a configuration of a terminal 100 as a terminal apparatus according to a first embodiment of the present disclosure. The terminal 100 is an electronic apparatus having a photographing function and a communication function. As an example, the terminal 100 includes a mobile phone, a smartphone, a tablet terminal, or a camera.
- A photographing portion 10, as an example, is a camera having a Charge Coupled Device (CCD) image sensor, and photographs a subject which is present in a reality space to thereby store information on a photographed image in an image storing portion 20. A display portion 28 displays the photographed image, the information on which is stored in the image storing portion 20, on a display device.
- A photographing position acquiring portion 12, as an example, has a position sensor such as a Global Positioning System (GPS) receiver. The photographing position acquiring portion 12 acquires information on a photographing position and supplies the information thus acquired to an additional information requesting portion 18. A photographing date-and-time acquiring portion 14, as an example, is a built-in clock. The photographing date-and-time acquiring portion 14 acquires information on the date and time of the photographing and supplies the information thus acquired to the additional information requesting portion 18. The photographing date-and-time acquiring portion 14 may acquire the information on the date and time of the photographing from the GPS receiver.
- A photographing direction acquiring portion 16, as an example, has any one of a terrestrial magnetism sensor, a gyro sensor, an acceleration sensor, and an angular acceleration sensor, or a combination thereof. The photographing direction acquiring portion 16 acquires information on a photographing direction of the camera built in the terminal 100 by detecting a direction or an inclination of the terminal 100, and supplies the information thus acquired to the additional information requesting portion 18.
- The additional information requesting portion 18 transmits the information on the photographed image which is stored in the image storing portion 20, together with an additional information request instruction, to a cloud server 200 through a communication portion 32.
- In addition, the additional information requesting portion 18 may transmit at least one of the information on the photographing position, the information on the date and time of the photographing, and the information on the photographing direction, together with the additional information request instruction, to the cloud server 200 through the communication portion 32.
- The communication portion 32, as an example, is connected to the network through wireless communication, and transmits data to and receives data from the cloud server 200.
- The cloud server 200 recognizes an image of a predetermined object from the photographed image. In order to more precisely specify the object, the cloud server 200 may use at least one of the information on the photographing position, the information on the date and time of the photographing, and the information on the photographing direction in combination with the photographed image. Alternatively, the cloud server 200 may specify an object which will be seen from a point-of-view position in a line-of-sight direction by combining the information on the photographing position with the information on the photographing direction, without using the photographed image.
- Examples of the object include a specific scene, building, signboard, and person. The information on the object is registered in advance in a database together with the feature information on the image of the object.
- The cloud server 200 acquires the additional information associated with the object the image of which is recognized, and transmits that additional information to the terminal 100.
- Examples of the additional information include a coupon which can be exchanged for a predetermined item, a digital item such as a character or a weapon which can be utilized in a game, and electronic money.
- An additional information acquiring portion 22 receives, from the cloud server 200 through the communication portion 32, the additional information associated with the object the image of which is recognized within the photographed image. The additional information acquiring portion 22 supplies the additional information which has been acquired from the cloud server 200 to an augmented reality creating portion 26.
- The augmented reality creating portion 26 executes processing for superimposing the additional information as augmented reality on the photographed image the information on which is stored in the image storing portion 20. The display portion 28 reads out the information on the photographed image on which the augmented reality is superimposed by the augmented reality creating portion 26 from the image storing portion 20, and displays the photographed image concerned on the display device of the display portion 28.
- A user interface portion 24 receives a user manipulation of a manipulation button of the terminal 100, or a user manipulation carried out by directly contacting a touch panel, and supplies the manipulation contents to the augmented reality creating portion 26. The augmented reality creating portion 26 changes the augmented reality in accordance with the user manipulation. The user can select the additional information superimposed on the photographed image through the user interface portion 24.
- When the user interface portion 24 has received an instruction to select the additional information from the user, the additional information acquiring portion 22 stores the additional information thus selected in an additional information storing portion 30. The additional information storing portion 30, as an example, may be an Integrated Circuit (IC) chip in which an IC for carrying out recording and arithmetic operation of data is incorporated. In this case, the additional information thus selected is associated with user identifying information or photographing apparatus identifying information and is encrypted with an encryption key to thereby be safely recorded in a security area within the IC chip. Recording the additional information in the IC chip can prevent forgery.
- When the additional information storing portion 30 does not have a security function like that of the IC chip, the user interface portion 24 transmits the instruction to select the additional information issued by the user to the cloud server 200 through the communication portion 32. The cloud server 200 associates the additional information thus selected with either the user identifying information or the photographing apparatus identifying information to thereby record the resulting additional information in a user database. Here, the user database is a secure database which is allowed to be accessed only when either the user authentication or the apparatus authentication succeeds by using the encryption key or the like. Thus, in the user database, the additional information is safely managed for every user or photographing apparatus.
- When the additional information storing portion 30 is the IC chip, the IC chip is held up over an IC card reader, whereby the additional information which has been safely kept in the IC chip is safely read out to the IC card reader. Since either the user authentication or the apparatus authentication is carried out by using the encryption key when the IC card reader reads out the additional information kept within the IC chip, the additional information is safely taken out from the IC chip. When the additional information is a coupon which can be exchanged for some item, the user can read out the coupon by using the IC card reader to thereby exchange the coupon thus read out for the item.
- The display portion 28 can display the additional information stored in the additional information storing portion 30 on the display device separately from the information on the photographed image in accordance with a request made through the user interface portion 24. As a result, the user can utilize the additional information by, for example, not displaying the additional information as augmented reality but displaying it as a coupon or a ticket on the display device of the display portion 28, bringing up the coupon or ticket at a store, or causing a reader to read the additional information.
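The forgery protection described for the IC chip is enforced in hardware, but the underlying idea of binding stored additional information to a user or apparatus ID with a secret key can be sketched in software. The use of HMAC-SHA256 and the key name below are illustrative assumptions, not the embodiment's actual mechanism.

```python
# Hypothetical sketch: bind a stored coupon to a user ID with a keyed MAC so
# that a tampered record is rejected on read-out. A real IC chip performs the
# equivalent check in its secure hardware.
import hashlib
import hmac
import json

DEVICE_KEY = b"per-device-secret"  # assumed provisioned secret key

def store_coupon(coupon, user_id):
    record = json.dumps({"coupon": coupon, "user": user_id}, sort_keys=True)
    tag = hmac.new(DEVICE_KEY, record.encode(), hashlib.sha256).hexdigest()
    return {"record": record, "tag": tag}

def read_coupon(entry):
    expected = hmac.new(DEVICE_KEY, entry["record"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, entry["tag"]):
        raise ValueError("forged or corrupted additional information")
    return json.loads(entry["record"])

entry = store_coupon("observatory_20_off", "user-42")
coupon = read_coupon(entry)  # succeeds only for an untampered record
```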
- FIG. 2 is a block diagram showing a configuration of the cloud server 200 as an additional information managing apparatus according to a second embodiment of the present disclosure. A photographed image acquiring portion 50, a photographing position acquiring portion 52, a photographing date-and-time acquiring portion 54, and a photographing direction acquiring portion 56 acquire the information on the photographed image, the information on the photographing position, the information on the date and time of the photographing, and the information on the photographing direction, respectively, from the terminal 100 through a communication portion 70, and supply these pieces of information to an image recognizing portion 58.
- The image recognizing portion 58 extracts feature information of the subject in the photographed image, and matches the feature information thus extracted against the feature information of the objects registered in an object database 64 to thereby specify the object.
- FIG. 3 is a diagram explaining a record 80 of the object which is stored in the object database 64. In this record 80, the feature information on the object, a thumbnail image of the object, the position information, the date-and-time information, and the direction information are made to correspond to an object ID.
- The position information is latitude and longitude information on the position where the object is present. The date-and-time information is used when a specific date and specific time are specified for the object. When even the same object is to be distinguished depending on a season or a time zone, the objects which differ in season or time zone are given different object IDs and handled as different objects.
- The direction information is used when the line-of-sight direction in which the object is looked at is specified. When even the same object is to be distinguished depending on the looking direction, the objects which differ in line-of-sight direction are given different object IDs and handled as different objects.
- The image recognizing portion 58 may combine at least one of the information on the photographing position, the information on the date and time of the photographing, and the information on the photographing direction with the feature information of the subject in the photographed image. Then, the image recognizing portion 58 may match the combination thus obtained against the records of the objects registered in the object database 64 to thereby specify the object. For example, the image recognizing portion 58 may specify, from the records of the object database 64, the object for which both the feature information of the subject in the photographed image and the information on the photographing position coincide. Alternatively, the image recognizing portion 58 may specify the object for which the information on the photographing position and the information on the photographing direction coincide, without using the feature information of the subject in the photographed image.
- In summary, although being merely examples, the conditions under which the image recognizing portion 58 specifies the object include:
- (1) the coincidence between the feature information on the subject in the photographed image and the feature information on the object registered in the database;
- (2) the coincidence between the feature information on the subject in the photographed image and the feature information on the object registered in the database, and the coincidence between the information on the photographing position and the information on the position of the object;
- (3) the coincidence between the feature information on the subject in the photographed image and the feature information on the object registered in the database, and the coincidence between the information on the photographing direction and the information on the line-of-sight direction of the object; and
- (4) the coincidence between the information on the photographing position and the information on the position of the object registered in the database, and the coincidence between the information on the photographing direction and the information on the line-of-sight direction of the object.
- In the case of the condition (1) described above, although the condition is simple, a third party who copies the photographed image can also get the coupon. When the information on the photographing position is added to the conditions, as in the condition (2), a third party who is not at the photographing place can be prevented from getting the coupon. If the condition (3) is used, the coupon can be provided only when the object is photographed from the specific photographing direction. In the case of the condition (4), the coupon is granted if the camera is pointed at the specified photographing place in the specified line-of-sight direction, without using the photographed image itself. Since the photographed image itself is not a condition, the transmission and image processing of the image data can be omitted, and the efficiency of the processing can be increased.
- The image recognizing portion 58 gives an additional information acquiring portion 60 the specified object ID. The additional information acquiring portion 60 acquires the additional information ID associated with the object ID by referring to an additional information database 66. In addition, the additional information acquiring portion 60 acquires the additional information associated with the additional information ID from the additional information database 66, and gives an additional information registering portion 62 the additional information thus acquired.
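The four example conditions can be sketched as a single dispatch. The coincidence predicates below are equality stand-ins; a real implementation would use similarity and distance thresholds, which are not specified in the embodiment.

```python
# Sketch of the example conditions (1)-(4) for specifying an object.
def features_match(a, b): return a == b    # stand-in for image matching
def positions_match(a, b): return a == b   # stand-in for GPS proximity
def directions_match(a, b): return a == b  # stand-in for angle tolerance

def object_specified(condition, shot, obj):
    if condition == 1:
        return features_match(shot["features"], obj["features"])
    if condition == 2:
        return (features_match(shot["features"], obj["features"])
                and positions_match(shot["position"], obj["position"]))
    if condition == 3:
        return (features_match(shot["features"], obj["features"])
                and directions_match(shot["direction"], obj["direction"]))
    if condition == 4:
        # No photographed image needed: position and direction alone decide.
        return (positions_match(shot["position"], obj["position"])
                and directions_match(shot["direction"], obj["direction"]))
    raise ValueError(condition)

obj = {"features": "tower", "position": (35.66, 139.75), "direction": "NE"}
# A copied image carries the right features but the wrong position/direction.
copied_image = {"features": "tower", "position": (48.86, 2.29), "direction": "SW"}
```

This makes the text's point concrete: condition (1) accepts the copied image, while condition (2) rejects it because the photographing position does not coincide.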
- FIG. 4A is a diagram explaining a record 82 of the additional information stored in the additional information database 66. Also, FIG. 4B is a diagram explaining a record 84 in which the object ID and the additional information ID are made to correspond to each other.
- As shown in FIG. 4A, the image, the text, and the sound which represent the contents of the additional information are associated with the additional information ID. Also, as shown in FIG. 4B, the additional information ID is made to correspond to the object ID. For example, the ID of additional information such as a coupon is made to correspond to the ID of an object such as a specific building or signboard.
- The additional information acquiring portion 60 acquires the additional information ID made to correspond to the object ID specified by the image recognizing portion 58 by referring to the additional information database 66. The additional information acquiring portion 60 further acquires the additional information such as the image, the text, and the sound associated with the additional information ID, and gives the additional information registering portion 62 the additional information thus acquired.
- The additional information acquiring portion 60 transmits the additional information thus acquired to the terminal 100 through the communication portion 70. The image recognizing portion 58, as may be necessary, transmits the thumbnail image of the object associated with the specified object ID to the terminal 100 through the communication portion 70. The terminal 100 displays the additional information acquired from the cloud server 200 on the display device of the display portion 28 with the additional information being superimposed on the photographed image. If necessary, the terminal 100 also displays the thumbnail image of the specified object together with the additional information on the display device of the display portion 28.
- In response to a request to register the additional information for the user from the communication portion 70, the additional information registering portion 62 registers the additional information ID acquired from the additional information acquiring portion 60 in a user database 68, with the additional information ID thus acquired being associated with either the user ID or the apparatus ID.
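The two look-ups of FIG. 4A and FIG. 4B chain naturally: object ID to additional information ID, then additional information ID to contents. The table contents and ID formats below are illustrative placeholders.

```python
# Sketch of the two-table look-up performed by the additional information
# acquiring portion 60. Table data is illustrative only.
OBJECT_TO_INFO = {"obj-001": "info-101"}                 # record 84 (FIG. 4B)
INFO_CONTENTS = {"info-101": {"image": "coupon.png",     # record 82 (FIG. 4A)
                              "text": "20% off observatory",
                              "sound": "chime.wav"}}

def acquire_additional_information(object_id):
    info_id = OBJECT_TO_INFO.get(object_id)
    if info_id is None:
        return None                  # no additional information for this object
    contents = INFO_CONTENTS[info_id]
    return {"id": info_id, **contents}

info = acquire_additional_information("obj-001")
```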
- FIG. 5 is a diagram explaining a record 86 in which the user ID/apparatus ID stored in the user database 68 and the additional information ID are made to correspond to each other. The record 86 is used to manage the acquired additional information IDs for each user or apparatus. Registering the record in the user database 68 in this manner prevents the same additional information from being acquired twice by the same user or on the same apparatus.
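The duplicate-prevention role of the record 86 amounts to a per-owner set membership check, which can be sketched as follows; the in-memory dictionary stands in for the user database 68.

```python
# Sketch of registering an acquired additional information ID under a user or
# apparatus ID, refusing a second acquisition of the same information.
user_db = {}  # user/apparatus ID -> set of acquired additional information IDs

def register_additional_information(owner_id, info_id):
    owned = user_db.setdefault(owner_id, set())
    if info_id in owned:
        return False  # already acquired once; refuse the duplicate
    owned.add(info_id)
    return True

first = register_additional_information("user-42", "info-101")
second = register_additional_information("user-42", "info-101")
```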
- FIG. 6 is a sequence diagram explaining the flow of provision and registration of the additional information by the terminal 100 of the first embodiment and the cloud server 200 of the second embodiment.
- In the terminal 100, the photographing portion 10 photographs the reality space (S10). The additional information requesting portion 18 transmits the additional information request together with the information on the photographed image to the cloud server 200 (S12).
- In the cloud server 200, the image recognizing portion 58 specifies the object from the photographed image based on image recognition (S14). The additional information acquiring portion 60 acquires the additional information associated with the object the image of which is recognized (S16). The additional information acquiring portion 60 transmits the additional information thus acquired to the terminal 100 (S18).
- In the terminal 100, the additional information acquiring portion 22 receives the additional information from the cloud server 200. The augmented reality creating portion 26 renders, as augmented reality, an indication that the additional information is present for the object in the photographed image, and displays the resulting image on the display device of the display portion 28 (S20). The user interface portion 24 receives from the user an instruction to select the additional information displayed in the photographed image (S22).
- When the function of the IC chip is mounted in the terminal 100 (Y in S24), the additional information selected through the user interface portion 24 is registered in the security area of the IC chip (S26).
- When no function of the IC chip is mounted in the terminal 100 (N in S24), the user interface portion 24 transmits the request to register the additional information selected through the user interface portion 24 to the cloud server 200 (S28). In the cloud server 200, the additional information registering portion 62 registers the additional information selected by the user in the user database 68 with the additional information being associated with either the user ID or the apparatus ID (S30).
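The branch at S24 in the sequence above is a simple capability check on the terminal. The dictionary "terminals" and return strings below are illustrative stand-ins for the IC chip's security area and the server registration request.

```python
# Sketch of the terminal-side branch at S24: store the selected coupon in the
# IC chip's secure area if present (S26), otherwise ask the cloud server to
# register it (S28).
def keep_selected_information(terminal, info_id):
    if terminal.get("has_ic_chip"):
        terminal["ic_chip_area"].append(info_id)   # S26: local secure storage
        return "stored_on_ic_chip"
    terminal["server_requests"].append(info_id)    # S28: remote registration
    return "registered_on_server"

with_chip = {"has_ic_chip": True, "ic_chip_area": [], "server_requests": []}
without_chip = {"has_ic_chip": False, "ic_chip_area": [], "server_requests": []}
r1 = keep_selected_information(with_chip, "coupon-555")
r2 = keep_selected_information(without_chip, "coupon-555")
```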
- FIGS. 7A to 7D are views each explaining an example in which the additional information is displayed as augmented reality on the object in the photographed image.
- A description will now be given using a tablet terminal 500 as an example of the terminal 100. FIG. 7A is a view explaining a photographed image which is displayed on a touch panel 510 of the tablet terminal 500. The image captured with the built-in camera of the tablet terminal 500 is displayed on the touch panel 510. The user manipulates a manipulation button 520 or directly touches the touch panel 510, and can thereby interact with the augmented reality which is being displayed on the touch panel 510.
- In this case, the user carries out the photographing by pointing the built-in camera of the tablet terminal 500 at a building having a signboard of "ABC beer" mounted thereon. The information on the photographed image is transmitted to the cloud server 200 by using the communication function of the tablet terminal 500.
- The cloud server 200 recognizes the image of the object from the image which has been captured with the tablet terminal 500. In this case, the information on the signboard 552 of the ABC beer is registered in the object database 64. Thus, the image recognizing portion 58 extracts the signboard 552 as the object from the photographed image. The cloud server 200 transmits the information which specifies the image area of the extracted object to the tablet terminal 500.
FIG. 7B is a view explaining the object specified within the photographed image. Using the information specifying the image area of the extracted object, the tablet terminal 500 highlights the object, for example by surrounding the signboard 552 with a frame 550 drawn as a dotted line, thereby informing the user that the signboard 552 has been extracted as the object. - In the
cloud server 200, the additional information acquiring portion 60 acquires the additional information associated with the extracted object from the additional information database 66 and transmits it to the tablet terminal 500. -
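The server-side lookup just described, recognize the object, then fetch the information tied to it, can be sketched as two table lookups. The dictionary databases and function names below are illustrative assumptions, not the patent's actual interfaces.

```python
# Toy stand-ins for object database 64 and additional information database 66.
OBJECT_DATABASE = {"abc_beer_signboard_features": "signboard_552"}
ADDITIONAL_INFO_DATABASE = {"signboard_552": "all-you-can-drink beer garden coupon"}


def recognize_object(photo_features):
    """Image recognizing portion: match photo features against registered objects.

    Returns the matched object ID, or None when nothing is registered for
    these features.
    """
    return OBJECT_DATABASE.get(photo_features)


def acquire_additional_info(object_id):
    """Additional information acquiring portion: fetch info tied to the object."""
    return ADDITIONAL_INFO_DATABASE.get(object_id)


obj = recognize_object("abc_beer_signboard_features")
print(obj)                          # the extracted object
print(acquire_additional_info(obj)) # the associated additional information
```

A real implementation would match on image features rather than exact keys, but the two-stage structure, recognition followed by an association lookup, is the essence of the flow.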
FIG. 7C is a view explaining the additional information superimposed on the object within the photographed image. In the tablet terminal 500, the augmented reality creating portion 26 creates an image showing a situation 554 in which a page is turned, so that the user understands that additional information is provided on the signboard 552 as augmented reality. -
FIG. 7D is a view explaining an example in which a coupon is displayed as the additional information at the position of the object within the photographed image. When the user interface portion 24 receives a tap on the signboard 552 from the user, the page is turned and a coupon 555 is displayed as the additional information at the position of the signboard 552. In this case, the coupon 555 can be exchanged for an all-you-can-drink beer service in a beer garden on the roof of this building. By selecting the coupon 555, the user can preserve its information in the secure area of the IC chip built into the tablet terminal 500, or can transmit it to the cloud server 200 to register it there in association with either the user ID or the apparatus ID. -
FIGS. 8A to 8D are views explaining further examples in which additional information is displayed, as augmented reality, on the object in the photographed image. - As shown in
FIG. 8A, a photographed image of a scene containing an image of the Tokyo Tower 532 is displayed on the touch panel 510 of the tablet terminal 500. The image recognizing portion 58 of the cloud server 200 extracts the image of the Tokyo Tower 532 as the object based on the feature information from the photographed image, and a frame 530 indicating that the image of the Tokyo Tower 532 has been extracted as the object is displayed on the touch panel 510. - The additional
information acquiring portion 60 of the cloud server 200 acquires the additional information (here, a coupon) associated with the extracted object. As shown in FIG. 8B, on the touch panel 510 of the tablet terminal 500, an icon 534 indicating that a coupon exists for the extracted object is displayed as augmented reality superimposed on the image of the Tokyo Tower 532. - When the
user interface portion 24 receives an instruction to select the icon 534 from the user, a coupon 540 is displayed within the photographed image, as shown in FIG. 8C. The coupon 540 is a premium ticket providing 20% OFF a hotel accommodation plan that includes a Tokyo Tower observatory ticket. When the user selects the coupon 540, its information is stored in the secure area within the IC chip. - In the
cloud server 200, when the image recognizing portion 58 matches the objects registered in the object database 64 against the subject of the photographed image based on the feature information, a plurality of objects may be found as candidates. For example, when the Eiffel Tower, whose features resemble those of the Tokyo Tower, is found as a candidate, the image recognizing portion 58 gives the additional information acquiring portion 60 both the image of the Tokyo Tower and the image of the Eiffel Tower as candidate objects. The additional information acquiring portion 60 acquires the additional information associated with each candidate from the additional information database 66 and transmits both pieces of additional information to the tablet terminal 500. In addition, when a plurality of candidates are present, the image recognizing portion 58 acquires thumbnail images of the candidate objects from the object database 64 and transmits them to the tablet terminal 500. - As shown in
FIG. 8D, the coupon 540 associated with the image of the Tokyo Tower and a coupon 542 associated with the Eiffel Tower are both displayed on the touch panel 510 of the tablet terminal 500. The coupon 542 associated with the image of the Eiffel Tower offers 10% OFF shopping in Paris. - At this time, a thumbnail image of each object is displayed in its coupon: the contents of the coupon 540 associated with the image of the Tokyo Tower are displayed together with a
thumbnail image 541 of the Tokyo Tower, and the contents of the coupon 542 associated with the image of the Eiffel Tower are displayed together with a thumbnail image 543 of the Eiffel Tower. - By looking at the thumbnail images displayed in the coupons, the user can judge whether the objects have been determined properly. In this example, the coupon showing the thumbnail image of the Tokyo Tower is proper, while the coupon showing the thumbnail image of the Eiffel Tower is a misjudgment.
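The multi-candidate flow above, return every plausible match together with its thumbnail and coupon so the user can judge, can be sketched as follows. The candidate records, threshold, and similarity function are all illustrative assumptions.

```python
# Toy candidate records standing in for object database 64 plus the
# associated entries in additional information database 66.
CANDIDATE_DB = [
    {"object": "Tokyo Tower", "thumbnail": "tokyo_tower_thumb.png",
     "coupon": "20% OFF hotel plan with observatory ticket"},
    {"object": "Eiffel Tower", "thumbnail": "eiffel_tower_thumb.png",
     "coupon": "10% OFF shopping in Paris"},
]


def match_candidates(feature, similarity, threshold=0.8):
    """Return every registered object whose similarity score passes the
    threshold, each with its thumbnail and coupon attached."""
    return [dict(c) for c in CANDIDATE_DB
            if similarity(feature, c["object"]) >= threshold]


# Toy similarity: to this matcher, both lattice towers look alike,
# which is exactly the ambiguous case described in the text.
sim = lambda feature, obj: 0.9

for cand in match_candidates("lattice tower", sim):
    print(cand["thumbnail"], "->", cand["coupon"])
```

Because both candidates clear the threshold, both coupons (with their thumbnails) travel to the terminal, and disambiguation is left to the user, or to the position-based narrowing discussed in the text.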
- When there are multiple candidate objects in this manner, image recognition alone may misrecognize the Tokyo Tower as the Eiffel Tower, providing the user with a Paris shopping coupon that is of no use to him/her. To cope with such a situation, the photographing
position acquiring portion 52 of the cloud server 200 may acquire information on the photographing position from the tablet terminal 500, and the image recognizing portion 58 may narrow down the candidate objects based on that position. In this example, since the photographing position is Tokyo, far from Paris, the Eiffel Tower can be weeded out from the candidates. - In addition, when a plurality of candidates are present, a configuration can be adopted in which a probability determined from the image matching or the position information is displayed as a numerical value in each coupon, the coupons are sorted in decreasing order of probability, or a low-probability coupon is displayed smaller, so that the user can easily select the proper coupon based on the probability.
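The position-based narrowing and probability sorting just described can be sketched with a great-circle distance check. The coordinates, probabilities, and the 100 km cutoff are illustrative assumptions, not values from the patent.

```python
import math


def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))


def narrow_by_position(candidates, photo_pos, max_km=100):
    """Weed out candidates registered far from the photographing position,
    then sort the survivors by match probability, highest first."""
    near = [c for c in candidates if haversine_km(photo_pos, c["pos"]) <= max_km]
    return sorted(near, key=lambda c: c["probability"], reverse=True)


candidates = [
    {"name": "Tokyo Tower", "pos": (35.66, 139.75), "probability": 0.7},
    {"name": "Eiffel Tower", "pos": (48.86, 2.29), "probability": 0.8},
]
tokyo = (35.68, 139.77)
print([c["name"] for c in narrow_by_position(candidates, tokyo)])
```

Note that even though the Eiffel Tower scored a higher image-matching probability in this toy data, the position filter removes it entirely, which is exactly the behavior the text motivates.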
- As another example, different coupons may be presented based on the date and time of the photographing. For instance, different coupons may be presented for the same object depending on the time of day: a Tokyo bus sightseeing coupon for a Tokyo Tower photographed in the early morning, and a dinner coupon for a Tokyo Tower photographed in the evening. Alternatively, using the date of the photographing, a coupon introducing famous cherry-blossom spots in the suburbs of Tokyo may be presented for a Tokyo Tower photographed in spring, and a coupon introducing famous autumn-foliage spots in the suburbs of Tokyo for one photographed in autumn.
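The date-and-time selection above amounts to a dispatch on the photographing timestamp. The sketch below paraphrases the examples in the text; the season months, the morning cutoff, and the precedence of season over time of day are assumptions made for illustration.

```python
from datetime import datetime


def coupon_for_tokyo_tower(shot_at: datetime) -> str:
    """Pick a coupon for the same object based on when it was photographed.

    Seasonal rules are checked first, then time-of-day rules (an assumed
    ordering; the text does not specify one).
    """
    if shot_at.month in (3, 4, 5):        # spring photograph
        return "cherry-blossom spots in the Tokyo suburbs"
    if shot_at.month in (9, 10, 11):      # autumn photograph
        return "autumn-leaves spots in the Tokyo suburbs"
    if shot_at.hour < 10:                 # early-morning photograph
        return "Tokyo bus sightseeing coupon"
    return "dinner coupon"                # evening / default


print(coupon_for_tokyo_tower(datetime(2014, 1, 15, 7)))   # winter, early morning
print(coupon_for_tokyo_tower(datetime(2014, 4, 5, 12)))   # spring afternoon
```

Server-side, such rules would live alongside the additional information database, so the same recognized object can map to different additional information per request.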
- As still another example, when a structure such as a tower, a bridge, or a building is lit up at night, different coupons may be presented for the same structure depending on the color or pattern of the illumination. For example, the kind or source of the coupon may differ with the illumination, such that photographing a bridge lit up in blue yields a coupon provided by one company, while photographing the same bridge lit up in rainbow colors yields a coupon provided by another company.
- The present disclosure has been described above based on the embodiments. Those skilled in the art will understand that the embodiments are merely exemplary, that various modifications can be made to combinations of their constituent elements or processes, and that such modifications also fall within the scope of the present disclosure.
- Although the above description gives examples in which the outside world is photographed with the mobile apparatus, the photographed subject may also be a television picture or a game picture. For example, when a specific scene or a commercial of a television program is photographed with the terminal 100 and information on the image is transmitted to the
cloud server 200, a coupon may be provided as the additional information from the cloud server 200 to the terminal 100. Likewise, when a picture of a specific stage or a specific character in a game is photographed with the terminal 100 and information on the image is transmitted to the cloud server 200, a digital item usable in the game may be provided as the additional information from the cloud server 200 to the terminal 100. - The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2013-229699 filed in the Japan Patent Office on Nov. 5, 2013, the entire content of which is hereby incorporated by reference.
Claims (11)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-229699 | 2013-11-05 | ||
JP2013229699A JP2015090553A (en) | 2013-11-05 | 2013-11-05 | Terminal apparatus, additional information management apparatus, and additional information management method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150124106A1 true US20150124106A1 (en) | 2015-05-07 |
US9558593B2 US9558593B2 (en) | 2017-01-31 |
Family
ID=53006765
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/514,558 Active 2035-04-28 US9558593B2 (en) | 2013-11-05 | 2014-10-15 | Terminal apparatus, additional information managing apparatus, additional information managing method, and program |
Country Status (2)
Country | Link |
---|---|
US (1) | US9558593B2 (en) |
JP (1) | JP2015090553A (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6674620B2 (en) * | 2015-03-31 | 2020-04-01 | 日本電気株式会社 | Information processing system, information processing system control method, smart device control method, and smart device control program |
CN107466477B (en) * | 2015-11-12 | 2021-08-10 | 松下电器(美国)知识产权公司 | Display method, computer-readable recording medium, and display device |
CN105338117B (en) * | 2015-11-27 | 2018-05-29 | 亮风台(上海)信息科技有限公司 | For generating AR applications and method, equipment and the system of AR examples being presented |
JP6436922B2 (en) * | 2016-01-19 | 2018-12-12 | ヤフー株式会社 | Generating device, generating method, and generating program |
TWI611307B (en) * | 2016-08-24 | 2018-01-11 | 李雨暹 | Method for establishing location-based space object, method for displaying space object, and application system thereof |
JP2018097581A (en) * | 2016-12-13 | 2018-06-21 | 富士ゼロックス株式会社 | Information processing device and program |
JP6523362B2 (en) * | 2017-03-30 | 2019-05-29 | Kddi株式会社 | Server device, terminal device and program |
TWI642002B (en) * | 2017-04-14 | 2018-11-21 | 李雨暹 | Method and system for managing viewability of location-based spatial object |
JP6891055B2 (en) * | 2017-06-28 | 2021-06-18 | キヤノン株式会社 | Image processing equipment, image processing methods, and programs |
JP6864593B2 (en) * | 2017-09-27 | 2021-04-28 | 株式会社Nttドコモ | Information processing device |
JP2019101783A (en) * | 2017-12-04 | 2019-06-24 | キヤノン株式会社 | Information processing device and method |
US10360454B1 (en) | 2017-12-28 | 2019-07-23 | Rovi Guides, Inc. | Systems and methods for presenting supplemental content in augmented reality |
CN108932632A (en) * | 2018-06-01 | 2018-12-04 | 北京市商汤科技开发有限公司 | Advertisement interactive approach and device, electronic equipment and storage medium |
JP7044665B2 (en) * | 2018-08-28 | 2022-03-30 | ヤフー株式会社 | Generator, generation method, and generation program |
JP2020064555A (en) * | 2018-10-19 | 2020-04-23 | 中国電力株式会社 | Patrol server, and patrol and inspection system |
CN109947964A (en) * | 2019-04-02 | 2019-06-28 | 北京字节跳动网络技术有限公司 | Method and apparatus for more new information |
KR20230048864A (en) * | 2021-10-05 | 2023-04-12 | 현대자동차주식회사 | User equipment and control method for the same |
Patent Citations (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060227992A1 (en) * | 2005-04-08 | 2006-10-12 | Rathus Spencer A | System and method for accessing electronic data via an image search engine |
US20080147730A1 (en) * | 2006-12-18 | 2008-06-19 | Motorola, Inc. | Method and system for providing location-specific image information |
US20080268876A1 (en) * | 2007-04-24 | 2008-10-30 | Natasha Gelfand | Method, Device, Mobile Terminal, and Computer Program Product for a Point of Interest Based Scheme for Improving Mobile Visual Searching Functionalities |
US20090061901A1 (en) * | 2007-09-04 | 2009-03-05 | Juha Arrasvuori | Personal augmented reality advertising |
US20090102859A1 (en) * | 2007-10-18 | 2009-04-23 | Yahoo! Inc. | User augmented reality for camera-enabled mobile devices |
US20090232354A1 (en) * | 2008-03-11 | 2009-09-17 | Sony Ericsson Mobile Communications Ab | Advertisement insertion systems and methods for digital cameras based on object recognition |
US20100260426A1 (en) * | 2009-04-14 | 2010-10-14 | Huang Joseph Jyh-Huei | Systems and methods for image recognition using mobile devices |
US20120122491A1 (en) * | 2009-07-30 | 2012-05-17 | Sk Planet Co., Ltd. | Method for providing augmented reality, server for same, and portable terminal |
US8301202B2 (en) * | 2009-08-27 | 2012-10-30 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US20110161875A1 (en) * | 2009-12-29 | 2011-06-30 | Nokia Corporation | Method and apparatus for decluttering a mapping display |
US20110199479A1 (en) * | 2010-02-12 | 2011-08-18 | Apple Inc. | Augmented reality maps |
US20110209201A1 (en) * | 2010-02-19 | 2011-08-25 | Nokia Corporation | Method and apparatus for accessing media content based on location |
US20120062595A1 (en) * | 2010-09-09 | 2012-03-15 | Pantech Co., Ltd. | Method and apparatus for providing augmented reality |
US20120096403A1 (en) * | 2010-10-18 | 2012-04-19 | Lg Electronics Inc. | Mobile terminal and method of managing object related information therein |
US8847987B2 (en) * | 2010-11-17 | 2014-09-30 | Samsung Electronics Co., Ltd. | System and method for controlling device |
US20120200743A1 (en) * | 2011-02-08 | 2012-08-09 | Autonomy Corporation Ltd | System to augment a visual data stream based on a combination of geographical and visual information |
US8392450B2 (en) * | 2011-02-08 | 2013-03-05 | Autonomy Corporation Ltd. | System to augment a visual data stream with user-specific content |
US20130033522A1 (en) * | 2011-03-08 | 2013-02-07 | Bank Of America Corporation | Prepopulating application forms using real-time video analysis of identified objects |
US20160163000A1 (en) * | 2011-03-08 | 2016-06-09 | Bank Of America Corporation | Populating budgets and/or wish lists using real-time video image analysis |
US20120310717A1 (en) * | 2011-05-31 | 2012-12-06 | Nokia Corporation | Method and apparatus for controlling a perspective display of advertisements using sensor data |
US8863004B2 (en) * | 2011-10-28 | 2014-10-14 | Navteq B.V. | Method and apparatus for increasing the functionality of a user device in a locked state |
US20130147837A1 (en) * | 2011-12-13 | 2013-06-13 | Matei Stroila | Augmented reality personalization |
US20130155107A1 (en) * | 2011-12-16 | 2013-06-20 | Identive Group, Inc. | Systems and Methods for Providing an Augmented Reality Experience |
US20130257900A1 (en) * | 2012-03-30 | 2013-10-03 | Nokia Corporation | Method and apparatus for storing augmented reality point-of-interest information |
US20130293580A1 (en) * | 2012-05-01 | 2013-11-07 | Zambala Lllp | System and method for selecting targets in an augmented reality environment |
US20130297407A1 (en) * | 2012-05-04 | 2013-11-07 | Research In Motion Limited | Interactive advertising on a mobile device |
US20140253743A1 (en) * | 2012-05-10 | 2014-09-11 | Hewlett-Packard Development Company, L.P. | User-generated content in a virtual reality environment |
US20140025481A1 (en) * | 2012-07-20 | 2014-01-23 | Lg Cns Co., Ltd. | Benefit promotion advertising in an augmented reality environment |
US20140079281A1 (en) * | 2012-09-17 | 2014-03-20 | Gravity Jack, Inc. | Augmented reality creation and consumption |
US9349350B2 (en) * | 2012-10-24 | 2016-05-24 | Lg Electronics Inc. | Method for providing contents along with virtual information and a digital device for the same |
US20140146082A1 (en) * | 2012-11-26 | 2014-05-29 | Ebay Inc. | Augmented reality information system |
US20140185871A1 (en) * | 2012-12-27 | 2014-07-03 | Sony Corporation | Information processing apparatus, content providing method, and computer program |
US20140304321A1 (en) * | 2013-04-08 | 2014-10-09 | Navteq B.V. | Desktop Application Synchronization to Process Data Captured on a Mobile Device |
US20150032838A1 (en) * | 2013-07-29 | 2015-01-29 | Aol Advertising Inc. | Systems and methods for caching augmented reality target data at user devices |
US20150058229A1 (en) * | 2013-08-23 | 2015-02-26 | Nantmobile, Llc | Recognition-based content management, systems and methods |
US20150317836A1 (en) * | 2014-05-05 | 2015-11-05 | Here Global B.V. | Method and apparatus for contextual query based on visual elements and user input in augmented reality at a device |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10242031B2 (en) * | 2014-09-23 | 2019-03-26 | Samsung Electronics Co., Ltd. | Method for providing virtual object and electronic device therefor |
US20160086381A1 (en) * | 2014-09-23 | 2016-03-24 | Samsung Electronics Co., Ltd. | Method for providing virtual object and electronic device therefor |
US11159687B2 (en) | 2016-05-06 | 2021-10-26 | Fujifilm Business Innovation Corp. | Information processing apparatus and non-transitory computer readable medium |
US11800026B2 (en) * | 2016-05-06 | 2023-10-24 | Fujifilm Business Innovation Corp. | Information processing apparatus and non-transitory computer readable medium |
US9986113B2 (en) | 2016-05-06 | 2018-05-29 | Fuji Xerox Co., Ltd. | Information processing apparatus and nontransitory computer readable medium |
US11558514B2 (en) | 2016-05-06 | 2023-01-17 | Fujifilm Business Innovation Corp. | Information processing apparatus and non-transitory computer readable medium |
US10362184B2 (en) | 2016-05-06 | 2019-07-23 | Fuji Xerox Co., Ltd. | Information processing apparatus and nontransitory computer readable medium |
US10375258B2 (en) | 2016-05-06 | 2019-08-06 | Fuji Xerox Co., Ltd. | Information processing apparatus and nontransitory computer readable medium |
US10382634B2 (en) | 2016-05-06 | 2019-08-13 | Fuji Xerox Co., Ltd. | Information processing apparatus and non-transitory computer readable medium configured to generate and change a display menu |
US10469682B2 (en) | 2016-05-06 | 2019-11-05 | Fuji Xerox Co., Ltd. | Information processing apparatus and nontransitory computer readable medium |
US10764452B2 (en) | 2016-05-06 | 2020-09-01 | Fuji Xerox Co., Ltd. | Information processing apparatus and non-transitory computer readable medium |
CN114237528A (en) * | 2016-05-06 | 2022-03-25 | 富士胶片商业创新有限公司 | Non-transitory computer-readable medium and information processing apparatus |
US11302082B2 (en) | 2016-05-23 | 2022-04-12 | tagSpace Pty Ltd | Media tags—location-anchored digital media for augmented reality and virtual reality environments |
US11967029B2 (en) | 2016-05-23 | 2024-04-23 | tagSpace Pty Ltd | Media tags—location-anchored digital media for augmented reality and virtual reality environments |
US10403044B2 (en) | 2016-07-26 | 2019-09-03 | tagSpace Pty Ltd | Telelocation: location sharing for users in augmented and virtual reality environments |
US10831334B2 (en) | 2016-08-26 | 2020-11-10 | tagSpace Pty Ltd | Teleportation links for mixed reality environments |
WO2018035564A1 (en) * | 2016-08-26 | 2018-03-01 | tagSpace Pty Ltd | Teleportation links for mixed reality environments |
US11070637B2 (en) | 2016-12-13 | 2021-07-20 | Advanced New Technologies Co., Ltd | Method and device for allocating augmented reality-based virtual objects |
US11290550B2 (en) | 2016-12-13 | 2022-03-29 | Advanced New Technologies Co., Ltd. | Method and device for allocating augmented reality-based virtual objects |
US10762382B2 (en) | 2017-01-11 | 2020-09-01 | Alibaba Group Holding Limited | Image recognition based on augmented reality |
US10825252B2 (en) * | 2018-03-30 | 2020-11-03 | Kabushiki Kaisha Square Enix | Information processing program, method, and system for sharing virtual process for real object in real world using augmented reality |
US11328490B2 (en) * | 2018-03-30 | 2022-05-10 | Kabushiki Kaisha Square Enix | Information processing program, method, and system for sharing virtual process for real object arranged in a real world using augmented reality |
CN108550190A (en) * | 2018-04-19 | 2018-09-18 | 腾讯科技(深圳)有限公司 | Augmented reality data processing method, device, computer equipment and storage medium |
EP4167577A4 (en) * | 2020-06-10 | 2023-09-27 | Sony Group Corporation | Information processing device, information processing method, imaging device, and image transfer system |
Also Published As
Publication number | Publication date |
---|---|
JP2015090553A (en) | 2015-05-11 |
US9558593B2 (en) | 2017-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9558593B2 (en) | Terminal apparatus, additional information managing apparatus, additional information managing method, and program | |
US11048941B2 (en) | Imaging device and information acquisition system in which an acquired image and associated information are held on a display | |
CN101093542B (en) | Inquiry system, imaging device, inquiry device, information processing method | |
US9479914B2 (en) | Intuitive computing methods and systems | |
US20060114338A1 (en) | Device and method for embedding and retrieving information in digital images | |
JP2012215989A (en) | Augmented reality display method | |
CN106027841A (en) | Portable information device, imaging apparatus and information acquisition system | |
KR20100138863A (en) | Providing method of augmented reality and personal contents corresponding to code in terminal with camera | |
JP2008027336A (en) | Location information delivery apparatus, camera, location information delivery method and program | |
US7796776B2 (en) | Digital image pickup device, display device, rights information server, digital image management system and method using the same | |
JP2022000795A (en) | Information management device | |
JP2024020650A (en) | Imaged data provision system | |
US9424361B2 (en) | Information communication method and information communication apparatus | |
CN111712807B (en) | Portable information terminal, information presentation system, and information presentation method | |
JP2020057267A (en) | Reservation slip print system and reservation slip generation system | |
CN111145189B (en) | Image processing method, apparatus, electronic device, and computer-readable storage medium | |
JP4940804B2 (en) | POSITION INFORMATION DISTRIBUTION METHOD, POSITION INFORMATION DISTRIBUTION DEVICE, AND POSITION INFORMATION DISTRIBUTION PROGRAM | |
US10997410B2 (en) | Information processing device and information processing system | |
US20200072625A1 (en) | Information processing device, information processing method, and recording medium | |
KR20150125776A (en) | Apparatus and method for providing message based object | |
KR101365637B1 (en) | Image capturing system | |
JP2006178804A (en) | Object information providing method and object information providing server | |
JP5814219B2 (en) | Information distribution system and information distribution program | |
JP7225759B2 (en) | Route search server, program, route search system and route search method | |
KR20160061629A (en) | Users sphere with the center of the map location-based services |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY COMPUTER ENTERTAINMENT INC., JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TOMITA, HISASHI;REEL/FRAME:033952/0490
Effective date: 20140625
|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONY INTERACTIVE ENTERTAINMENT INC.;REEL/FRAME:040604/0134
Effective date: 20161201

Owner name: SONY INTERACTIVE ENTERTAINMENT INC., JAPAN
Free format text: CHANGE OF NAME;ASSIGNOR:SONY COMPUTER ENTERTAINMENT INC.;REEL/FRAME:040604/0108
Effective date: 20160401
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: SONY INTERACTIVE ENTERTAINMENT INC., JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONY CORPORATION;REEL/FRAME:043761/0975
Effective date: 20170825
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 4
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8