CN102779153A - Information processing apparatus and information processing method - Google Patents

Information processing apparatus and information processing method

Info

Publication number
CN102779153A
CN102779153A CN2012101496221A CN201210149622A
Authority
CN
China
Prior art keywords
search
image
content
ordering
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012101496221A
Other languages
Chinese (zh)
Other versions
CN102779153B (en)
Inventor
久保拓也
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Publication of CN102779153A publication Critical patent/CN102779153A/en
Application granted granted Critical
Publication of CN102779153B publication Critical patent/CN102779153B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/5854Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using shape and object relationship
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/248Presentation of query results
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51Indexing; Data structures therefor; Storage structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to an information processing apparatus for performing a content search and to an information processing method. The information processing apparatus performs a search for content that matches a predetermined search condition and generates list data of identification information corresponding to the found content, wherein identification information corresponding to content newly found while the search is being performed is added to the list data generated so far. The apparatus also repeatedly sorts, at predetermined timing and in accordance with a sorting condition, the sequence of identification information included in the list data. From when a sort is performed by the sorting unit until the next sort is performed, identification information corresponding to newly found content is added to the list data without conforming to the sorting condition.

Description

Information processing apparatus and information processing method
Technical Field
The present invention relates to an information processing apparatus and an information processing method.
Background Art
When searching for searchable content such as images, there are techniques for displaying the search results in a predetermined sort order. When search results are displayed in sorted form, a commonly considered approach is to re-sort every time a piece of content matching the condition is found. According to Japanese Patent Laid-Open No. 2007-102549, a face region is extracted from each image, a feature amount is calculated for the face region, and the images for which face-image feature amounts have been calculated are sorted and displayed.
Summary of the invention
If the desired content is found while a search is still in progress, the user may want to select that content immediately and proceed to the next operation. However, if sorting is performed every time content matching the condition is found, the content the user is attempting to select while the search is in progress may move to a different position, and the wrong content may be selected. This is particularly likely when content is searched for over a network, or when a technique that calculates feature amounts of face images is used for the search, because each search takes time and the images may be sorted frequently while the search is in progress.
One embodiment of the present invention relates to an information processing apparatus for performing a content search, the information processing apparatus comprising: a search unit configured to search for content that meets a predetermined search condition; a generation unit configured to generate list data of identification information corresponding to the content found by the search unit, wherein, while the search unit is searching, the generation unit adds identification information corresponding to newly found content to the list data generated so far; and a sorting unit configured to repeatedly sort, at predetermined timing and in accordance with a sorting condition, the sequence of identification information included in the list data, wherein, from when the sorting unit performs a sort until the next sort is performed, the generation unit adds identification information corresponding to content newly found by the search unit to the list data without conforming to the sorting condition.
Another embodiment of the present invention relates to an information processing method for performing a content search, comprising: a search step of searching for content that meets a predetermined search condition; a generation step of generating list data of identification information corresponding to the content found in the search step, wherein, while the search is being performed in the search step, identification information corresponding to newly found content is added in the generation step to the list data generated so far; and a sorting step of repeatedly sorting, at predetermined timing and in accordance with a sorting condition, the sequence of identification information included in the list data, wherein, from when a sort is performed in the sorting step until the next sort is performed, identification information corresponding to content newly found in the search step is added to the list data in the generation step without conforming to the sorting condition.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Brief Description of the Drawings
Fig. 1 is a block diagram showing an example of the configuration of an information processing apparatus according to an embodiment of the present invention.
Fig. 2 is a flowchart showing an example of content search processing according to the embodiment of the present invention.
Fig. 3 is a diagram showing an example of the display of the main screen of an image management application according to the embodiment of the present invention.
Fig. 4 is a diagram showing an example of a settings screen in the case where a face search is performed in the image management application according to the embodiment of the present invention.
Figs. 5A to 5C are diagrams for describing the search results before and after sorting in the case where a face search is performed in the image management application according to the embodiment of the present invention.
Figs. 6A and 6B are diagrams for describing an example of the display in the case where content in the search results is selected while a face search is being performed in the image management application according to the embodiment of the present invention.
Fig. 7 is a flowchart showing an example of processing for temporarily suspending the sorting of the search results in the content search processing according to the embodiment of the present invention.
Fig. 8 is a flowchart showing an example of processing in the content search according to the embodiment of the present invention in which sorting is performed when a predetermined time has elapsed since a user operation was last performed.
Description of the Embodiments
Embodiments of the present invention are described below. The present embodiment is described taking as an example the case where a face search is performed using an image management application. However, the scope of application of the present invention is not limited to face searches related to images. The present invention is not limited to images and can be applied to other content such as document data, moving image data, and audio data, and there is no particular restriction on the data format. Furthermore, similarity determination is not limited to determination using a face image as the search key; similarity determination may use text, an image, audio, or any other content that can serve as a search key.
In the present invention, the fact that the similarity or degree of association with the information serving as the search key is greater than or equal to a particular value is used as the content search condition. The present invention can therefore be applied to any content for which a determination regarding this search condition can be made. Note that, to simplify the description, an example of an image management application based on face-image search is described below.
Fig. 1 is a diagram showing the system configuration of the present embodiment. An image management application corresponding to the present embodiment is installed in an information processing apparatus 100. The information processing apparatus is implemented as, for example, a personal computer (PC). However, it is not limited to a PC, and the information processing apparatus may be any other information processing apparatus capable of searching for content based on a search key, such as a mobile phone, a digital camera, a digital video camera, a smartphone, a media player, or a game terminal.
A CPU 101 is a processing unit that controls the operations performed by the information processing apparatus 100. Note that, in the present embodiment, the CPU 101 can function as a search processing unit for searching for content and as a sorting processing unit for sorting the search results. A display control unit 102 is a display controller for causing a display unit 105 to display the results of the content search processing corresponding to the present embodiment. A storage device 103 is constituted by a RAM or the like, stores the programs executed by the CPU 101 and the data used in the processing, and serves as a work area for the CPU 101. A secondary storage device 104 is constituted by a hard disk or the like and stores the programs run by the CPU 101; the image management application is among these programs. The content to be searched is also stored in the secondary storage device 104. The programs in the secondary storage device 104 are read out to the storage device 103 and executed by the CPU 101. The display unit 105 is a display device such as an LCD. An operation unit 106 is constituted by a keyboard, a mouse, and the like, and accepts input operations from the user.
Note that, although the information processing apparatus 100 can be used as a standalone apparatus, the information processing apparatus 100 may also serve as a server connected to client devices via a network. When used as a server, the server is a search server that accepts the designation of a search condition from a client device and searches a content database managed by the server. This content database corresponds to the secondary storage device 104. The content database managed by the server may also be managed at a different location, in a storage device that can be accessed via the network. Furthermore, the content to be searched does not need to be concentrated in a single storage device and may be recorded in a manner distributed over a plurality of storage devices.
The search results are transmitted to the client device via the network, and the search results are displayed to the user on the client device side. In this case, the display on the client device side corresponds to the display unit 105, and the display control unit 102 and the display unit 105 are connected via the network (the Internet or the like). In particular, the display control unit 102 also serves as a communication control unit for controlling network communication. Moreover, because user operations are accepted via the network, the functions corresponding to the operation unit 106 are also performed on the client device side. Although the following description takes as an embodiment the case where the information processing apparatus 100 operates as a standalone apparatus, the information processing apparatus 100 can operate in the same manner when serving as a server on the above assumptions.
When the search is performed by a server, the present invention also includes, for example, the case where the server creates a list of reduced images or a list of identification information corresponding to the found content and transmits it to the client. Furthermore, when the search is performed by a server, the present invention also includes the case where the actually found content, or identification information corresponding to the found content, is transmitted to the client as needed and the list data or the list of reduced images is created by the client.
The processing performed by the information processing apparatus 100 using the image management application corresponding to the present embodiment is described below. First, Fig. 3 shows the main screen displayed when the information processing apparatus 100 runs the image management application. As one of its functions, the image management application shown in Fig. 3 can perform a face search. Although the detailed procedure of the face search will be described later, basically, when a face image serving as a reference is selected, a search is performed for images determined to contain the same person as the person in the selected face image.
For example, if the user selects a face image of a specific person "Hanako" and performs a face search, images determined to contain Hanako are displayed as the search results by the image management application. The image management application has a face region extraction function and a similarity calculation function. In the face region extraction function, a face region is extracted by extracting local feature elements of a face from an image and generating arrangement information. The arrangement information also serves as a feature amount, and in the similarity calculation function, the similarity is calculated by comparing the feature amounts of the reference image and a target image. If the calculated similarity is greater than or equal to a predetermined threshold, the image is displayed in the search results as a similar image. Note that the technique used for the face search may be a known technique, and because this technique is not an essential technical feature of the present invention, a detailed description thereof is not given.
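The matching test just described can be illustrated with a short sketch. This is a minimal illustration only: extract_face_feature and similarity are hypothetical stand-ins for the application's face region extraction and similarity calculation functions, which the patent treats as known techniques and does not specify.

```python
# Hedged sketch of the matching test: a candidate image counts as a search hit
# when the similarity between its face feature amount and that of the reference
# image reaches the threshold. extract_face_feature() and similarity() are
# hypothetical stand-ins, not functions disclosed by the patent.
def is_similar_face(reference_image, candidate_image, threshold: float = 70.0) -> bool:
    """Return True when the candidate image should appear in the search results."""
    ref_feature = extract_face_feature(reference_image)    # arrangement of local face features
    cand_feature = extract_face_feature(candidate_image)
    if cand_feature is None:                                # no face region was extracted
        return False
    score = similarity(ref_feature, cand_feature)           # 0 to 100 in this embodiment
    return score >= threshold                               # threshold Th1 of step S205
```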
Note that, for similarity determination in searches other than the face search, a technique of extracting information corresponding to the search key from the content to be searched and determining whether the search condition is satisfied can be used.
In Fig. 3, a folder selection area 301 is used to select a folder managed by the image management application. A thumbnail display area 303 is used to display the images in the folder selected by the user in the folder selection area 301. In the situation shown in Fig. 3, a folder 302 has been selected, and the images in the folder 302 are displayed. A menu 304 is used to display menu items that can be selected by the user; in the situation shown in Fig. 3, "face search" is displayed as a selectable candidate. A face search can be performed by selecting "face search" in this menu.
Note that conceivable examples of user-selectable options other than "face search" include "subject search", "color search" and "keyword search". Here, "subject search" refers to processing that recognizes the type of subject appearing in an image and searches for images containing a subject similar to the recognized subject. "Color search" refers to processing that searches for images having the same or a similar representative color or average color, where, for example, the most frequent color (representative color) in an image is computed, or an average color is obtained by averaging the color values. "Keyword search" refers to processing that accepts text information as the search condition and determines whether an image corresponds to the text information, based on information that can be extracted from the image itself or on attribute information attached to the image. The keyword may be any information, such as a color, a subject name or a shooting location. Whichever of these is selected, processing similar to that described below can be performed.
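As an aside, the color statistics mentioned for the "color search" can be sketched as follows. This is a minimal sketch under stated assumptions: the helper names and the use of Pillow and NumPy are illustrative choices, not part of the patent's disclosure.

```python
# Hedged sketch of the two color statistics a "color search" could rely on:
# an average color and a most-frequent (representative) color per image.
from PIL import Image
import numpy as np

def average_color(path: str) -> tuple:
    """Average of all pixel values (the 'average color' variant)."""
    pixels = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64).reshape(-1, 3)
    return tuple(int(c) for c in pixels.mean(axis=0))

def representative_color(path: str, bins: int = 32) -> tuple:
    """Most frequent coarsely quantized color (the 'representative color' variant)."""
    pixels = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
    step = 256 // bins
    keys, counts = np.unique(pixels // step, axis=0, return_counts=True)
    return tuple(int(c) for c in keys[counts.argmax()] * step)
```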
Fig. 4 shows an example of the display of the image management application when "face search" is selected from the menu 304. When the menu 304 is selected, a settings panel 401 is displayed. Controls for configuring the face search are provided in the settings panel 401. A selection area 402 is used to display, in a selectable manner, the face images that can serve as the reference. In the present embodiment, a list of the face images managed by the image management application is displayed in the selection area 402. The method for registering face images in the image management application may be a general image registration method, and a description thereof is therefore not given.
The user selects, from the face image list, the face image to serve as the search key, whereby a search can be performed for images determined to contain the same person as the person in the selected face image. The search result images are displayed in a search result display area 406. Before sorting is performed, the images are displayed in the order in which they were found as search results. A list box 404 is a control for selecting the sort order to be used when displaying the search results. The search result images are sorted and displayed according to the sort order selected using the list box 404. In the situation shown in Fig. 4, the sort order is set to "most similar"; therefore, when the images are sorted, they are displayed in descending order of the similarity obtained during the face search. In other words, the images are sorted and displayed in descending order of similarity to the reference face image 403 selected by the user. Conceivable examples of items that can be selected in the list box 404 include most recent shooting date, and file name in ascending or descending order. A search button 405 is a button for accepting an instruction to start the search, and the user operates the search button 405 to start the search after selecting the reference face image and the sort order.
Next, the face search processing corresponding to the present embodiment is described with reference to Fig. 2. Fig. 2 is a flowchart corresponding to an example of the face search processing. The processing corresponding to this flowchart is realized by the CPU 101 executing the image management application.
If the user operates the search button 405, the face image search is performed according to the search flow shown in Fig. 2. The following description takes as an example the case where the user selects the reference image 403 and selects "most similar" in the list box 404 as the sort order. When the search button 405 is pressed, in step S201 the CPU 101 sets a variable n held internally by the image management application to 1. Here, n is an index with respect to the total number N of search target images and can take a value from 1 to N. In other words, the total number of search target images is N, and search processing is currently being performed on the n-th of the N images. The search target images in the present embodiment are the images in the folder selected by the user in the image management application. In other words, all the images in the folder 302 selected in Fig. 3 are search targets, and the total number of these images is N. In this description, the image 307 in Fig. 3 is assumed to be the image for n = 1.
In step S202, the CPU 101 obtains the current time and substitutes it into a variable t1. The current time obtained here is used in a later step to determine the timing at which the images are sorted. The current time obtained here is assumed to be obtained as the time elapsed since the image management application was started, and is assumed to be a value such as 1234.567 seconds.
Then, in step S203, the CPU 101 determines whether a face appears in the image currently undergoing search processing. If no face appears in the image, the face search is not performed; therefore, in the case where no face appears, the process proceeds to step S207. For example, in the case of a scenery image in which no face appears, such as the images 305 and 306, the process proceeds to step S207. The determination of whether a face appears in an image is performed by the above-described face region extraction function of the image management application. Because a face appears in the image 307, the process proceeds to step S204.
In step S204, the CPU 101 calculates the similarity between the reference face image and the face image currently undergoing search processing. Specifically, the similarity between the reference image 403 selected by the user and the image 307, which is the first image, is calculated. The calculation of the similarity between two images is performed by the above-described similarity calculation function of the image management application. In the present embodiment, the similarity can take a value from 0 to 100, and the higher the value, the closer the face image is to the reference image.
Then, in step S205, the CPU 101 determines whether the similarity calculated in step S204 is greater than or equal to a predetermined threshold (Th1). For example, if the threshold Th1 is set to 70, only images with a similarity of 70 or more are displayed as search results. In other words, when the threshold is high, the number of search results is small, but the images displayed as search results are more similar to the reference image. Conversely, when the threshold is low, the number of search results is larger, but images that are not very similar to the reference image are also displayed as search results. This threshold may be a fixed value predetermined by the image management application, or the user may set the threshold to an arbitrary value. If the similarity is greater than or equal to the threshold Th1, the process proceeds to step S206.
In step S206, the CPU 101 displays the image currently undergoing search processing as a search result. The search result images are displayed in the search result display area 406 shown in Fig. 4. If the similarity calculated in step S204 is less than the threshold Th1, the process proceeds from step S205 to step S207, and the image undergoing search processing is therefore not displayed as a search result. Here, because the similarity between the reference face image 403 and the image 307 currently undergoing search processing is 91 and the threshold used in step S205 is 70, the image 307 is displayed as a search result. Then, in step S207, the CPU 101 adds 1 to the value of the variable n. As mentioned above, n represents the index of the image undergoing search processing, and adding 1 to the value of n sets the next image as the search target. In the present embodiment, the images among the total of N search target images are selected and processed in order.
Then, in step S208, the CPU 101 determines whether the value of the variable n is greater than the total number N of search target images. If the value of n is greater than N, the search processing has been completed for all the images, so the process proceeds to step S212. If the value of n is less than or equal to N, images that have not yet undergone search processing remain, and the process therefore proceeds to step S209. In step S209, the CPU 101 obtains the current time and substitutes it into a variable t2. As above, the current time obtained here is assumed to be the time elapsed since the image management application was started, and is assumed to be a value such as 1235.678 seconds.
Then, in step S210, the CPU 101 calculates the difference between the obtained t2 and t1, and determines whether this difference is greater than or equal to a specific time T corresponding to the sorting interval. For example, if the sorting interval is 7 seconds, the difference between t2 and t1 calculated above is 1.111 seconds, so the value of T is larger. In this case, the process proceeds to step S203, and search processing is performed on the next image. In this way, the search results are displayed in order, and the images are not sorted until the predetermined interval T has elapsed. Specifically, if the sorting interval is 7 seconds, no sorting is performed within those 7 seconds.
After the process proceeds from step S210 to step S203, the image search is performed in the same manner as in the above-described flow. For example, in the case where search processing is performed on the second image, the process proceeds to step S209, the current time obtained in step S209 is 1236.789 seconds, and the difference between t2 and t1 is 2.222 seconds. However, this difference is less than the 7-second sorting interval, and in this case the process proceeds to step S203 again. As the image search is repeated in this way, when the difference between t2 and t1 in step S210 becomes greater than or equal to the 7-second sorting interval, the process proceeds to step S211. In step S211, the CPU 101 sorts the search results.
The sorting of the search results is described below with reference to Figs. 5A to 5C. Fig. 5A shows the image management application immediately before sorting, and Fig. 5B shows the image management application immediately after sorting. In Fig. 5A, a list of five images 502 to 506 is displayed in a search result display area 501. The images 502 to 506 displayed as search results are reduced images corresponding to the found content and can be regarded as identification information corresponding to the found content. This identification information does not have to be a reduced image, and may be a file name, a symbol, an icon image, or a cropped image obtained by cutting out a part of the content. The identification information may also be the title of the content, time information, or an annotation. A combination of these pieces of identification information may also be used.
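One conceivable way to hold such identification information together with the found content is sketched below; the field names are assumptions for illustration, since the patent only requires that each found content item have some form of identification information.

```python
# Hedged sketch of one possible record for an item of the list data. The patent
# allows a thumbnail, file name, icon, cropped image, title, time information,
# annotation, or any combination; the fields below are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FoundItem:
    content_path: str                  # the found content itself
    similarity: float                  # similarity to the reference image (0 to 100)
    thumbnail: Optional[bytes] = None  # reduced image, if one was generated
    title: Optional[str] = None        # content title, if available
    annotation: Optional[str] = None   # user annotation, if any
```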
Note that, as mentioned above, the images 502 to 506 are displayed in the order in which they were found. Fig. 5C shows a table 520 that gives, in table form, the list data of similarities between the reference image and the images 502 to 506. The table 520 shows that the image with the highest similarity is the image 504. Accordingly, when sorting is performed, the image 504 is displayed at the sorting position with the highest priority. In other words, in Fig. 5B, the image with the highest similarity is displayed at the position indicated by 507 (the upper left of the screen). The image with the second highest similarity in Fig. 5C is the image 502; accordingly, the image 502 is displayed at the position indicated by 508 in Fig. 5B. In this way, after sorting, the images are displayed at the positions of the images 507 to 511 in Fig. 5B. Images found thereafter are displayed from the position of an image 512 onward.
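The sort of step S211 for the "most similar" order can then be sketched in one line; this assumes the hypothetical FoundItem record from the previous sketch and is not the patent's own code.

```python
# Hedged sketch of the step S211 sort when the sort order is "most similar":
# the list data is reordered by descending similarity. Items found after this
# sort are simply appended until the next sort, as described in the text.
def sort_results(list_data: list) -> list:
    """Reorder the list data so that the most similar FoundItem comes first."""
    return sorted(list_data, key=lambda item: item.similarity, reverse=True)
```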
Note that, in Fig. 5B, a remaining time display area 513 is used to display information representing the remaining time until the display content of the screen is updated, and a pause button 514 is used to give an instruction to suspend the sorting. In the present embodiment, the remaining time is displayed both by an indicator representing the remaining time and as the remaining time itself.
As described above, when the images are sorted, the images displayed as the current search results are displayed according to the sort order. When the sorting of the images in step S211 ends, the process proceeds to step S202. As mentioned above, in step S202, the current time is obtained and substituted into the variable t1. In this way, after the images are sorted, the process proceeds to step S202, and the reference time is thereby reset. The above steps are then repeated. Newly found images are displayed by being appended to the sorted search results. After the reference time is reset in step S202, if the difference between t2 and t1 in step S210 is again greater than or equal to the sorting interval T, the sorting is performed again in step S211. After the search processing has been completed for all the images, the process proceeds to step S212. In step S212, the CPU 101 sorts the search result images. This sorting is performed because, if the search were to end in a state where images added after the sorting in step S211 are displayed as search results, part of the images would remain unsorted. Accordingly, after the search processing has been performed on all the images, the images are sorted in step S212. The above is the series of processing for performing the image search and sorting.
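The overall Fig. 2 flow can be summarized in a short sketch. This is a minimal sketch under stated assumptions, not the patent's implementation: detect_face, face_similarity and display are hypothetical stand-ins for the application's face region extraction, similarity calculation and display update.

```python
# Hedged sketch of the Fig. 2 flow: search the N target images in order, append
# matches to the results as they are found, and re-sort only when the sorting
# interval T has elapsed since the last sort (steps S201 to S212).
import time

def face_search(images, reference, th1=70.0, sort_interval=7.0):
    results = []                                 # list data of found items (S201)
    t1 = time.monotonic()                        # reference time for sorting (S202)
    for image in images:                         # n = 1 .. N
        if detect_face(image):                                  # S203
            score = face_similarity(reference, image)           # S204
            if score >= th1:                                    # S205
                results.append((score, image))                  # S206: shown in found order
                display(results)
        t2 = time.monotonic()                                   # S209
        if t2 - t1 >= sort_interval:                            # S210
            results.sort(key=lambda r: r[0], reverse=True)      # S211
            display(results)
            t1 = time.monotonic()                               # back to S202: reset reference
    results.sort(key=lambda r: r[0], reverse=True)              # S212: final sort
    display(results)
    return results
```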
According to the above processing, the search result images are not sorted and their display state is maintained until the specific time has elapsed; therefore, in the case where the user finds the desired content while the search is in progress, the user can easily select that content.
Next, the processing in the case where the user selects an image displayed in the search result display area while the search is in progress is described. Fig. 6A shows a state in which the user has selected an image 602 from among the images displayed in a search result display area 601. By allowing a search result image to be selected even while the search is in progress, the user can select a desired image as soon as it is found and can then perform the next user operation. Content can thus be selected during the search by selecting an image via a mouse operation.
In the example shown in Fig. 6A, as shown in the remaining time display area 513, 3 seconds remain until the next sorting is performed; it is assumed here that 3 seconds then elapse without any new search results being found. In this case, the five images displayed in the search result display area 601 are sorted, and according to the above-described flow shown in Fig. 2, the images are sorted here according to their similarity. However, the fact that the user selected an image while the search was in progress indicates that the selected image is an image the user wants. Accordingly, the image selected by the user during the search is displayed at the sorting position with the highest priority. Fig. 6B shows the state of the image management application after sorting. Here, the images are displayed in a search result display area 603 in order of descending display priority. The image 602 selected in Fig. 6A is displayed at the position 604, the sorting position with the highest priority. After the image 604, the unselected images are sorted and displayed according to the sort order. In the situation shown in Fig. 6B, among the images other than the image 604, an image 605 has the highest similarity, and the remaining images 606 to 608 follow in descending order of similarity. Note that, if the user selects a different image in the state shown in Fig. 6B, the newly selected image is displayed at the sorting position with the highest priority when the next sorting is performed. The previously selected image is then sorted and displayed according to its similarity, in the same manner as the other images.
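The modified sort used when the user has made a selection can be sketched as follows, again using the hypothetical (similarity, image) representation of the earlier sketch rather than anything disclosed by the patent.

```python
# Hedged sketch: the user-selected image is given the highest sorting priority,
# and the remaining images follow in descending order of similarity.
def sort_with_selection(results, selected_image):
    """Place the selected image first, then sort the rest by similarity."""
    return sorted(
        results,
        key=lambda r: (r[1] is not selected_image, -r[0]),  # selected first, then most similar
    )
```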
Next, processing for suspending the image sorting that is performed at the specific intervals is described. For example, in the case where the user wants to concentrate on examining the search result images while an image search is being performed, it is conceivable that the user will want to suspend the sorting while the image search continues. Accordingly, as shown by the button 514 in Figs. 5A and 5B, a button for suspending the image sorting is provided. Pressing this button suspends the image sorting performed at the specific intervals. Note that, as described above, the image search continues even while the sorting is suspended. The sorting can also be resumed after it has been suspended. While the sorting is suspended, the button 514 becomes a resume button, and the image sorting can be resumed by pressing the resume button 514. The flow of the image search in the case where the sorting is suspended is now described with reference to Fig. 7. Note that the search flow shown in Fig. 7 is based on the search flow shown in Fig. 2, and the following description focuses on the differences from the flow shown in Fig. 2. The processing corresponding to the flowchart shown in Fig. 7 is realized by the CPU 101 executing the image management application.
In step S701, the CPU 101 sets the variable n held internally by the image management application to 1, and sets a variable t3 to 0. As mentioned above, n is an index with respect to the total number N of search target images. Further, t3 is a variable used when determining whether to perform the sorting; in a later step, the sorting is performed when the value of t3 is greater than or equal to the sorting interval T. Because the processing of steps S702 to S708 is the same as the processing in the search flow shown in Fig. 2, it is not described again. In step S709, the CPU 101 determines whether the image sorting is currently suspended. If the pause button 514 shown in Fig. 5B has not been pressed, the process proceeds to step S710.
In step S710, the CPU 101 obtains the current time and substitutes it into the variable t2. Then, in step S711, the CPU 101 adds the value obtained by subtracting t1 from t2 to the value of t3, and substitutes the result into t3. For example, consider the case where the variable t1 is 1234.567 seconds and the variable t2 is 1235.678 seconds. Note that the value 0 was substituted into the variable t3 in step S701. As a result of performing the above calculation, the value of t3 is 1.111 seconds. In the search flow shown in Fig. 2, whether to sort is determined simply by calculating the difference between t2 and t1, but in the case shown in Fig. 7, the difference between t2 and t1 is calculated and the obtained value is added to t3. Then, in step S712, the CPU 101 determines whether the value of t3 is greater than or equal to the sorting interval T. For example, if the sorting interval T is 7 seconds and t3 is 1.111 seconds, the value of t3 is less than T, so the process proceeds to step S702, and search processing is performed on the next image. After the process proceeds to step S702, the processing of steps S702 to S708 is performed in the same manner as the search processing performed on the first image.
Then, in step S709, whether the image sorting is currently suspended is determined. If the image sorting is not currently suspended, the processing is the same as in the above flow; consider, therefore, the case where the user presses the pause button after the search processing on the first image ends, so that the sorting is currently suspended. In this case, instead of proceeding to step S710, the process immediately proceeds to step S702. In other words, when the sorting is currently suspended, the processing of steps S710 to S712 is omitted; the value of the variable t3 therefore does not increase, and no sorting is performed.
In this way, whether the sorting is currently suspended is determined in step S709, thereby allowing the image sorting to be suspended when the pause button is pressed. If the user presses the resume button while the sorting is suspended, the processing of steps S710 to S712 is performed again, so the value of the variable t3 increases once more. In this case, the value obtained by the addition performed before the sorting was suspended has been substituted into the variable t3; specifically, in the above example, 1.111 seconds was substituted into the variable t3. In other words, the value of the variable t3 at the time the sorting was suspended is retained. Then, as the search continues, when the value of the variable t3 becomes greater than or equal to the sorting interval T in step S712, the process proceeds to step S713, where the image sorting is performed. After the image sorting has been performed, the variable t3 is reset by substituting the value 0 into it. Therefore, after the suspension is cancelled, the sorting is performed again once the specific time has elapsed in total, counting the period from the last sort until the suspension instruction together with the period after the suspension is cancelled. Note that, as a variation of this processing, the sorting may be performed immediately after the suspension is cancelled.
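The pause-aware timing of Fig. 7 can be sketched as a small helper; SortTimer is a hypothetical name and this is a minimal sketch of the accumulation behavior described above, not the patent's implementation.

```python
# Hedged sketch of steps S709 to S713: elapsed time is accumulated into t3 only
# while sorting is not suspended, and a sort becomes due once the accumulated
# value reaches the sorting interval T.
import time

class SortTimer:
    def __init__(self, interval: float):
        self.interval = interval       # sorting interval T
        self.t3 = 0.0                  # accumulated elapsed time (S701)
        self.t1 = time.monotonic()     # reference time
        self.paused = False            # state of the pause/resume button 514

    def toggle_pause(self):
        if not self.paused:
            self.paused = True                  # t3 keeps its value while paused
        else:
            self.paused = False
            self.t1 = time.monotonic()          # resume accumulating from now

    def should_sort(self) -> bool:
        """Called once per searched image; True when a sort is due."""
        if self.paused:
            return False                        # S710 to S712 are skipped while paused
        t2 = time.monotonic()                   # S710
        self.t3 += t2 - self.t1                 # S711
        self.t1 = t2
        if self.t3 >= self.interval:            # S712
            self.t3 = 0.0                       # reset after the sort (S713)
            return True
        return False
```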
In this way, the image search is repeated, and after the search processing has been completed for all the images, the process proceeds to step S715. In step S715, the CPU 101 determines, as in step S709, whether the image sorting is currently suspended. If the image sorting is suspended at the point in time when all the images have been searched, the search ends without the sorting being performed. In the search flow shown in Fig. 2, the image sorting is performed even when the search ends, but in the flow shown in Fig. 7, if the pause button has been pressed, no sorting is performed even when the search ends. If the pause button has not been pressed when the search ends, the process proceeds to step S716, and the image sorting is performed in the same manner as in the flow shown in Fig. 2.
According to the above processing, the display state of the search result images is maintained in response to the suspension instruction; therefore, in the case where the user finds the desired content while the search is in progress, the user can easily select that content.
Next, processing for displaying, while a content search is being performed, the remaining time until the next sorting is performed is described with reference to Fig. 6A. For example, when the user attempts to select content in the search results during the search, if the content is selected immediately before the sorting, it is conceivable that the sorting will be performed in the middle of the user's mouse operation or the like, and the image the user is trying to select will move to a different position. Accordingly, the remaining time until the next sorting is displayed during the content search, which gives the user an indication of the time during which content can be selected. Two examples of methods for displaying the remaining time are described below. The first is a method of directly displaying the remaining time as a character string, as shown in the area 513 in Fig. 6A. In the example shown in Fig. 6A, a remaining time of 3 seconds is displayed, which makes clear that the sorting will be performed 3 seconds later. In this case, the remaining time until the next sorting is displayed in units of 1 second. Although the remaining time could also be displayed, for example, in units of 0.001 seconds, the character string would then need to be updated in units of 0.001 seconds, and such a display could be bothersome to the user. In consideration of this, the remaining time is displayed here in units of 1 second.
The second method of displaying the remaining time is a method of displaying icons, as shown in the same area 513, and controlling them according to the remaining time. Of the 7 lamps in the example shown in Fig. 6A, the four lamps on the left are lit and the three lamps on the right are not lit. Here, the number of lit lamps indicates that 4 seconds have elapsed since the last image sorting, and the number of unlit lamps indicates that 3 seconds remain until the next sorting is performed.
In other words, none of the lamps are lit when the search starts, and then one lamp is lit every second, starting from the left. When all the lamps are lit, the content is sorted. After the sorting, all the lamps are turned off, and the lighting of the lamps is repeated according to the same flow. In this way, the remaining time until the next sorting is represented by the number of lit or unlit lamps, which enables the user to intuitively grasp the remaining time until the next sorting.
Note that, although one lamp is lit every second in this case because the sorting interval is 7 seconds and the number of lamps is 7, the interval at which the lamps are lit can be determined according to the sorting interval and the number of lamps. For example, if the sorting interval is 20 seconds and the number of lamps is 5, one lamp may be lit every 4 seconds.
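The lamp display can be reduced to a small calculation; the function below is a hypothetical illustration of the relationship just described, not code from the patent.

```python
# Hedged sketch of the lamp indicator: the lighting interval is the sorting
# interval divided by the number of lamps, and the number of lit lamps grows
# with the time elapsed since the last sort.
def lit_lamp_count(elapsed_since_sort: float,
                   sort_interval: float = 7.0,
                   lamp_count: int = 7) -> int:
    """Number of lamps to light, e.g. 4 lit (3 unlit) 4 seconds into a 7-second interval."""
    per_lamp = sort_interval / lamp_count               # 1 second per lamp for 7 s and 7 lamps
    return min(lamp_count, int(elapsed_since_sort // per_lamp))
```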
Next, processing in which, while the search is in progress, the image sorting is performed each time a predetermined time has elapsed since the user stopped performing user operations is described with reference to Fig. 8. For example, in the case where the user finds an image to concentrate on and examine while the search is in progress, it is likely that the user will attempt to select the displayed image by performing a mouse or keyboard operation. If the images are sorted while such a user operation is being performed, it is conceivable that the image the user is attempting to select will move to a different position and the wrong image will be selected. Accordingly, the above problem can be avoided by sorting the images only in the case where the predetermined time has elapsed since the user stopped performing user operations.
The following is a specific description of the search flow shown in Fig. 8. Note that the search flow shown in Fig. 8 is based on the image search flow shown in Fig. 2, and because only the processing of step S810 differs from the flow shown in Fig. 2, the following description focuses on the processing of step S810. Because the processing of steps S801 to S809 is the same as the processing in the search flow shown in Fig. 2, it is not described again. In step S810, the CPU 101 determines whether a user operation was performed between t1 and t2. Note that the user operation referred to here is an operation performed on the operation unit 106 of the information processing apparatus 100 that is performing the search; examples of such operations include mouse operations and keyboard operations. One example of a method for making this determination is as follows: if a user operation is performed between t1 and t2, the image management application sets an internal flag, and in step S810, whether a user operation was performed is determined by referring to this flag.
Here, if no user operation was performed between t1 and t2, the process proceeds to step S811. From step S811 onward, as in the above-described search flow shown in Fig. 2, the difference between the variables t2 and t1 is calculated, and whether the calculated difference is greater than or equal to the sorting interval T is determined. Note that the time interval T may be the same value as the time interval T in step S210 shown in Fig. 2, or may be a different value. Then, if the difference between the variables t2 and t1 is greater than or equal to the sorting interval T, the process proceeds to step S812, where the image sorting is performed; if the difference between the variables t2 and t1 is less than the sorting interval T, the process proceeds to step S803, and search processing is performed on the next image.
Here, suppose that no user operation is performed before the search processing on the first image ends, and that a user operation is then performed while the search processing is being performed on the second image. In this case, after the search processing on the second image ends, the process proceeds from step S810 to step S802. After the process proceeds to step S802, the current time is substituted into the variable t1, thereby resetting the reference time used to determine the timing of the sorting. The process then proceeds to step S803, and search processing is performed on the next image. Thereafter, the search is continued by repeating the above flow, and the image sorting is performed in the case where no user operation is performed between t1 and t2. Then, after the search processing has been performed on all the images, the image sorting is performed in step S813.
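The step S810 check can be sketched as follows. This is a minimal sketch under stated assumptions: user_operated_since stands in for the internal flag described above and is not part of the patent's disclosure.

```python
# Hedged sketch of the Fig. 8 variation: the reference time t1 is reset whenever
# a user operation occurred between t1 and t2, so a sort only happens once the
# sorting interval T has passed without any user operation.
import time

def maybe_sort(results, t1, sort_interval=7.0):
    """Return (possibly sorted results, new reference time t1)."""
    t2 = time.monotonic()                                        # S809
    if user_operated_since(t1):                                  # S810: mouse/keyboard flag
        return results, time.monotonic()                         # back to S802: reset t1, no sort
    if t2 - t1 >= sort_interval:                                 # S811
        results = sorted(results, key=lambda r: r[0], reverse=True)   # S812
        return results, time.monotonic()                         # reference time restarts
    return results, t1                                           # continue searching (S803)
```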
According to the above processing, if the user has operated the operation unit 106, the display state of the search result images can be maintained; therefore, in the case where the user finds the desired content while the search is in progress, the user can easily select that content. Note that the processing flows corresponding to the flowcharts of Figs. 2, 7 and 8 may each be realized separately, or any combination of these processing flows may be realized.
Although the present invention has been described above based on embodiments, the present invention is not intended to be limited to these specific embodiments, and various embodiments that do not depart from the spirit of the present invention are also included in the present invention. Parts of the above-described embodiments may be combined as appropriate.
Other Embodiments
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiments, and by a method the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiments. For this purpose, the program is, for example, provided to the computer via a network or from a recording medium of various types serving as the memory device (for example, a computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (7)

1. An information processing apparatus for performing a content search, the information processing apparatus comprising:
a search unit configured to search for content that meets a predetermined search condition;
a generation unit configured to generate list data of identification information corresponding to the content found by the search unit, wherein, while the search unit is searching, the generation unit adds identification information corresponding to newly found content to the list data generated so far; and
a sorting unit configured to repeatedly sort, at predetermined timing and in accordance with a sorting condition, the sequence of identification information included in the list data,
wherein, from when the sorting unit performs a sort until the next sort is performed, the generation unit adds identification information corresponding to content newly found by the search unit to the list data without conforming to the sorting condition.
2. The information processing apparatus according to claim 1, wherein, in a case where a selection of any identification information in the list data is received from a user of the information processing apparatus, the sorting unit raises, in the sorting, the priority of the selected identification information in the list data to a priority higher than that of the other content.
3. The information processing apparatus according to claim 1 or 2, wherein, in a case where an instruction to stop the sorting performed by the sorting unit is received from a user of the information processing apparatus, the sorting unit stops the sorting until a cancellation of the stop instruction is further received, and the generation unit adds identification information corresponding to content newly found by the search unit after the stop instruction is received to the list data that had been generated when the stop instruction was received.
4. The information processing apparatus according to claim 1 or 2, wherein the generation unit generates, together with the list data, display data representing the time at which the sorting unit will perform the next sorting process.
5. The information processing apparatus according to claim 1 or 2, further comprising:
a storage unit configured to store the content; and
a display unit configured to display the list data.
6. The information processing apparatus according to claim 1 or 2, wherein
the content is image data,
the search unit searches, as the search condition, for image data similar to reference image data, and
the sorting unit performs the sorting according to the similarity of the similar image data.
7. An information processing method for performing a content search, comprising:
a search step of searching for content that meets a predetermined search condition;
a generation step of generating list data of identification information corresponding to the content found in the search step, wherein, while the search is being performed in the search step, identification information corresponding to newly found content is added in the generation step to the list data generated so far; and
a sorting step of repeatedly sorting, at predetermined timing and in accordance with a sorting condition, the sequence of identification information included in the list data,
wherein, from when a sort is performed in the sorting step until the next sort is performed, identification information corresponding to content newly found in the search step is added to the list data in the generation step without conforming to the sorting condition.
CN201210149622.1A 2011-05-13 2012-05-14 Information processing apparatus and information processing method Expired - Fee Related CN102779153B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011108732A JP5693369B2 (en) 2011-05-13 2011-05-13 Information processing apparatus, control method thereof, and computer program
JP2011-108732 2011-05-13

Publications (2)

Publication Number Publication Date
CN102779153A true CN102779153A (en) 2012-11-14
CN102779153B CN102779153B (en) 2015-11-25

Family

ID=47124065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210149622.1A Expired - Fee Related CN102779153B (en) Information processing apparatus and information processing method

Country Status (3)

Country Link
US (2) US20120290589A1 (en)
JP (1) JP5693369B2 (en)
CN (1) CN102779153B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106933855A (en) * 2015-12-30 2017-07-07 阿里巴巴集团控股有限公司 Object order method, apparatus and system
CN107221107A (en) * 2014-01-08 2017-09-29 东芝泰格有限公司 Information processor and its control method

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120118383A (en) * 2011-04-18 2012-10-26 삼성전자주식회사 Image compensation device, image processing apparatus and methods thereof
JP2014139734A (en) * 2013-01-21 2014-07-31 Sony Corp Information processing device and method, and program
KR102099400B1 (en) * 2013-06-20 2020-04-09 삼성전자주식회사 Apparatus and method for displaying an image in a portable terminal
CN103514254A (en) * 2013-07-04 2014-01-15 李文博 Image set ordering method for mining hidden operation behavior
JP5767413B1 (en) * 2014-03-18 2015-08-19 楽天株式会社 Information processing system, information processing method, and information processing program
EP3112986B1 (en) 2015-07-03 2020-02-26 Nokia Technologies Oy Content browsing
WO2017026172A1 (en) * 2015-08-10 2017-02-16 日本電気株式会社 Display processing device and display processing method
US9971847B2 (en) * 2016-01-07 2018-05-15 International Business Machines Corporation Automating browser tab groupings based on the similarity of facial features in images
JP6686770B2 (en) * 2016-07-28 2020-04-22 富士ゼロックス株式会社 Information processing device and program
CN106649069B (en) * 2016-12-28 2019-12-13 Tcl集团股份有限公司 User behavior statistical method and system
JP7419790B2 (en) * 2019-12-18 2024-01-23 大日本印刷株式会社 Rename processing equipment and print sales system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030227468A1 (en) * 2002-06-07 2003-12-11 Mayumi Takeda Image processing apparatus, image processing method and program
US20090249200A1 (en) * 2008-03-31 2009-10-01 Brother Kogyo Kabushiki Kaisha Image processor
CN101617299A (en) * 2007-01-05 2009-12-30 索尼爱立信移动通讯股份有限公司 Data base management method
US20100217995A1 (en) * 2009-02-23 2010-08-26 International Business Machines Corporation Data structure, computer system, method and computer program for searching database

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004062916A (en) * 2002-06-05 2004-02-26 Sony Corp Information processor and its method, recording medium, and program
US6988098B2 (en) * 2003-04-24 2006-01-17 Microsoft Corporation Grid data processing systems and methods
JP2008102594A (en) * 2006-10-17 2008-05-01 Fujitsu Ltd Content retrieval method and retrieval device
US8423536B2 (en) * 2008-08-05 2013-04-16 Yellowpages.Com Llc Systems and methods to sort information related to entities having different locations
US8572025B2 (en) * 2008-12-23 2013-10-29 Tau Cygnus, Llc Data management system for portable media devices and other display devices
US8185526B2 (en) * 2010-01-21 2012-05-22 Microsoft Corporation Dynamic keyword suggestion and image-search re-ranking
US20110191336A1 (en) * 2010-01-29 2011-08-04 Microsoft Corporation Contextual image search

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030227468A1 (en) * 2002-06-07 2003-12-11 Mayumi Takeda Image processing apparatus, image processing method and program
CN101617299A (en) * 2007-01-05 2009-12-30 索尼爱立信移动通讯股份有限公司 Data base management method
US20090249200A1 (en) * 2008-03-31 2009-10-01 Brother Kogyo Kabushiki Kaisha Image processor
US20100217995A1 (en) * 2009-02-23 2010-08-26 International Business Machines Corporation Data structure, computer system, method and computer program for searching database

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107221107A (en) * 2014-01-08 2017-09-29 东芝泰格有限公司 Information processor and its control method
CN107221107B (en) * 2014-01-08 2020-01-31 东芝泰格有限公司 Information processing apparatus and control method thereof
CN106933855A (en) * 2015-12-30 2017-07-07 阿里巴巴集团控股有限公司 Object order method, apparatus and system
CN106933855B (en) * 2015-12-30 2020-06-23 阿里巴巴集团控股有限公司 Object sorting method, device and system

Also Published As

Publication number Publication date
JP2012242854A (en) 2012-12-10
CN102779153B (en) 2015-11-25
US20120290589A1 (en) 2012-11-15
US20160179881A1 (en) 2016-06-23
JP5693369B2 (en) 2015-04-01

Similar Documents

Publication Publication Date Title
CN102779153A (en) Information processing apparatus and information processing method
JP5358083B2 (en) Person image search device and image search device
US10073861B2 (en) Story albums
JP4328757B2 (en) PROGRAM SELECTION DEVICE AND PROGRAM SELECTION DEVICE CONTROL METHOD
US9928397B2 (en) Method for identifying a target object in a video file
EP2613549A1 (en) Display apparatus, remote control apparatus, and searching methods therof
EP4206887A1 (en) Multimedia object arrangement method and apparatus, electronic device, and storage medium
US20090327891A1 (en) Method, apparatus and computer program product for providing a media content selection mechanism
CN104881451A (en) Image searching method and image searching device
CN106407358B (en) Image searching method and device and mobile terminal
CN112818141A (en) Searching method and device
EP3413219A1 (en) Search method and device
US9224069B2 (en) Program, method and apparatus for accumulating images that have associated text information
JP2015032905A (en) Information processing device, information processing method, and program
TWI695275B (en) Search method, electronic device and computer-readable recording medium
KR20150101846A (en) Image classification service system based on a sketch user equipment, service equipment, service method based on sketch and computer readable medium having computer program recorded therefor
JP2006217046A (en) Video index image generator and generation program
CN109756759B (en) Bullet screen information recommendation method and device
JP2010244425A (en) Information processing apparatus and method, program, and storage medium
JP2016122413A (en) Image processing apparatus, control method of image processing apparatus, and program
JP6704680B2 (en) Display device, information processing program, and information processing method
CN113297416B (en) Video data storage method, apparatus, electronic device, and readable storage medium
JP6865537B2 (en) Information processing equipment, information processing methods, and programs
JP2006313497A (en) Apparatus and method for retrieving image
CN114245174A (en) Video preview method and related equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20151125