US8266529B2 - Information processing device and display information editing method of information processing device - Google Patents

Information processing device and display information editing method of information processing device

Info

Publication number
US8266529B2
US8266529B2 (application US12/463,693)
Authority
US
United States
Prior art keywords
display
processing
contact
region
range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US12/463,693
Other versions
US20090287999A1 (en)
Inventor
Noriko OOI
Keiichi Murakami
Kentaro Endo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NTT Docomo Inc
Original Assignee
NTT Docomo Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NTT Docomo Inc filed Critical NTT Docomo Inc
Assigned to NTT DOCOMO, INC. Assignment of assignors interest; assignors: ENDO, KENTARO; MURAKAMI, KEIICHI; OOI, NORIKO
Publication of US20090287999A1
Application granted
Publication of US8266529B2

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0338 Fingerprint track pad, i.e. fingerprint sensor used as pointing device tracking the fingertip image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • This invention relates to an information processing device, and to a display information editing method of such an information processing device.
  • Portable telephone terminals, PDAs (Personal Digital Assistants), and other information processing devices comprise a keyboard, mouse, operation buttons, or other operation portions, and a display or other display portion, and can perform text/image information processing in which text information is displayed on a display portion according to information input by an operation portion, or text information and image information or similar, stored in advance, are displayed on a display portion according to operation by an operation portion.
  • In Japanese Patent Application Laid-open No. 2000-311040, an information processing device of this type is described.
  • This information processing device can be used to specify a display information range by operating a cursor on the display portion by means of a mouse, and then, with the cursor within the specified range, performing a clicking operation with the mouse to perform copying of the display information.
  • Using this information processing device, only a range specification operation and a single click operation are performed, and the user need only move the mouse slightly between these two operations, so that the ease of operation of display information copying can be improved.
  • Further, in Japanese Patent Application Laid-open No. 2000-311040, the possibility of applying a touch panel as the operation portion of the information processing device is also described.
  • Hence an object of this invention is to provide an information processing device and a display information editing method of an information processing device enabling improved ease of operation for a plurality of display information editing functions.
  • An information processing device of this invention comprises (a) display means for displaying information; (b) operation means, having a contact region corresponding to a display region of the display means, for detecting a contact operation in the contact region and outputting a position signal indicating the position of the contact operation in the contact region; (c) range decision means for deciding a target range of the display region, based on position signals for two points from the operation means; (d) processing menu display means for causing the display means to display a processing menu for selection of various processing, at a position corresponding to the target range of the display region decided by the range decision means; (e) processing decision means for deciding processing of the processing menu corresponding to the position of contact operation in the contact region, based on the position signal for a third point from the operation means; and (f) processing execution means for executing the processing decided by the processing decision means on the target range of the display region decided by the range decision means.
  • A display information editing method of an information processing device of this invention is a display information editing method of an information processing device which has display means for displaying information and operation means having a contact region corresponding to a display region of the display means, in which (a) a contact operation in the contact region is detected, and a position signal corresponding to the position of the contact operation in the contact region is generated; (b) a target range of the display region is decided based on the position signals for two points; (c) the display means is caused to display a processing menu for selection of various processing, at a position corresponding to the decided target range of the display region; (d) processing of the processing menu corresponding to the position of contact operation in the contact region is decided, based on the position signal for a third point; and (e) the decided processing is executed on the decided target range of the display region.
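  • For illustration only, steps (a) to (e) above can be modeled as a short program. The following Python sketch is a hypothetical rendering of that flow; the data model, the menu layout, and all identifiers are assumptions made for this example, not details taken from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Range:
    start: tuple  # (row, col) starting point in the display region
    end: tuple    # (row, col) ending point

def decide_range(p_a, p_b):
    # (b) The target range spans the two contact positions.
    return Range(start=min(p_a, p_b), end=max(p_a, p_b))

def menu_items_near(rng):
    # (c) Lay out menu items just below the target range; each item is
    # (label, (top, left, bottom, right)) in display coordinates.
    top = rng.end[0] + 1
    return [("enlarge/reduce", (top, 0, top, 40)),
            ("copy", (top, 41, top, 80))]

def decide_processing(menu, p_c):
    # (d) Hit-test the third contact position against the menu items.
    for label, (t, l, b, r) in menu:
        if t <= p_c[0] <= b and l <= p_c[1] <= r:
            return label
    return None

# (a)-(e) end to end: two contact points choose the range, a third picks
# the processing to execute on that range.
rng = decide_range((2, 5), (6, 30))
menu = menu_items_near(rng)
print(decide_processing(menu, (7, 20)))  # -> enlarge/reduce
```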
  • By this invention, the processing menu display means causes the display means to display a processing menu for selection of various processing at a position corresponding to the display region target range decided by the range decision means, so that the user need only perform a range specification operation using two fingers and a processing specification operation using a third finger. Hence the ease of operation of a plurality of display information editing functions can be improved.
  • processing menu display means cause the display means to display the processing menu at a position set in advance for the target range of the display region.
  • the ease of operation is different for the right hand and for the left hand. And, the ease of operation is different depending on the two fingers used for range specification and the third finger used for processing specification.
  • the processing menu be displayed in the center of the portion below the specified range, or displayed to the right thereof.
  • the processing menu M be displayed in the center of the portion above the specified range.
  • the above-described information processing device further comprise identification means for identifying whether a hand performing the contact operation in the contact region is the right hand or the left hand, and for identifying two fingers of the contact operation, and that the above-described processing menu display means cause the display means to display the processing menu at a position relative to the target range of the display region associated with the combination of the hand and the two fingers, based on the hand and the two fingers identified by the identification means.
  • range specification was performed by the right hand or the left hand is identified by the identification means, and the two fingers of the range specification are identified; for example, when range specification is performed by the little finger and index finger of the left hand, by displaying the processing menu on the right side of the portion below the specified range, the ease of operation by the thumb of the right hand can be improved. And, when for example range specification is performed by the little finger and thumb of the right hand, by displaying the processing menu in the center of the portion above the specified range, ease of operation by the index finger of the right hand can be improved. Hence ease of operation can be improved without depending on the hand and fingers used for contact operations. Also, a user can perform operations with one hand, so that ease of operation can be further improved.
  • the above-described range decision means re-decides the target range of the display region.
  • the above-described range decision means decide the target range of the display region only when position signals for two points are received simultaneously from the operation means.
  • an information input operation due to single-point contact and a range specification operation due to simultaneous two-point contact can be discriminated, so that display information editing can be caused to be performed even during an information input operation.
  • FIG. 1 is a perspective view showing the configuration of the information processing device of a first aspect of the invention;
  • FIG. 2 is a block diagram showing the configuration of the information processing device of the first aspect of the invention;
  • FIG. 3 shows range decision processing by the range decision portion of FIG. 2;
  • FIG. 4 shows processing menu display processing by the processing menu display portion, range adjustment processing by the range decision portion, and editing processing decision processing by the processing decision portion of FIG. 2;
  • FIG. 5 shows editing processing execution processing by the processing execution portion of FIG. 2;
  • FIG. 6 is a flowchart showing the display information editing method of the information processing device of the first aspect of the invention;
  • FIG. 7 is a block diagram showing the configuration of the information processing device of a second aspect of the invention;
  • FIG. 8 shows the range decision processing of the range decision portion of FIG. 7;
  • FIG. 9 shows the processing menu display processing of the processing menu display portion, the range adjustment processing of the range decision portion, and the editing processing decision processing of the processing decision portion of FIG. 7;
  • FIG. 10 shows the editing processing execution processing of the processing execution portion of FIG. 7;
  • FIG. 11 is a flowchart showing the display information editing method of the information processing device of the second aspect of the invention;
  • FIG. 12 is a block diagram showing the configuration of the information processing device of a modified example of the invention;
  • FIG. 13 is a perspective view showing, partially exploded, the configuration of the information processing device of a modified example of the invention;
  • FIG. 14 shows a table in which the pressure intensities and contact areas of the fingers of the right and left hands are associated with the fingers of the hands;
  • FIG. 15 shows the positional relationship of contact operations;
  • FIG. 16 shows the distribution of the contact area and pressure intensity of contact operations;
  • FIG. 17 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
  • FIG. 18 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
  • FIG. 19 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
  • FIG. 20 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
  • FIG. 21 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
  • FIG. 22 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
  • FIG. 23 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
  • FIG. 24 is a block diagram showing the configuration of the information processing device of a modified example of the invention; and
  • FIG. 25 is a perspective view showing, partially exploded, the configuration of the information processing device of a modified example of the invention.
  • FIG. 1 is a perspective view showing the configuration of the information processing device of a first aspect of the invention
  • FIG. 2 is a block diagram showing the configuration of the information processing device of the first aspect of the invention.
  • A portable telephone terminal is shown as an example of an information processing device in FIG. 1 and FIG. 2.
  • The information processing device 1 of FIG. 1 and FIG. 2 comprises a display portion 10, an operation portion 20, and a control portion 30.
  • The control portion 30 is accommodated within a housing which accommodates the display portion 10, or within a housing which accommodates the operation portion 20.
  • The display portion 10 is for example a liquid crystal display. A plurality of pixels are arranged in the row and column directions in the display portion 10, and the region in which these pixels are arranged is the display region 10 a.
  • The display portion 10 receives text data, image data, and other information from the control portion 30, and displays this information in the display region 10 a. For example, based on information from the control portion 30, the display portion 10 causes the pixels corresponding to the row and column coordinates (addresses) representing this information to emit light in colors representing the information.
  • The operation portion 20 is for example an electrostatic pad, having a column electrode layer, a row electrode layer, and a dielectric layer arranged between the column electrode layer and the row electrode layer.
  • In the column electrode layer, a plurality of column electrodes extending in the column direction are arranged in parallel in the row direction.
  • In the row electrode layer, a plurality of row electrodes extending in the row direction are arranged in parallel in the column direction.
  • The region in which these column electrodes and row electrodes are arranged is the contact region 20 a.
  • The operation portion 20 detects contact operations in the contact region 20 a by the user, and outputs position signals to the control portion 30 indicating the positions of the contact operations in the contact region 20 a.
  • Portions at which the column electrodes and row electrodes intersect each comprise a capacitor, and through the change in the electrostatic capacitance of each capacitor due to a contact operation by the user, the potentials on the column electrodes and row electrodes of the capacitors change.
  • The operation portion 20 detects changes in the electrostatic capacitances of the capacitors, that is, changes in the potentials of the column electrodes and row electrodes, and thereby detects contact operations by the user on the contact region 20 a.
  • The operation portion 20 outputs to the control portion 30, as position signals, the row and column coordinates of the portions of intersection of the column electrodes and row electrodes whose potentials have changed.
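  • As a minimal sketch of this detection step, assuming a normalized potential-change measurement per electrode intersection and an illustrative threshold, the scan can be modeled as follows.

```python
# Any intersection whose potential change exceeds the threshold is
# emitted as a position signal (row, column). The threshold value and
# the normalized units are assumptions for this sketch.
THRESHOLD = 0.5

def scan_position_signals(potential_change):
    """potential_change[r][c] is the measured change at intersection (r, c)."""
    signals = []
    for r, row in enumerate(potential_change):
        for c, delta in enumerate(row):
            if delta > THRESHOLD:
                signals.append((r, c))
    return signals

grid = [[0.0, 0.1, 0.0],
        [0.0, 0.9, 0.0],   # a finger pressing near intersection (1, 1)
        [0.0, 0.2, 0.7]]
print(scan_position_signals(grid))  # -> [(1, 1), (2, 2)]
```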
  • The positions of intersection of the column electrodes and row electrodes in the contact region 20 a of the operation portion 20 are associated with pixel positions in the display region 10 a of the display portion 10.
  • However, the positions of intersection of row electrodes and column electrodes and the positions of pixels need not be associated in a one-to-one relationship.
  • The control portion 30 has a ROM (Read-Only Memory) which stores various programs and various information (transmission/reception mail log character information, image information, mail address information, and similar), a CPU (Central Processing Unit) which executes the various programs stored in the ROM, and a RAM (Random Access Memory) which temporarily stores information or is used as a working area for execution of the various programs.
  • FIG. 3 to FIG. 5 show in sequence the processing processes of range decision processing and range adjustment processing by the range decision portion 31 , processing menu display processing by the processing menu display portion 32 , editing processing decision processing by the processing decision portion 33 , and editing processing execution processing by the processing execution portion 34 .
  • The range decision portion 31 decides the target range A 1 of the display region 10 a based on position signals for two points from the operation portion 20.
  • Specifically, the range decision portion 31 decides the target range A 1 of the display region 10 a which takes as its starting point and ending point the two positions P 1 a, P 1 b of the display region 10 a corresponding respectively to the two positions P 2 a, P 2 b of the contact operation in the contact region 20 a.
  • For this purpose, the positions of intersection of the column electrodes and row electrodes in the contact region 20 a of the operation portion 20 and the pixels in the display region 10 a of the display portion 10 are stored in advance, in association, as a table.
  • Based on this table, the range decision portion 31 determines the two positions P 1 a, P 1 b of the display region 10 a corresponding to the two positions P 2 a, P 2 b of the contact operation in the contact region 20 a.
  • The range decision portion 31 then decides, as the target range A 1 of the display region 10 a, the range which takes the two positions P 1 a, P 1 b of the display region 10 a thus determined as the starting point and ending point respectively.
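  • The mapping from contact-region intersections to display-region pixels, which need not be one to one, can be illustrated as follows; the grid sizes and the proportional-scaling rule are assumptions for this sketch.

```python
CONTACT_GRID = (64, 48)    # assumed (rows, cols) of electrode intersections
DISPLAY_GRID = (320, 240)  # assumed (rows, cols) of display pixels

def to_display(p2):
    # Map a contact-region intersection to a display-region pixel by
    # proportional scaling (standing in for the stored lookup table).
    r, c = p2
    return (r * DISPLAY_GRID[0] // CONTACT_GRID[0],
            c * DISPLAY_GRID[1] // CONTACT_GRID[1])

def decide_target_range(p2a, p2b):
    # The two mapped positions become the start and end of target range A1.
    p1a, p1b = to_display(p2a), to_display(p2b)
    return (min(p1a, p1b), max(p1a, p1b))

print(decide_target_range((10, 8), (20, 40)))  # -> ((50, 40), (100, 200))
```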
  • The range decision portion 31 also performs adjustment of the decided target range A 1.
  • After target range decision, the range decision portion 31 causes the display portion 10 to display the border B indicating the target range A 1, as shown in FIG. 4, and the operation portion 20 detects contact operations and sliding in the portion of the contact region 20 a corresponding to this border B.
  • The range decision portion 31 receives position signals corresponding to the sliding, and determines the target range again.
  • The range decision portion 31 re-decides the range determined in this way as the target range A 2 when a processing specification operation, described below, is performed.
  • That is, the range decision portion 31 finalizes the target range A 2 when the position N 2 of the contact region 20 a corresponding to the position of the processing N 1 of the processing menu M in the display region 10 a is selected.
  • The processing menu display portion 32 causes the display portion 10 to display the processing menu M to enable selection of various processing.
  • Display positions of the processing menu M corresponding to target ranges A 1 of the display region 10 a are stored in advance in the ROM.
  • The display position of the processing menu M can be set by the user, for example during initial settings or at other times.
  • For example, when range specification is performed with the little finger and the index finger of the left hand and the thumb is used each time to perform processing specification, the display position of the processing menu M is set on the right side in the portion below the target range A 1.
  • The processing decision portion 33 decides the processing N 1 in the processing menu M corresponding to the position P 2 c of the contact operation in the contact region 20 a, based on the position signal for a third point from the operation portion 20.
  • The processing execution portion 34 executes the processing N 1 decided by the processing decision portion 33 on the target range A 2 of the display region 10 a decided by the range decision portion 31.
  • For example, the processing execution portion 34 performs enlargement editing of the characters displayed in the target range A 2, as shown in FIG. 5.
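  • The execution step can be illustrated as an edit applied only to the characters inside the target range A 2; the per-character data model and the size-doubling rule for enlargement are assumptions for this sketch.

```python
# Apply the decided processing only to characters whose position falls
# within the target range. "Enlarge" is modeled as doubling a font size.
def execute(processing, chars, target_range):
    start, end = target_range
    for ch in chars:
        if start <= ch["pos"] <= end:
            if processing == "enlarge":
                ch["size"] *= 2
    return chars

line = [{"pos": i, "size": 12} for i in range(5)]
print(execute("enlarge", line, (1, 3)))
# sizes -> [12, 24, 24, 24, 12]
```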
  • FIG. 6 is a flowchart showing the display information editing method of the information processing device of the first aspect of the invention.
  • First, the display position of the processing menu M relative to the target range of the display region 10 a is set by the user.
  • The information processing device 1, in response to a command input by the user, sets the display position of the processing menu M, for example on the right side of the portion below the target range, and stores the display position thus set in the ROM, for example as coordinate data (step S 01 ).
  • The operation portion 20 detects changes in the potentials of the column electrodes and row electrodes due to the contact of the fingers with the contact region 20 a, and outputs to the control portion 30, as position signals, the column and row coordinates of the intersecting portions of the column electrodes and row electrodes whose potentials have changed (step S 02 ).
  • The range decision portion 31 determines the two positions P 1 a, P 1 b of the display region 10 a corresponding to the two positions P 2 a, P 2 b of the contact operation in the contact region 20 a, based on the table stored in advance in the ROM (step S 03 ).
  • The range decision portion 31 decides the target range A 1 of the display region 10 a having as its starting point and ending point the previously determined two positions P 1 a, P 1 b of the display region 10 a (step S 04 ).
  • The processing menu display portion 32 causes the processing menu M to be displayed at the display position of the processing menu M corresponding to the target range A 1 of the display region 10 a, stored in advance in the ROM.
  • Here, the processing menu display portion 32 causes display of the processing menu M on the right side in the portion below the target range A 1 (step S 05 ).
  • Next, the range decision portion 31 causes the display portion 10 to display, using the border B, the target range A 1 of the display region 10 a.
  • When the user slides the two fingers on the contact region 20 a, the operation portion 20 detects the finger sliding operation.
  • The range decision portion 31 receives position signals corresponding to the sliding operation, and re-determines the target range of the display region 10 a. In this way, the range decision portion 31 adjusts the target range A 1 of the display region 10 a (step S 06 ).
  • The range decision portion 31 then re-decides the range of the display region 10 a after adjustment as the target range A 2 (step S 07 ).
  • The processing decision portion 33 decides on the processing "enlarge/reduce" N 1 in the processing menu M corresponding to the position P 2 c of the contact operation in the contact region 20 a (step S 08 ).
  • The processing execution portion 34 executes the processing N 1, decided on by the processing decision portion 33, on the target range A 2 of the display region 10 a decided by the range decision portion 31.
  • For example, the processing execution portion 34 performs enlargement processing of the characters displayed in the target range A 2, as shown in FIG. 5 (step S 09 ).
  • As described above, the processing menu display portion 32 causes the display portion 10 to display the processing menu M for selection of various processing functions at a position corresponding to the target range A 1 of the display region 10 a decided by the range decision portion 31, and so the user need only perform a range specification operation using two fingers and a processing specification operation using a third finger.
  • Hence the ease of operation with respect to a plurality of display information editing functions can be improved.
  • Moreover, the display position of the processing menu M can be set in advance according to user preferences, so that ease of operation can be further improved. Also, the user can perform operations using one hand, so that ease of operation can be further improved.
  • Further, the target range A 2 of the display region 10 a can be re-decided by the range decision portion 31, so that the specified range can be adjusted by sliding the two fingers used for range specification, and ease of operation can be further improved.
  • FIG. 7 is a block diagram showing the configuration of the information processing device of a second aspect of the invention.
  • The configuration of the information processing device 1 A shown in FIG. 7 differs from that of the information processing device 1 of the first aspect in further comprising an image capture portion 40, and in comprising a control portion 30 A in place of the control portion 30. Otherwise the configuration of the information processing device 1 A is the same as that of the information processing device 1.
  • The image capture portion 40 is for example a camera, which captures images of the hand and fingers of the user performing contact operations in the operation portion 20, and outputs the captured image signals to the control portion 30 A.
  • The control portion 30 A differs from the control portion 30 of the first aspect in further having a hand/finger identification portion 35, and in having a range decision portion 31 A and a processing menu display portion 32 A in place of the range decision portion 31 and the processing menu display portion 32. Otherwise the configuration of the control portion 30 A is the same as that of the control portion 30.
  • FIG. 8 to FIG. 10 show in sequence the processing processes of range decision processing and range adjustment processing by the range decision portion 31 A, processing menu display processing by the processing menu display portion 32 A, editing processing decision processing by the above-described processing decision portion 33 , and editing processing execution processing by the above-described processing execution portion 34 .
  • The hand/finger identification portion 35 identifies the hand and the two fingers performing the contact operation in the contact region 20 a. For example, by performing image processing of image signals from the image capture portion 40, the hand/finger identification portion 35 identifies the hand of the contact operation as the right hand or the left hand, and identifies the two fingers of the contact operation among the thumb, index finger, middle finger, ring finger, and little finger.
  • The range decision portion 31 A differs from the range decision portion 31 in first deciding the contact operation range D of the contact region 20 a, having as its starting and ending points the two positions P 2 a, P 2 b respectively of the contact operation, based on the position signals for two points from the operation portion 20, and thereafter deciding the target range A 1 of the display region 10 a corresponding to the contact operation range D of the contact region 20 a.
  • For this purpose, contact operation ranges D are stored in advance in the ROM in a plurality of conceivable patterns in the contact region 20 a of the operation portion 20, and a plurality of patterns of target ranges A 1 in the display region 10 a of the display portion 10, associated with the respective contact operation ranges D, are also stored in advance in the ROM. That is, a plurality of patterns of contact operation ranges D and a plurality of patterns of target ranges A 1 are stored in advance, in association, as a table in the ROM. Based on this table, the range decision portion 31 A decides the target range A 1 corresponding to a contact operation range D.
  • The range decision portion 31 A performs adjustment of a decided target range A 1 and re-decides a target range A 2 after adjustment, similarly to the range decision portion 31.
  • The processing menu display portion 32 A causes display of the processing menu M at the position, relative to the target range A 1, associated with the combination of the identified hand and two fingers.
  • For this purpose, a plurality of conceivable patterns of combinations of a hand and two fingers are stored in advance in the ROM, and a plurality of patterns of processing menu M display positions are stored in association with each of the combinations of a hand and two fingers. That is, a plurality of patterns of combinations of a hand and two fingers, and a plurality of patterns of processing menu M display positions, are stored in association as a table in the ROM. Based on this table, the processing menu display portion 32 A causes the processing menu M to be displayed at the display position associated with the combination of the identified hand and two fingers.
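  • Such a table lookup might be sketched as follows; only the two hand-and-finger combinations mentioned in this description are filled in, and the position labels and the default value are illustrative assumptions.

```python
# Each (hand, two-finger) combination is associated in advance with a
# menu position relative to the target range A1. A real table would
# enumerate many more combinations than the two shown here.
MENU_POSITION = {
    ("left",  frozenset({"little", "index"})): "below-right",
    ("right", frozenset({"little", "thumb"})): "above-center",
}

def menu_position(hand, finger_a, finger_b, default="below-center"):
    # frozenset makes the lookup order-independent for the two fingers.
    return MENU_POSITION.get((hand, frozenset({finger_a, finger_b})), default)

print(menu_position("right", "thumb", "little"))  # -> above-center
```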
  • The processing decision portion 33 and the processing execution portion 34 are similar to those of the first aspect, and so explanations are omitted.
  • FIG. 11 is a flowchart showing the display information editing method of the information processing device of the second aspect of the invention.
  • First, when the user touches the two points P 2 a, P 2 b with the thumb and little finger of the right hand, the processing of the above-described step S 02 is performed. The image capture portion 40 captures an image of the user's hand and fingers, and outputs image signals to the control portion 30 A. Then, by performing image processing of the image signals from the image capture portion 40, the hand/finger identification portion 35 identifies the hand and two fingers used in the contact operation (step S 10 ).
  • The range decision portion 31 A decides the contact operation range D of the contact region 20 a having as its starting and ending points the two positions P 2 a, P 2 b respectively of the contact operation (step S 03 A).
  • The range decision portion 31 A then decides the target range A 1 of the display region 10 a corresponding to the contact operation range D of the contact region 20 a (step S 04 A).
  • The processing menu display portion 32 A causes the processing menu M to be displayed at the position, relative to the target range A 1, associated with the combination of the hand and two fingers identified by the hand/finger identification portion 35.
  • Here, the processing menu display portion 32 A causes the processing menu M to be displayed in the center of the portion above the target range A 1, corresponding to the combination of the thumb and little finger of the right hand (step S 05 A).
  • Thereafter, the processing of the above-described steps S 06 to S 09 is performed, and enlargement editing of the displayed image is performed, as shown in FIG. 10.
  • As described above, the hand and two fingers used in range specification are identified by the hand/finger identification portion 35, so that when, for example, range specification has been performed using the little finger and thumb of the right hand, displaying the processing menu in the center of the portion above the specified range improves ease of operation by the index finger of the right hand.
  • Hence ease of operation can be improved regardless of the hand and fingers used for contact operations.
  • Also, the user can perform operations with one hand, so that ease of operation can be further improved.
  • In the above aspect, an example of image processing using a camera was described as the identification method for identifying the hand and two fingers of a contact operation; but the identification method is not limited to that of this aspect.
  • For example, an identification method may be employed in which a pressure-sensitive sensor, touch sensor, or similar is used for identification based on contact pressure.
  • FIG. 12 is a block diagram of an information processing device using a pressure sensor as the pressure-sensitive sensor, and FIG. 13 is a perspective view showing, partially exploded, the configuration of the information processing device of FIG. 12.
  • The configuration of the information processing device 1 B of FIG. 12 and FIG. 13 differs from that of the information processing device 1 A in comprising a pressure-sensitive sensor 40 B and a control portion 30 B in place of the image capture portion 40 and the control portion 30 A.
  • The pressure-sensitive sensor 40 B is a sheet-shaped pressure sensor, and as shown in FIG. 13, is arranged below the operation portion 20, that is, below the electrostatic pad.
  • The pressure sensor detects the pressure intensity of the fingers of the user performing a contact operation through the electrostatic pad, and outputs the detected pressure intensity to the control portion 30 B.
  • The control portion 30 B differs from the control portion 30 A in comprising a hand/finger identification portion 35 B in place of the hand/finger identification portion 35.
  • The hand/finger identification portion 35 B identifies whether the hand performing a contact operation is the right hand or the left hand, and further identifies the two fingers of the contact operation among the thumb, index finger, middle finger, ring finger, and little finger, from the pressure intensity from the pressure-sensitive sensor 40 B and the contact area from the electrostatic sensor in the operation portion 20.
  • In general, the intensity of the force and the contact area of fingers performing contact operations are greater for the favored hand, whether right or left, and decrease in the order of the thumb, index finger, middle finger, ring finger, and little finger.
  • By exploiting these tendencies, the hand and fingers of the contact operation can be identified.
  • Specifically, a table which associates the pressure intensities and contact areas of the fingers of the right and left hands with the fingers of the hands is stored in advance in the ROM, as shown in FIG. 14.
  • The hand/finger identification portion 35 B identifies the hand and fingers corresponding to the pressure intensities from the pressure-sensitive sensor 40 B and the contact areas from the electrostatic sensor in the operation portion 20 as the hand and fingers of a contact operation.
  • This table is for example registered in advance as follows. First, the fingers of the right hand and left hand are brought into contact with the operation portion 20 one at a time, in order. Next, the five fingers of the right hand are brought into contact simultaneously with the operation portion 20, and the five fingers of the left hand are brought into contact simultaneously with the operation portion 20. Then, the five fingers and the entire palm of the right hand are brought into contact simultaneously with the operation portion 20, and the five fingers and entire palm of the left hand are brought into contact simultaneously with the operation portion 20.
  • For each contact operation, two cases are performed: light touching and forceful pressing. The average values of the pressure intensity and contact area for the light-touch case are taken to be minimum values and are registered in the table, associated with the respective fingers, and the average values of the pressure intensity and contact area for the forceful-press case are taken to be maximum values and are registered in the table, associated with the respective fingers. From these minimum values and maximum values, average values are determined, error correction values are additionally determined, and the average values and error correction values are registered in the table, associated with the respective fingers.
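  • As a sketch of this registration arithmetic, assuming each sample is a (pressure, contact area) pair and a simple half-range error margin (the margin formula is an assumption), the stored values could be computed as follows.

```python
def register_finger(light_samples, press_samples):
    """Each sample is (pressure_mmHg, area_mm2) for one contact operation."""
    # Light touches average to the minimum; forceful presses to the maximum.
    n = len(light_samples)
    minimum = tuple(sum(s[i] for s in light_samples) / n for i in (0, 1))
    m = len(press_samples)
    maximum = tuple(sum(s[i] for s in press_samples) / m for i in (0, 1))
    # Average of min and max, plus an assumed half-range error margin.
    average = tuple((lo + hi) / 2 for lo, hi in zip(minimum, maximum))
    margin = tuple((hi - lo) / 2 for lo, hi in zip(minimum, maximum))
    return {"min": minimum, "max": maximum, "avg": average, "err": margin}

print(register_finger([(40, 120), (44, 130)], [(60, 180), (64, 190)]))
# -> {'min': (42.0, 125.0), 'max': (62.0, 185.0),
#     'avg': (52.0, 155.0), 'err': (10.0, 30.0)}
```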
  • Alternatively, the table may be registered in advance for a specific user as follows. First, only the fingers to be registered are brought into contact with the operation portion 20, one at a time and in order. Next, using a method of contact operation frequently performed by the specific user, the fingers to be registered are brought into contact simultaneously with the operation portion 20.
  • For each contact operation, light touching and forceful pressing are again both performed, and the minimum, maximum, average, and error correction values of the pressure intensity and contact area are determined and registered in the table, associated with the respective fingers, in the same manner as described above.
  • For example, when the contact area exceeds a predetermined palm-sized threshold, the hand/finger identification portion 35 B identifies the contact as the palm of the hand; when the contact area is 150 mm² or less, the hand/finger identification portion 35 B identifies the finger of the contact operation as the little finger. For other fingers, the hand/finger identification portion 35 B identifies the finger of a contact operation from the average value of the contact area and the average value of the pressure intensity.
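  • A hypothetical classifier following this rule is sketched below; the palm threshold and the registered per-finger averages are assumed values, and matching is done against the nearest registered (area, pressure) average.

```python
PALM_AREA_MM2 = 1000          # assumed palm threshold
REGISTERED = {                # assumed per-finger (area_mm2, pressure_mmHg)
    "thumb": (400, 60), "index": (300, 55),
    "middle": (280, 52), "ring": (220, 46),
}

def identify(area, pressure):
    # Large contact -> palm; 150 mm2 or less -> little finger; otherwise
    # match the nearest registered average (squared-distance metric).
    if area > PALM_AREA_MM2:
        return "palm"
    if area <= 150:
        return "little"
    return min(REGISTERED,
               key=lambda f: (REGISTERED[f][0] - area) ** 2
                           + (REGISTERED[f][1] - pressure) ** 2)

print(identify(310, 54))  # -> index
```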
  • Further, the hand/finger identification portion 35 B may take into consideration the structure of the fingers and, based on the positions of the contact operations from the operation portion 20, may identify the right or left hand from the positional relationship of the contact operations. For example, as shown in (a) of FIG. 15, when there is a contact operation with a pressure intensity of 48 mmHg above and to the left of the position of a contact operation with a pressure intensity of 55 mmHg, the thumb and index finger of the left hand may be identified from the positional relationship; and when the contacts are positioned as shown in (b) of FIG. 15, the index finger and middle finger of the right hand may be identified from the positional relationship.
  • The hand/finger identification portion 35 B may also identify the left and right hands based on average values of contact area.
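  • The positional heuristic can be sketched as follows, under the strong simplifying assumption that a lighter contact lying above and to the left of the stronger one indicates the left hand; this is only one way to encode the positional relationship described above.

```python
def infer_hand(p_light, p_strong):
    """Positions are (x, y) with x growing rightward and y growing upward.

    p_light is the lower-pressure contact, p_strong the higher-pressure one.
    """
    lighter_is_upper_left = p_light[0] < p_strong[0] and p_light[1] > p_strong[1]
    return "left" if lighter_is_upper_left else "right"

# A 48 mmHg contact above and to the left of a 55 mmHg contact -> left hand
print(infer_hand((10, 40), (25, 20)))  # -> left
```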
  • In this case too, the range decision portion 31 A decides the contact operation range D of the contact region 20 a having as its starting and ending points the two positions P 2 a, P 2 b respectively of the contact operation, and then decides the target range A 1 of the display region 10 a corresponding to the contact operation range D of the contact region 20 a.
  • The processing menu display portion 32 A then causes the processing menu M to be displayed at the position, relative to the target range A 1, associated with the combination of the hand and two fingers.
  • In the above description, the processing menu display portion 32 A causes the processing menu M to be displayed at the display position corresponding to the combination of identified fingers, based on a table stored in the ROM in advance which associates a plurality of patterns of combinations of fingers with a plurality of patterns of display positions of the processing menu M; but the processing menu display portion 32 A may instead assign priorities to the fingers, predict from these priorities the optimum finger to perform processing menu selection, and cause the processing menu M to be displayed accordingly.
  • That is, the processing menu display portion 32 A may predict that the finger with the highest priority, among fingers other than the fingers of the contact operation and existing at positions not overlapping the contact operation range D of the contact region 20 a, is the optimum finger for performing processing menu selection, and may cause the processing menu M to be displayed at a position in the display region 10 a corresponding to the predicted position of this finger.
  • For example, when the index finger overlaps with the contact operation range D of the contact region 20 a defined by the two positions P 2 a, P 2 b of the contact operation, the processing menu display portion 32 A predicts that the optimum finger for performing processing menu selection is the ring finger, and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this ring finger.
  • Also, when the fingers at the three positions P 2 a, P 2 b, P 2 c of a contact operation are respectively the thumb, index finger, and palm of the right hand, there are no fingers overlapping the contact operation range D of the contact region 20 a defined by the three positions P 2 a, P 2 b, P 2 c of the contact operation, so that the processing menu display portion 32 A predicts that the optimum finger to perform processing menu selection is the middle finger, and causes the processing menu M to be displayed at a position of the display region 10 a corresponding to the position of this middle finger.
  • The processing menu display portion 32 A may also predict the optimum finger to perform processing menu selection based on further priorities, in the order of the favored hand then the unfavored hand. For example, when the fingers at the two positions P 2 a, P 2 b of a contact operation are the thumb of the left hand and the thumb of the right hand respectively, as shown in FIG. 19, the processing menu display portion 32 A predicts that the optimum finger for performing processing menu selection is the index finger of the favored hand (for example, the right hand), and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this index finger. At this time, the processing menu display portion 32 A causes the processing menu M to be displayed at a position of the display region 10 a corresponding to a position not overlapping the contact operation range D of the contact region 20 a.
  • On the other hand, when for example the fingers at the two positions P 2 a, P 2 b of a contact operation are respectively the index finger of the left hand and the index finger of the right hand, and the thumb of the favored hand (for example, the right hand) overlaps with the contact operation range D, the thumb of the unfavored hand (for example, the left hand), being the finger with the highest priority on the unfavored hand other than the fingers of the contact operation, may be predicted to be the optimum finger for performing processing menu selection.
  • The processing menu display portion 32 A may also predict the optimum finger to perform processing menu selection based on the positional relationship of the fingers of the contact operation and on the position of the contact operation range D of the contact region 20 a, and may cause display of the processing menu M accordingly. For example, when the fingers at the two positions P 2 a (middle finger) and P 2 b (thumb) perform the contact operation, the processing menu display portion 32 A predicts that the optimum finger for performing processing menu selection is the index finger, which is positioned between the position P 2 a (middle finger) and the position P 2 b (thumb), and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this index finger.
  • At this time, the processing menu display portion 32 A causes the processing menu M to be displayed at the position of the display region 10 a corresponding to a position which does not overlap with the contact operation range D of the contact region 20 a. Similarly, when for example the fingers at the two positions P 2 a (index finger) and P 2 b (ring finger) perform the contact operation, the processing menu display portion 32 A predicts that the optimum finger for performing processing menu selection is the middle finger, positioned between the position P 2 a (index finger) and the position P 2 b (ring finger), and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this middle finger, again at a position which does not overlap with the contact operation range D of the contact region 20 a.
  • Likewise, when the positions P 2 b (index finger) and P 2 c (ring finger) are among the positions of a contact operation, the processing menu display portion 32 A predicts that the optimum finger for performing processing menu selection is the middle finger, positioned between the position P 2 b (index finger) and the position P 2 c (ring finger), and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this middle finger, at a position which does not overlap with the contact operation range D of the contact region 20 a.
  • As another identification method, a fingerprint sensor may be employed. FIG. 24 is a block diagram of an information processing device employing a fingerprint sensor, and FIG. 25 is a perspective view showing the configuration of the information processing device shown in FIG. 24.
  • The information processing device 1 C comprises a fingerprint sensor 40 C and a control portion 30 C in place of the pressure-sensitive sensor 40 B and the control portion 30 B of the information processing device 1 B.
  • The fingerprint sensor 40 C is a sheet-shaped fingerprint sensor, and as shown in FIG. 25, is arranged below the operation portion 20, that is, below the electrostatic pad.
  • The fingerprint sensor detects the fingerprints of the fingers of the user performing contact operations through the electrostatic pad, and outputs the detected fingerprints to the control portion 30 C.
  • The control portion 30 C differs from the control portion 30 B in comprising a hand/finger identification portion 35 C in place of the hand/finger identification portion 35 B.
  • The hand/finger identification portion 35 C identifies the hand of a contact operation as the right hand or the left hand through the fingerprints from the fingerprint sensor 40 C, and identifies the two fingers of the contact operation among the thumb, index finger, middle finger, ring finger, and little finger.
  • Specifically, a table which associates the fingerprints of the fingers of the right and left hands with the fingers is stored in advance in the ROM.
  • The hand/finger identification portion 35 C identifies the finger corresponding to a fingerprint from the fingerprint sensor 40 C as the finger of a contact operation, based on this table. This table may for example be stored in advance by making measurements using a fingerprint sensor 41.
  • In the above aspects, the display size and position of the processing menu are fixed; however, the display size of the processing menu may be made modifiable, or it may be made possible to move the processing menu, through dragging operations using the user's fingers or similar.
  • Also, in the above aspects the range decision processing to decide the target range is performed by the range decision portions 31 and 31 A independently of the timing of the contact operations for the two points; but the target range may be decided only when the range decision portions 31, 31 A receive the position signals for two points from the operation portion 20 simultaneously, that is, only when the two-point contact operation is performed simultaneously.
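  • A sketch of such a simultaneity check follows; the 50 ms window standing in for "simultaneously" is an assumed value.

```python
SIMULTANEITY_WINDOW_S = 0.05  # assumed tolerance for "simultaneous" contacts

def is_range_specification(t_first, t_second):
    # Treat two position signals as a range specification only when they
    # arrive within the window; otherwise treat them as ordinary input.
    return abs(t_second - t_first) <= SIMULTANEITY_WINDOW_S

print(is_range_specification(1.000, 1.020))  # -> True  (range specification)
print(is_range_specification(1.000, 1.400))  # -> False (ordinary input)
```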
  • Further, in the above aspects an electrostatic pad was described as an example of the operation portion 20; but a touch panel, optical sensors, or a wide variety of other devices capable of detecting contact operations can be employed as the operation portion 20.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

An information processing device of an aspect of the invention has display means having a display region; operation means, having a contact region, for outputting a position signal indicating the position of a contact operation in the contact region; range decision means for deciding a target range of the display region based on the position signals for two points from the operation means; processing menu display means for causing the display means to display a processing menu at a position corresponding to the target range of the display region; processing decision means for deciding the processing of the processing menu corresponding to the position of the contact operation in the contact region; and processing execution means for executing the processing on the target range of the display region.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates to an information processing device, and to a display information editing method of such an information processing device.
2. Related Background Art
Portable telephone terminals, PDAs (Personal Digital Assistants), and other information processing devices comprise a keyboard, mouse, operation buttons, or other operation portions, and a display or other display portion, and can perform text/image information processing in which text information is displayed on a display portion according to information input by an operation portion, or text information and image information or similar, stored in advance, are displayed on a display portion according to operation by an operation portion.
In Japanese Patent Application Laid-open No. 2000-311040, an information processing device of this type is described. This information processing device can be used to specify a display information range by operating a cursor on the display portion by means of a mouse, and then, with the cursor within the specified range, performing a clicking operation with the mouse to perform copying of the display information. Using this information processing device, only a range specification operation and a single click operation are performed, and the user need only move the mouse slightly between these two operations, so that the ease of operation of display information copying can be improved. Further, in Japanese Patent Application Laid-open No. 2000-311040, the possibility of application of a touch panel as the operation portion of the information processing device is described.
However, in addition to copying, there are demands that an information processing device provide various other editing functions, such as enlargement/reduction and color modification. The information processing device described in Japanese Patent Application Laid-open No. 2000-311040 can perform only copying, and so cannot satisfy such demands.
SUMMARY OF THE INVENTION
Hence an object of this invention is to provide an information processing device and a display information editing method of an information processing device enabling improved ease of operation for a plurality of display information editing functions.
An information processing device of this invention comprises (a) display means for displaying information; (b) operation means, having a contact region corresponding to a display region of the display means, for detecting a contact operation in the contact region and outputting a position signal indicating the position of the contact operation in the contact region; (c) range decision means for deciding a target range of the display region, based on position signals for two points from the operation means; (d) processing menu display means for causing the display means to display a processing menu for selection of various processing, at a position corresponding to the target range of the display region decided by the range decision means; (e) processing decision means for deciding processing of the processing menu corresponding to the position of contact operation in the contact region, based on the position signal for a third point from the operation means; and (f) processing execution means for executing the processing decided by the processing decision means on the target range of the display region decided by the range decision means.
A display information editing method of an information processing device of this invention is a display information editing method of an information processing device which has display means for displaying information and operation means having a contact region corresponding to a display region of the display means, in which (a) a contact operation in the contact region is detected, and a position signal corresponding to the position of the contact operation in the contact region is generated; (b) a target range of the display region is decided based on the position signals for two points; (c) the display means is caused to display a processing menu for selection of various processing, at a position corresponding to the decided target range of the display region; (d) processing of the processing menu corresponding to the position of contact operation in the contact region is decided, based on the position signal for a third point; and (e) the decided processing is executed on the decided target range of the display region.
By means of this information processing device and display information editing method of an information processing device, processing menu display means causes display means to display a processing menu at a position corresponding to a display region target range decided by range decision means in order to select various processing, so that the user need only perform a range specification operation using two fingers and a processing specification operation using a third finger. Hence the ease of operation of a plurality of display information editing functions can be improved.
It is preferable that the above-described processing menu display means cause the display means to display the processing menu at a position set in advance for the target range of the display region.
The ease of operation is different for the right hand and for the left hand. And, the ease of operation is different depending on the two fingers used for range specification and the third finger used for processing specification. For example, when using the little finger and index finger of the left hand to perform range specification, and the thumb of the left hand to perform processing specification, it is preferable that the processing menu be displayed in the center of the portion below the specified range, or displayed to the right thereof. And, when using the little finger and thumb of the right hand to perform range specification, and the index finger of the right hand to perform processing specification, it is preferable that the processing menu M be displayed in the center of the portion above the specified range. By this means, the display position of the processing menu corresponding to the target range of the display region can be set in advance according to user preferences, so that ease of operation can be further improved. And, the user can perform operations with one hand, so that ease of operation can be improved.
It is preferable that the above-described information processing device further comprise identification means for identifying whether a hand performing the contact operation in the contact region is the right hand or the left hand, and for identifying two fingers of the contact operation, and that the above-described processing menu display means cause the display means to display the processing menu at a position relative to the target range of the display region associated with the combination of the hand and the two fingers, based on the hand and the two fingers identified by the identification means.
By this means, whether range specification was performed by the right hand or the left hand is identified by the identification means, and the two fingers of the range specification are identified; for example, when range specification is performed by the little finger and index finger of the left hand, by displaying the processing menu on the right side of the portion below the specified range, the ease of operation by the thumb of the right hand can be improved. And, when for example range specification is performed by the little finger and thumb of the right hand, by displaying the processing menu in the center of the portion above the specified range, ease of operation by the index finger of the right hand can be improved. Hence ease of operation can be improved without depending on the hand and fingers used for contact operations. Also, a user can perform operations with one hand, so that ease of operation can be further improved.
It is preferable that when the position of the contact operation indicated by the position signals for two points from the operation means changes, the above-described range decision means re-decides the target range of the display region.
By this means, a specified range can be adjusted by sliding movement of the two fingers used in range specification, so that ease of operation can be further improved.
It is preferable that the above-described range decision means decide the target range of the display region only when position signals for two points are received simultaneously from the operation means.
By this means, for example, an information input operation due to single-point contact and a range specification operation due to simultaneous two-point contact can be discriminated, so that display information editing can be caused to be performed even during an information input operation.
By means of this invention, the ease of operation of an information processing device with respect to a plurality of display information editing functions can be improved.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a perspective view showing the configuration of the information processing device of a first aspect of the invention;
FIG. 2 is a block diagram showing the configuration of the information processing device of the first aspect of the invention;
FIG. 3 shows range decision processing by the range decision portion of FIG. 2;
FIG. 4 shows processing menu display processing by the processing menu display portion, range adjustment processing by the range decision portion, and editing processing decision processing by the processing decision portion of FIG. 2;
FIG. 5 shows editing processing execution processing by the processing execution portion of FIG. 2;
FIG. 6 is a flowchart showing the display information editing method of the information processing device of the first aspect of the invention;
FIG. 7 is a block diagram showing the configuration of the information processing device of a second aspect of the invention;
FIG. 8 shows the range decision processing of the range decision portion of FIG. 2;
FIG. 9 shows the processing menu display processing of the processing menu display portion, the range adjustment processing of the range decision portion, and the editing processing decision processing of the processing decision portion of FIG. 2;
FIG. 10 shows the editing processing execution processing of the processing execution portion of FIG. 2;
FIG. 11 is a flowchart showing the display information editing method of the information processing device of the second aspect of the invention;
FIG. 12 is a block diagram showing the configuration of the information processing device of a modified example of the invention;
FIG. 13 is a perspective view showing, partially exploded, the configuration of the information processing device of a modified example of the invention;
FIG. 14 shows a table in which the pressure intensities and contact areas of the fingers of the right and left hands are associated with the fingers of the hands;
FIG. 15 shows the positional relationship of contact operations;
FIG. 16 shows the distribution of the contact area and pressure intensity of contact operations;
FIG. 17 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
FIG. 18 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
FIG. 19 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
FIG. 20 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
FIG. 21 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
FIG. 22 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
FIG. 23 shows an example of hand and finger identification processing and processing menu display processing in a modified example of the invention;
FIG. 24 is a block diagram showing the configuration of the information processing device of a modified example of the invention; and
FIG. 25 is a perspective view showing, partially exploded, the configuration of the information processing device of a modified example of the invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
Below, preferred aspects of the invention are explained in detail, referring to the drawings. In the figures, portions which are the same or equivalent are assigned the same symbols.
(First Aspect)
FIG. 1 is a perspective view showing the configuration of the information processing device of a first aspect of the invention, and FIG. 2 is a block diagram showing the configuration of the information processing device of the first aspect of the invention. A portable telephone terminal is shown as an example of an information processing device in FIG. 1 and FIG. 2. The information processing device 1 of FIG. 1 and FIG. 2 comprises a display portion 10, operation portion 20, and control portion 30. In FIG. 1, the control portion 30 is accommodated within a housing which accommodates the display portion 10, or within a housing which accommodates the operation portion 20.
The display portion 10 is for example a liquid crystal display. A plurality of pixels are arranged in row and column directions in the display portion 10, and the region in which these pixels are arranged is the display region 10 a. The display portion 10 receives text data, image data, and other information from the control portion 30, and displays this information in the display region 10 a. For example, based on information from the control portion 30, the display portion 10 causes pixels corresponding to the row and column coordinates (addresses) representing this information to emit light in colors representing the information.
The operation portion 20 is for example an electrostatic pad, having a column electrode layer, row electrode layer, and a dielectric layer arranged between the column electrode layer and the row electrode layer. In the column electrode layer, a plurality of column electrodes are arranged in parallel in the row direction extending in the column direction, and in the row electrode layer, a plurality of row electrodes are arranged in parallel in the column direction extending in the row direction. The region in which these column electrodes and row electrodes are arranged is the contact region 20 a. The operation portion 20 detects contact operations in the contact region 20 a by the user, and outputs position signals to the control portion 30 indicating the positions of contact operations in the contact region 20 a.
Specifically, in the operation portion 20, portions at which column electrodes and row electrodes intersect each comprise a capacitor, and through the change in the electrostatic capacitance of each capacitor due to a contact operation by the user, the potentials on the column electrodes and row electrodes of the capacitors change. In this way, the operation portion 20 detects changes in the electrostatic capacitances of capacitors, that is, changes in the potentials of column electrodes and row electrodes, and detects contact operations by the user on the contact region 20 a. And, the operation portion 20 outputs to the control portion 30, as position signals, the row and column coordinates of portions of intersection of the column electrodes and row electrodes the potentials of which have changed.
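As a rough illustration of this scan (the function names and the threshold are assumptions; the patent describes detection only at the level of potential changes), the loop below reports every electrode intersection whose potential has shifted from its idle level:

```python
def scan_contact_region(read_potential, rows, cols, threshold):
    """Return (row, col) intersections whose potential shifted from idle.

    read_potential(r, c) is assumed to return the deviation of the
    electrode intersection's potential from its no-contact level.
    """
    position_signals = []
    for r in range(rows):
        for c in range(cols):
            if abs(read_potential(r, c)) > threshold:
                position_signals.append((r, c))
    return position_signals
```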
The positions of intersection of column electrodes and row electrodes in the contact region 20 a of the operation portion 20 are associated with pixel positions in the display region 10 a of the display portion 10. The positions of intersection of row electrodes and column electrodes, and the positions of pixels, need not necessarily be associated in a one-to-one relationship.
The control portion 30 has ROM (Read-Only Memory) which stores various programs and various information (transmission/reception mail log character information, image information, mail address information, and similar), a CPU (Central Processing Unit) which executes the various programs stored in the ROM, and RAM (Random Access Memory) which temporarily stores information or is used as a working area for execution of the various programs. By means of this configuration, the control portion 30 functions as a range decision portion 31, processing menu display portion 32, processing decision portion 33, and processing execution portion 34.
Below, the range decision portion 31, processing menu display portion 32, processing decision portion 33, and processing execution portion 34 are explained using FIG. 3 to FIG. 5. FIG. 3 to FIG. 5 show in sequence the processing processes of range decision processing and range adjustment processing by the range decision portion 31, processing menu display processing by the processing menu display portion 32, editing processing decision processing by the processing decision portion 33, and editing processing execution processing by the processing execution portion 34.
The range decision portion 31 decides the target range A1 of the display region 10 a based on position signals for two points from the operation portion 20. In this aspect, the range decision portion 31 decides the target range A1 of the display region 10 a which takes as the starting point and ending point the two positions P1 a, P1 b respectively of the display region 10 a corresponding to the two positions P2 a, P2 b of the contact operation in the contact region 20 a.
For example, in the ROM, the positions of intersection of column electrodes and row electrodes in the contact region 20 a of the operation portion 20 and pixels in the display region 10 a of the display portion 10 are stored in advance in association as a table. Based on this table, the range decision portion 31 determines the two positions P1 a, P1 b of the display region 10 a corresponding to the two positions P2 a, P2 b of the contact operation in the contact region 20 a. And, the range decision portion 31 decides, as the target range A1 of the display region 10 a, the range which takes the two positions P1 a, P1 b of the display region 10 a thus determined as the starting point and ending point respectively.
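A minimal sketch of this decision, with a dictionary standing in for the ROM table; for character data the two display positions would bound a character range rather than a rectangle, and the normalization step is an assumption:

```python
def decide_target_range(p2a, p2b, contact_to_display):
    """Map two contact-region points to a display-region target range."""
    p1a = contact_to_display[p2a]  # starting point P1a
    p1b = contact_to_display[p2b]  # ending point P1b
    return min(p1a, p1b), max(p1a, p1b)  # normalized start/end pair
```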
The range decision portion 31 performs adjustment of the decided target range A1. For example, the range decision portion 31 causes the display portion 10 to display the border B indicating the target range A1 after target range decision, as shown in FIG. 4, and the operation portion 20 detects contact operations and sliding in the portion of the contact region 20 a corresponding to this border B. The range decision portion 31 receives position signals corresponding to sliding, and again determines the target range. And, the range decision portion 31 re-decides the range determined again as the target range A2 when a processing specification operation, described below, is performed. Specifically, as explained below, the range decision portion 31 finalizes the target range A2 when the position N2 of the contact region 20 a corresponding to the positions of the processing N1 of the processing menu M in the display region 10 a is selected.
The processing menu display portion 32 causes the display portion 10 to display the processing menu M to enable selection of various processing. For example, display positions for the processing menu M corresponding to target ranges A1 of the display region 10 a are stored in advance in the ROM. The display position of the processing menu M can be set by the user, for example during initial settings or at other times. In this aspect, range specification is performed with the little finger and index finger of the left hand and processing specification with the thumb of the left hand, so the display position of the processing menu M is set on the right side of the portion below the target range A1.
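Purely as an illustration of such a preset, the sketch below computes a menu position from a stored preference; the preset labels, margin constant, and coordinate convention are assumptions, not taken from the patent:

```python
MENU_MARGIN = 8  # assumed gap, in pixels, between the range and the menu

def menu_display_position(target_range, preset="below-right"):
    """Return display-region coordinates for the processing menu M.

    target_range is ((x0, y0), (x1, y1)) with y increasing downward.
    """
    (x0, y0), (x1, y1) = target_range
    if preset == "below-right":
        return (x1, y1 + MENU_MARGIN)          # right side, below the range
    if preset == "above-center":
        return ((x0 + x1) // 2, y0 - MENU_MARGIN)
    return (x0, y1 + MENU_MARGIN)              # assumed fallback: below-left
```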
The processing decision portion 33 decides the processing N1 in the processing menu M corresponding to the position P2 c of the contact operation in the contact region 20 a, based on the position signal for a third point from the operation portion 20.
The processing execution portion 34 executes the processing N1 decided by the processing decision portion 33 on the target range A2 of the display region 10 a decided by the range decision portion 31. In this aspect, the processing execution portion 34 performs enlargement editing of the characters displayed in the target range A2, as shown in FIG. 5.
Next, operation of the information processing device 1 of the first aspect is explained, and the display information editing method of the information processing device of the first aspect of the invention is explained. FIG. 6 is a flowchart showing the display information editing method of the information processing device of the first aspect of the invention.
First, the display position of the processing menu M relative to the target range of the display region 10 a is set by the user. In this aspect, the information processing device 1, in response to a command input by the user, sets the display position of the processing menu M on the right side of the portion below the target range, and stores the display position thus set for the processing menu M in ROM as for example coordinate data (step S01).
Then, as shown in FIG. 3, when the user touches the two points P2 a, P2 b with the little finger and index finger, the operation portion 20 detects changes in the potentials of the column electrodes and row electrodes due to the contact of the fingers with the contact region 20 a, and outputs to the control portion 30, as position signals, the column and row coordinates of the intersecting portions of the column electrodes and row electrodes for which potentials have changed (step S02).
Next, the range decision portion 31 determines the two positions P1 a, P1 b of the display region 10 a corresponding to the two positions P2 a, P2 b of the contact operation in the contact region 20 a, based on the table stored in advance in ROM (step S03).
Next, the range decision portion 31 decides the target range A1 of the display region 10 a having as the starting point and ending point the previously determined two positions P1 a, P1 b, respectively, of the display region 10 a (step S04).
Next, as shown in FIG. 4, the processing menu display portion 32 causes the processing menu M to be displayed at the display position of the processing menu M corresponding to the target range A1 of the display region 10 a, stored in advance in ROM. In this aspect, the processing menu display portion 32 causes display of the processing menu M on the right side in the portion below the target range A1 (step S05).
Next, the range decision portion 31 causes the display portion 10 to display, using a border B, the target range A1 of the display region 10 a. When the user touches the portion of the contact region 20 a of the operation portion 20 corresponding to the border B of the display region 10 a of the display portion 10, and slides his fingers, the operation portion 20 detects the finger sliding operation. The range decision portion 31 receives position signals corresponding to the sliding operation, and re-determines the target range of the display region 10 a. In this way, the range decision portion 31 adjusts the target range A1 of the display region 10 a (step S06).
Thereafter, upon receiving the position signal for a third point from the operation portion 20, the range decision portion 31 re-decides the range of the display region 10 a after adjustment as the target range A2 (step S07).
Based on the position signal for the third point from the operation portion 20, the processing decision portion 33 decides on the processing “enlarge/reduce” N1 in the processing menu M corresponding to the position P2 c of the contact operation in the contact region 20 a (step S08).
Next, the processing execution portion 34 executes the processing N1, decided on by the processing decision portion 33, on the target range A2 of the display region 10 a decided by the range decision portion 31. In this aspect, the processing execution portion 34 performs enlargement processing of the characters displayed in the target range A2, as shown in FIG. 5 (step S09).
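Wiring steps S02 to S09 together gives roughly the following control flow; every object and method name here is an assumption, since the patent describes the portions only functionally:

```python
def edit_display_information(operation, range_decision, menu_display,
                             processing_decision, processing_execution):
    p2a, p2b = operation.wait_for_two_points()           # S02
    target = range_decision.decide(p2a, p2b)             # S03-S04
    menu_display.show(target)                            # S05
    target = range_decision.adjust_with_border(target)   # S06-S07
    p2c = operation.wait_for_third_point()
    action = processing_decision.decide(p2c)             # S08
    processing_execution.execute(action, target)         # S09
```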
In this way, by means of the display processing device 1 of the first aspect and the display information editing method of an information processing device of the first aspect, the processing menu display portion 32 causes the display portion 10 to display a processing menu M for selection of various processing functions at a position corresponding to the target range A1 of the display region 10 a, decided by the range decision portion 31, and so the user need only perform a range specification operation using two fingers and a processing specification operation using a third finger. Hence the ease of operation with respect to a plurality of display information editing functions can be improved.
However, ease of operation differs between the right hand and the left hand. And, ease of operation also differs depending on which two fingers are used for range specification and which third finger is used for processing specification. When, as in this aspect, range specification is performed using the little finger and index finger of the left hand, and processing specification is performed using the thumb of the left hand, it is preferable that the processing menu be displayed in the center or towards the right in the portion below the specified range.
By means of the information processing device 1 of the first aspect and the display information editing method of an information processing device of the first aspect, the display position of the processing menu M can be set in advance according to user preferences, so that ease of operation can be further improved. Also, the user can perform operations using one hand, so that ease of operation can be further improved.
And, by means of the information processing device 1 of the first aspect and the display information editing method of an information processing device of the first aspect, when the positions of a contact operation indicated by the position signals for two points from the operation portion 20 have changed, the target range A2 of the display region 10 a can be re-decided by the range decision portion 31, so that by sliding and moving the two fingers used for range specification, the specified range can be adjusted, and so ease of operation can be further improved.
(Second Aspect)
FIG. 7 is a block diagram showing the configuration of the information processing device of a second aspect of the invention. The configuration of the information processing device 1A shown in FIG. 7 differs from that of the first aspect in further comprising an image capture portion 40 in the information processing device 1, and in comprising, in place of the control portion 30, a control portion 30A. Otherwise the configuration of the information processing device 1A is the same as that of the information processing device 1.
The image capture portion 40 is for example a camera, which captures images of the hand and fingers of the user performing contact operations on the operation portion 20, and outputs captured image signals to the control portion 30A.
The control portion 30A differs from the configuration in the first aspect in further having a hand/finger identification portion 35 in the control portion 30, and in having, in place of the range decision portion 31 and processing menu display portion 32, a range decision portion 31A and processing menu display portion 32A. Otherwise the configuration of the control portion 30A is the same as that of the control portion 30.
Below, the hand/finger identification portion 35, range decision portion 31A, and processing menu display portion 32A are explained using FIG. 8 to FIG. 10. FIG. 8 to FIG. 10 show in sequence the processing processes of range decision processing and range adjustment processing by the range decision portion 31A, processing menu display processing by the processing menu display portion 32A, editing processing decision processing by the above-described processing decision portion 33, and editing processing execution processing by the above-described processing execution portion 34.
The hand/finger identification portion 35 identifies the hand and the two fingers performing the contact operation in the contact region 20 a. For example, by performing image processing of image signals from the image capture portion 40, the hand/finger identification portion 35 identifies the hand of the contact operation as the right hand or the left hand, and identifies the two fingers of the contact operation among the thumb, index finger, middle finger, ring finger, and little finger.
As shown in FIG. 8, the range decision portion 31A differs from the range decision portion 31 in deciding the contact operation range D of the contact region 20 a having as starting and ending points the two positions P2 a, P2 b respectively of the contact operation, based on the position signals for two points from the operation portion 20, and thereafter, deciding the target range A1 of the display region 10 a corresponding to the contact operation range D of the contact region 20 a.
For example, contact operation ranges D are stored in advance in ROM in a plurality of conceivable patterns in the contact region 20 a of the operation portion 20, and a plurality of patterns of target ranges A1 in the display region 10 a of the display portion 10 associated with the respective contact operation ranges D are stored in advance in ROM. That is, a plurality of patterns of contact operation ranges D and a plurality of patterns of target ranges A1 are stored in advance in association, as a table, in ROM. Based on this table, the range decision portion 31A decides the target range A1 corresponding to a contact operation range D.
As shown in FIG. 9, the range decision portion 31A performs adjustment of a decided target range A1 and re-decides a target range A2 after adjustment, similarly to the range decision portion 31.
Based on the hand and two fingers identified by the hand/finger identification portion 35, the processing menu display portion 32A causes display of the processing menu M at a position relative to the target range A1 associated with the combination of the hand and two fingers. For example, a plurality of conceivable patterns of combinations of a hand and two fingers are stored in advance in ROM, and a plurality of patterns of processing menu M display positions are stored in association with each of the combinations of a hand and two fingers. That is, a plurality of patterns of combinations of a hand and two fingers, and a plurality of patterns of processing menu M display positions, are stored in association as a table in ROM. Based on this table, the processing menu display portion 32A causes the processing menu M to be displayed at the display position associated with the combination of the identified hand and two fingers.
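This ROM table might be sketched as follows; the keys and position labels are illustrative only:

```python
MENU_POSITION_TABLE = {
    ("left", frozenset({"little", "index"})): "below-right",
    ("right", frozenset({"little", "thumb"})): "above-center",
}

def menu_position_for(hand, finger_a, finger_b):
    """Look up the menu display position for a hand/two-finger combination."""
    key = (hand, frozenset({finger_a, finger_b}))
    return MENU_POSITION_TABLE.get(key, "below-center")  # assumed default
```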
The processing decision portion 33 and processing execution portion 34 are similar to those of the first aspect, and so explanations are omitted.
Next, operation of the information processing device 1A of the second aspect is explained, and the display information editing method of the information processing device of the second aspect of the invention is explained. FIG. 11 is a flowchart showing the display information editing method of the information processing device of the second aspect of the invention.
First, as shown in FIG. 8, when the user touches the two points P2 a, P2 b with the thumb and little finger of the right hand, the processing of the above-described step S02 is performed. And, the image capture portion 40 captures an image of the user's hand and fingers, and outputs image signals to the control portion 30A. Then, by performing image processing of the image signals from the image capture portion 40, the hand/finger identification portion 35 identifies the hand and two fingers used in the contact operation (step S10).
Next, based on the position signals for the two points from the operation portion 20, the range decision portion 31A decides the contact operation range D of the contact region 20 a having as starting and ending points the two positions P2 a, P2 b respectively of the contact operation (step S03A).
Next, based on the table stored in advance in ROM, the range decision portion 31A decides the target range A1 of the display region 10 a corresponding to the contact operation range D of the contact region 20 a (step S04A).
Next, as shown in FIG. 9, the processing menu display portion 32A causes the processing menu M to be displayed at the position relative to the target range A1 associated with the combination of the hand and two fingers identified by the hand/finger identification portion 35. In this aspect, the processing menu display portion 32A causes the processing menu M to be displayed in the center of the portion above the target range A1, corresponding to the combination of the thumb and little finger of the right hand (step S05A). Then, the processing of the above-described steps S06 to S09 is performed, and enlargement editing of the display image is performed, as shown in FIG. 10.
The information processing device 1A of the second aspect and the display information editing method of an information processing device of the second aspect provide advantages similar to those of the information processing device 1 of the first aspect and the display information editing method of an information processing device of the first aspect.
Further, by means of the information processing device 1A of the second aspect and the display information editing method of an information processing device of the second aspect, the hand and two fingers used in range specification are identified by the hand/finger identification portion 35, so that when for example range specification has been performed using the little finger and thumb of the right hand, by displaying the processing menu in the center of the portion above the specified range, ease of operation by the index finger of the right hand can be improved. Hence ease of operation can be improved without depending on the hand and fingers used for contact operations. Further, the user can perform operations with one hand, so that ease of operation can be further improved.
This invention is not limited to the above-described aspects, and various modifications are possible. For example, in these aspects, examples of a portable telephone terminal as the information processing device were described; but the information processing device may be a PDA or a wide range of other devices capable of information processing.
Further, in this aspect, an example of image processing using a camera was described as an identification method for identifying the hand and two fingers of a contact operation; but the identification method is not limited to that of this aspect. For example, an identification method may be employed in which a pressure-sensitive sensor, touch sensor, or similar is used in identification using contact surface pressure.
FIG. 12 is a block diagram of an information processing device using a pressure sensor as a pressure-sensitive sensor; FIG. 13 is a perspective view showing, partially exploded, the configuration of the information processing device of FIG. 12. The information processing device 1B of FIG. 12 and FIG. 13 differs from the information processing device 1A in comprising a pressure-sensitive sensor 40B and a control portion 30B in place of the image capture portion 40 and control portion 30A of the information processing device 1A. For example, the pressure-sensitive sensor 40B is a sheet-shape pressure sensor, and as shown in FIG. 13, is arranged below the operation portion 20, that is, below the electrostatic pad. The pressure sensor detects the pressure intensity of the fingers of the user performing a contact operation through the electrostatic pad, and outputs the detected pressure intensity to the control portion 30B.
The control portion 30B differs from the control portion 30A in comprising a hand/finger identification portion 35B in place of the hand/finger identification portion 35 in the control portion 30A. The hand/finger identification portion 35B identifies whether the hand performing a contact operation is the right hand or the left hand, and further identifies the two fingers of a contact operation among the thumb, index finger, middle finger, ring finger, and little finger, from the pressure intensity from the pressure-sensitive sensor 40B and the contact area from the electrostatic sensor in the operation portion 20.
In general, the intensity of the force and the contact area of fingers performing contact operations are greater for the favored hand, right or left, and are larger in the order of the thumb, index finger, middle finger, ring finger, and little finger. Hence from the pressure intensity and contact area of the fingers of a contact operation, the hand and fingers of the contact operation can be identified. For example, a table which associates the pressure intensities and contact areas of the fingers of the right and left hands with the fingers of the hands is stored in advance in ROM, as shown in FIG. 14. Based on this table, the hand/finger identification portion 35B identifies the hand and fingers corresponding to the pressure intensities from the pressure-sensitive sensor 40B and the contact area from the electrostatic sensor in the operation portion 20 as the hand and fingers of a contact operation.
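As a rough illustration of such a table lookup (the numeric ranges below are placeholders, not the measured values of FIG. 14):

```python
# Hypothetical per-finger ranges: ((min, max) pressure in mmHg,
# (min, max) contact area in mm^2). Real values would come from the
# registration procedure described below.
FINGER_TABLE = {
    ("right", "thumb"):  ((50, 70), (300, 450)),
    ("right", "index"):  ((40, 60), (220, 330)),
    ("right", "little"): ((20, 35), (80, 150)),
}

def identify_finger(pressure, area):
    """Return the (hand, finger) whose registered ranges contain the reading."""
    for finger, ((p_lo, p_hi), (a_lo, a_hi)) in FINGER_TABLE.items():
        if p_lo <= pressure <= p_hi and a_lo <= area <= a_hi:
            return finger
    return None  # unrecognized contact
```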
This table is for example registered in advance as follows. First, the fingers of the right hand and left hand are brought into contact with the operation portion 20 one at a time, in order. Next, the five fingers of the right hand are brought into contact simultaneously with the operation portion 20, and then the five fingers of the left hand are brought into contact simultaneously with the operation portion 20. Then, the five fingers and entire palm of the right hand are brought into contact simultaneously with the operation portion 20, and the five fingers and entire palm of the left hand are brought into contact simultaneously with the operation portion 20. Each contact operation is for example performed in two ways, touching lightly and pressing forcefully; the average values of the pressure intensity and contact area when touching lightly are taken to be minimum values and are registered in the table in association with the respective fingers, and the average values of the pressure intensity and contact area when pressing forcefully are taken to be maximum values and are registered in the table in association with the respective fingers. From these minimum values and maximum values, average values are determined, error correction values are also determined, and the average values and error correction values are registered in the table in association with the respective fingers.
When a specific user performs contact operations using specific fingers, the table may for example be registered in advance as follows. First, only the fingers to be registered are brought into contact with the operation portion 20, one at a time and in order. Next, the fingers to be registered are brought into contact simultaneously with the operation portion 20, using the method of contact operation the specific user performs most frequently. Again, each contact operation is performed in two ways, touching lightly and pressing forcefully; the light-touch averages of pressure intensity and contact area are registered in the table as minimum values, the forceful-press averages are registered as maximum values, and from these the average values and error correction values are determined and registered in the table, all in association with the respective fingers.
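A minimal sketch of this registration step, assuming the light-touch and forceful-press samples for one finger are already collected and that the error correction value is half the observed spread (the patent does not give the formula):

```python
def register_finger(samples_light, samples_forceful):
    """Build one table entry from (pressure, area) samples for one finger."""
    def mean(values):
        return sum(values) / len(values)
    p_min = mean([p for p, _ in samples_light])      # light touch -> minima
    a_min = mean([a for _, a in samples_light])
    p_max = mean([p for p, _ in samples_forceful])   # forceful press -> maxima
    a_max = mean([a for _, a in samples_forceful])
    return {
        "pressure_avg": (p_min + p_max) / 2,
        "area_avg": (a_min + a_max) / 2,
        "pressure_err": (p_max - p_min) / 2,  # assumed error-correction value
        "area_err": (a_max - a_min) / 2,
    }
```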
According to FIG. 14, for example, when the contact area is 600 mm2 or greater, the hand/finger identification portion 35B identifies the hand/finger of the contact operation as the palm of the hand; when the contact area is 150 mm2 or less, the hand/finger identification portion 35B identifies the finger of the contact operation as the little finger. And, for other fingers, the hand/finger identification portion 35B identifies the finger of a contact operation from the average value of the contact area and the average value of the pressure intensity.
In a case in which the pressure intensity average value is the same for the right and left hands, the hand/finger identification portion 35B may take into consideration the structure of the fingers, and based on the position of the contact operation from the operation portion 20, may identify the right or left hand from the positional relationship of the contact operation. For example, as shown in (a) of FIG. 15, when there is a contact operation with a pressure intensity of 48 mm Hg above and to the left of the position of a contact operation with a pressure intensity of 55 mm Hg, the thumb and index finger of the left hand may be identified from the positional relationship, and when, as shown in (b) of FIG. 15, there is a contact operation with a pressure intensity of 48 mm Hg above and to the right of the position of a contact operation with a pressure intensity of 55 mm Hg, the index finger and middle finger of the right hand may be identified from the positional relationship. In order to enhance reliability, the hand/finger identification portion 35B may also identify left and right hands based on average values of contact area.
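The two cases of FIG. 15 can be sketched as a simple positional tie-break; the rule below is a simplified reading of that description, with screen coordinates assumed to increase downward:

```python
def disambiguate_hand(strong_pos, weak_pos):
    """strong_pos: the 55 mmHg contact; weak_pos: the 48 mmHg contact."""
    weak_above = weak_pos[1] < strong_pos[1]
    weak_left = weak_pos[0] < strong_pos[0]
    if weak_above and weak_left:
        return "left", ("thumb", "index")     # case (a) of FIG. 15
    if weak_above and not weak_left:
        return "right", ("index", "middle")   # case (b) of FIG. 15
    return None, None  # fall back to comparing contact-area averages
```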
As shown in FIG. 16, the closer to the center of a finger pressing on the electrostatic pad and pressure sensor, the greater the pressing force; so by taking the position at which the pressing force is greatest to be the center, and plotting the range over which the pressing force is greater than zero, the contact area and the pressure intensity distribution can be ascertained.
Thereafter, based on the position signals for two points from the operation portion 20, the range decision portion 31A decides the contact operation range D of the contact region 20 a having as starting and ending points the two positions P2 a, P2 b respectively of the contact operation, and then decides the target range A1 of the display region 10 a corresponding to the contact operation range D of the contact region 20 a. Next, based on the hand and two fingers identified by the hand/finger identification portion 35, the processing menu display portion 32A causes the processing menu M to be displayed at the position relative to the target range A1 associated with the combination of the hand and two fingers.
In this aspect and modified example, the processing menu display portion 32A causes the processing menu M to be displayed at the display position corresponding to the combination of identified fingers, based on a table stored in ROM in advance which associates a plurality of patterns of combinations of fingers with a plurality of patterns of display positions of the processing menu M; but the processing menu display portion 32A may instead assign priorities to the fingers and, based on these priorities, predict the optimum finger to perform processing menu selection, and cause the processing menu M to be displayed accordingly. For example, based on priorities assigned in the order of thumb, index finger, middle finger, ring finger, little finger, the processing menu display portion 32A may predict that the finger with the highest priority, other than the fingers of the contact operation, which exists at a position not overlapping with the contact operation range D of the contact region 20 a, is the optimum finger for performing processing menu selection, and may cause the processing menu M to be displayed at a position in the display region 10 a corresponding to the predicted position of this finger.
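A minimal sketch of this priority rule, assuming predicted finger positions are available from some other source (the patent does not say how they would be estimated):

```python
PRIORITY = ["thumb", "index", "middle", "ring", "little"]

def predict_selection_finger(contact_fingers, finger_positions, range_d):
    """Pick the highest-priority free finger outside the contact range D.

    finger_positions: dict of predicted (x, y) per finger;
    range_d: ((x0, y0), (x1, y1)) contact operation range.
    """
    (x0, y0), (x1, y1) = range_d
    def overlaps(pos):
        return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1
    for finger in PRIORITY:
        if finger in contact_fingers:
            continue
        pos = finger_positions.get(finger)
        if pos is not None and not overlaps(pos):
            return finger, pos
    return None, None  # no suitable finger predicted
```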
For example, when as shown in FIG. 17 the fingers at the two positions P2 a and P2 b of a contact operation are the thumb and middle finger of the right hand respectively, the index finger overlaps with the contact operation range D of the contact region 20 a defined by the two positions P2 a, P2 b of the contact operation, so that the processing menu display portion 32A predicts that the optimum finger for performing processing menu selection is the ring finger, and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this ring finger. And, when as for example shown in FIG. 18 the fingers at the three positions P2 a, P2 b, P2 c of a contact operation are respectively the thumb, index finger, and palm of the right hand, there are no fingers overlapping the contact operation range D of the contact region 20 a defined by the three positions P2 a, P2 b, P2 c of the contact operation, so that the processing menu display portion 32A predicts that the optimum finger to perform processing menu selection is the middle finger, and causes the processing menu M to be displayed at a position of the display region 10 a corresponding to the position of this middle finger.
Here, when the fingers for use in a contact operation are the fingers of both hands, the processing menu display portion 32A may predict the optimum finger to perform processing menu selection based on further priorities in the order of the favored hand and the unfavored hand. For example, when the fingers of two positions P2 a, P2 b of a contact operation are the thumb of the left hand and the thumb of the right hand respectively, as shown in FIG. 19, the processing menu display portion 32A predicts that the optimum finger for performing processing menu selection is the index finger of the favored hand (for example, the right hand), and so causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this index finger. At this time, the processing menu display portion 32A causes the processing menu M to be displayed at a position of the display region 10 a corresponding to a position not overlapping with the contact operation range D of the contact region 20 a.
When the predicted optimum finger, that is, the finger other than a contact operation finger which is the finger on the favored hand with highest priority, overlaps with the contact operation range D, the finger other than the contact operation fingers with highest priority on the unfavored hand may be predicted to be the optimum finger for performing processing menu selection. For example, when as shown in FIG. 20 the fingers of the two positions P2 a, P2 b of a contact operation are respectively the index finger of the left hand and the index finger of the right hand, and the thumb of the favored hand (for example, the right hand) overlaps with the contact operation range D, the thumb of the unfavored hand (for example, the left hand) may be predicted to be the optimum finger for performing processing menu selection.
Further, the processing menu display portion 32A may predict the optimum finger to perform processing menu selection based on the positional relationship of the fingers of the contact operation and on the position of the contact operation range D of the contact region 20 a, and may cause display of the processing menu M. For example, when as shown in FIG. 21 the fingers of the two positions P2 a, P2 b of a contact operation are the middle finger and thumb of the right hand respectively, and the positions P2 a, P2 b are near the center on the left side and near the center on the right side respectively, the processing menu display portion 32A predicts that the optimum finger for performing processing menu selection is the index finger, which is positioned between the position P2 a (middle finger) and the position P2 b (thumb), and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this index finger. At this time, the processing menu display portion 32A causes the processing menu M to be displayed at the position of the display region 10 a corresponding to a position which does not overlap with the contact operation range D of the contact region 20 a. Also, when for example as shown in FIG. 22 the fingers of the two positions P2 a, P2 b of a contact operation are the index finger and the ring finger of the right hand respectively, and the positions P2 a, P2 b are respectively in the lower left and lower right, the processing menu display portion 32A predicts that the optimum finger for performing processing menu selection is the middle finger, positioned between the position P2 a (index finger) and the position P2 b (ring finger), and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this middle finger. At this time, the processing menu display portion 32A causes the processing menu M to be displayed at the position of the display region 10 a corresponding to a position which does not overlap with the contact operation range D of the contact region 20 a.
Here, when there are three or more contact operation fingers, so that for example as shown in FIG. 23 the fingers in three positions P2 a, P2 b, P2 c of a contact operation are respectively the thumb, index finger, and ring finger of the right hand, and the positions P2 a, P2 b, P2 c are respectively in the center-left, upper left, and upper right, the processing menu display portion 32A predicts that the optimum finger for performing processing menu selection is the middle finger, positioned between the position P2 b (index finger) and the position P2 c (ring finger), and causes the processing menu M to be displayed at the position of the display region 10 a corresponding to the position of this middle finger. At this time, the processing menu display portion 32A causes the processing menu M to be displayed at the position of the display region 10 a corresponding to a position which does not overlap with the contact operation range D of the contact region 20 a.
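The in-between prediction of FIG. 21 to FIG. 23 can be sketched as an ordering check over the fingers of one hand; with three or more contact fingers, the same check would be applied to each adjacent pair (the ordering list is an assumption):

```python
HAND_ORDER = ["thumb", "index", "middle", "ring", "little"]

def finger_between(finger_a, finger_b):
    """Return the single finger lying between two contact fingers, if any."""
    i, j = sorted((HAND_ORDER.index(finger_a), HAND_ORDER.index(finger_b)))
    if j - i == 2:
        return HAND_ORDER[i + 1]  # e.g., middle between index and ring
    return None
```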
Thus in this modified example, by using an electrostatic sensor as the operation portion 20 and detecting the contact areas, and using a pressure sensor as a pressure-sensitive sensor 40B and detecting pressure intensities, the optimum finger for performing processing menu selection is predicted, and the processing menu M can be displayed at the optimum position, so that ease of operation by the user can be further improved.
In this modified example, an identification method which performed identification using a fingerprint sensor or similar may be applied as the identification method for identifying the hand and two fingers of a contact operation. FIG. 24 is a block diagram of an information processing device employing a fingerprint sensor, and FIG. 25 is a perspective view showing the configuration of the information processing device shown in FIG. 24. As shown in FIG. 24 and FIG. 25, the information processing device 1C comprises a fingerprint sensor 40C and control portion 30C in place of the pressure-sensitive sensor 40B and control portion 30B in the information processing device 1B. For example, the fingerprint sensor 40C is a sheet-shape fingerprint sensor, and as shown in FIG. 25, is arranged below the operation portion 20, that is, below the electrostatic pad. The fingerprint sensor detects the fingerprints of the fingers of the user performing contact operations through the electrostatic pad, and outputs the detected fingerprints to the control portion 30C.
The control portion 30C differs from the control portion 30B in comprising a hand/finger identification portion 35C in place of the hand/finger identification portion 35B in the control portion 30B. The hand/finger identification portion 35C identifies the hand of a contact operation as the right hand or left hand through fingerprints from the fingerprint sensor 40C, and identifies the two fingers of the contact operation among the thumb, index finger, middle finger, ring finger, and little finger. For example, a table which associates the fingerprints of the fingers of the right and left hands with the fingers is stored in advance in ROM. The hand/finger identification portion 35C identifies the finger corresponding to a fingerprint from the fingerprint sensor 40C as the finger of a contact operation based on this table. This table may for example be stored in advance by making measurements using a fingerprint sensor 41.
In this aspect, the display size and position of the processing menu are fixed; however, through dragging operations using the user's fingers or similar, the display size of the processing menu may be made modifiable, or it may be possible to move the processing menu.
Further, in these aspects the range decision processing to decide the target range is performed by the range decision portions 31 and 31A independently of the timing of contact operations for the two points; but the target range may be decided only when the range decision portions 31, 31A receive the position signals for two points from the operation portion 20 simultaneously, that is, when the two-point contact operation is performed simultaneously. By this means, for example, an information input operation performed by single-point contact, and a range specification operation performed by two-point simultaneous contact, can be discriminated, so that even during information input operations, display information editing processing can be caused to be performed.
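For illustration, simultaneity could be approximated with a short time window; the 50 ms figure is an assumption, since the patent only says the two position signals are received simultaneously:

```python
SIMULTANEITY_WINDOW = 0.05  # seconds; assumed tolerance

def is_two_point_range_specification(t_first, t_second):
    """Treat two contacts as one range specification if nearly simultaneous."""
    return abs(t_second - t_first) <= SIMULTANEITY_WINDOW
```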
In these aspects, an electrostatic pad was described as an example of the operation portion 20; but a touch panel, optical sensors, or a wide variety of other devices capable of detecting contact operations can be employed as the operation portion 20.

Claims (11)

1. An information processing device, comprising:
display means for displaying information;
operation means, having a contact region corresponding to a display region of the display means, for detecting a contact operation in the contact region and outputting a position signal indicating the position of the contact operation in the contact region;
range decision means for deciding a target range of the display region, based on the position signals for two points from the operation means;
processing menu display means for causing a processing menu to appear on the display means at a position corresponding to the target range of the display region decided by the range decision means, the processing menu receiving selection of various processing;
processing decision means for deciding processing of the processing menu corresponding to the position of the contact operation in the contact region, based on the position signal for a third point from the operation means; and
processing execution means for executing the processing decided by the processing decision means on the target range of the display region decided by the range decision means.
2. The information processing device according to claim 1, wherein the processing menu display means causes the display means to display the processing menu at a position set in advance for the target range of the display region.
3. The information processing device according to claim 1, further comprising identification means for identifying whether a hand performing the contact operation in the contact region is the right hand or the left hand, and for identifying two fingers of the contact operation, wherein the processing menu display means causes the display means to display the processing menu at a position relative to the target range of the display region associated with the combination of the hand and the two fingers, based on the hand and the two fingers identified by the identification means.
4. The information processing device according to claim 3, further comprising:
an identification table including a plurality of gestures and used by the identification means for identifying the hand performing the contact.
5. The information processing device according to claim 1, wherein, when the position of the contact operation indicated by the position signals for two points from the operation means changes, the range decision means re-decides the target range of the display region.
6. The information processing device according to claim 1, wherein the range decision means decides the target range of the display region only when the position signals for two points are received simultaneously from the operation means.
7. The information processing device according to claim 1, wherein the contact region is separate from the display region.
8. A display information editing method of an information processing device which has display means for displaying information and operation means having a contact region corresponding to a display region of the display means, comprising the steps of:
detecting a contact operation in the contact region, and generating a position signal indicating the position of the contact operation in the contact region;
deciding a target range of the display region based on the position signals for two points;
causing a processing menu to appear on the display means at a position corresponding to the decided target range of the display region, the processing menu receiving selection of various processing;
deciding processing of the processing menu corresponding to the position of the contact operation in the contact region, based on the position signal for a third point; and
executing the decided processing on the decided target range of the display region.
9. An information processing device, comprising:
a display to display information;
an operation unit, having a contact region for interacting with a display region of the display, to detect a contact operation in the contact region and output a position signal indicating the position of the contact operation in the contact region;
range decision unit to determine a target range of the display region, based on the position signals for two points from the operation unit;
processing menu display unit to cause a processing menu to appear on the display, at a position corresponding to the target range of the display region decided by the range decision unit, the processing menu receiving selection of various processing;
processing decision unit to determine processing of the processing menu corresponding to the position of the contact operation in the contact region, based on the position signal for a third point from the operation unit; and
processing execution unit to execute the processing decided by the processing decision unit on the target range of the display region decided by the range decision unit.
10. An information processing device, comprising:
display means for displaying information;
operation means, having a contact region corresponding to a display region of the display means, for detecting a contact operation in the contact region and outputting a position signal indicating the position of the contact operation in the contact region;
range decision means for deciding a target range of the display region, based on the position signals for two points from the operation means;
processing menu display means for causing the display means to display a processing menu for selection of various processing, at a position corresponding to the target range of the display region decided by the range decision means;
processing decision means for deciding processing of the processing menu corresponding to the position of the contact operation in the contact region, based on the position signal for a third point from the operation means; and
processing execution means for executing the processing decided by the processing decision means on the target range of the display region decided by the range decision means,
wherein the range decision means decides the target range of the display region, based on the position signals for the two points from the operation means and based upon a table associating coordinates of the contact region to coordinates of the display region.
11. The information processing device according to claim 10, wherein the processing menu is displayed in a lower right portion of the target range.
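For orientation, a minimal end-to-end sketch of the flow recited in claims 1, 8, and 10, under assumed names (decide_target_range, show_menu, menu_item_at, execute); the claims define the behavior, not this code:

def decide_target_range(p1, p2):
    # Rectangle spanned by the two contact points, after contact-region
    # coordinates are mapped to display-region coordinates (cf. the
    # coordinate-association table recited in claim 10).
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

def edit_flow(two_points, third_point, menu_items, display):
    # 1. Position signals for two points decide the target range.
    target = decide_target_range(*two_points)
    # 2. The processing menu appears at a position corresponding to the
    #    range, here its lower-right corner (cf. claim 11).
    display.show_menu(menu_items, position=(target[2], target[3]))
    # 3. The position signal for a third point selects a menu item.
    chosen = display.menu_item_at(third_point)
    # 4. The decided processing (cut, copy, etc.) is executed on the range.
    display.execute(chosen, target)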
US12/463,693 2008-05-13 2009-05-11 Information processing device and display information editing method of information processing device Expired - Fee Related US8266529B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2008126304A JP5203797B2 (en) 2008-05-13 2008-05-13 Information processing apparatus and display information editing method for information processing apparatus
JP2008-126304 2008-05-13
JPP2008-126304 2008-05-13

Publications (2)

Publication Number Publication Date
US20090287999A1 US20090287999A1 (en) 2009-11-19
US8266529B2 true US8266529B2 (en) 2012-09-11

Family

ID=40718871

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/463,693 Expired - Fee Related US8266529B2 (en) 2008-05-13 2009-05-11 Information processing device and display information editing method of information processing device

Country Status (5)

Country Link
US (1) US8266529B2 (en)
EP (1) EP2120131B1 (en)
JP (1) JP5203797B2 (en)
KR (1) KR101150321B1 (en)
CN (1) CN101582008B (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110141043A1 (en) * 2009-12-11 2011-06-16 Dassault Systemes Method and sytem for duplicating an object using a touch-sensitive display
US20130033717A1 (en) * 2011-08-03 2013-02-07 Sharp Kabushiki Kaisha Image forming apparatus, image editing method and non-transitory computer-readable recording medium
US9001063B2 (en) 2012-04-27 2015-04-07 Kabushiki Kaisha Toshiba Electronic apparatus, touch input control method, and storage medium
USD752048S1 (en) * 2013-01-29 2016-03-22 Aquifi, Inc. Display device with cameras
USD752585S1 (en) * 2013-01-29 2016-03-29 Aquifi, Inc. Display device with cameras
USD753655S1 (en) * 2013-01-29 2016-04-12 Aquifi, Inc. Display device with cameras
USD753656S1 (en) * 2013-01-29 2016-04-12 Aquifi, Inc. Display device with cameras
USD753657S1 (en) * 2013-01-29 2016-04-12 Aquifi, Inc. Display device with cameras
USD753658S1 (en) * 2013-01-29 2016-04-12 Aquifi, Inc. Display device with cameras
US9557846B2 (en) 2012-10-04 2017-01-31 Corning Incorporated Pressure-sensing touch system utilizing optical and capacitive systems
US9830067B1 (en) 2010-11-18 2017-11-28 Google Inc. Control of display of content with dragging inputs on a touch input surface

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4719296B1 (en) * 2009-12-25 2011-07-06 株式会社東芝 Information processing apparatus and information processing method
JP5702540B2 (en) * 2010-02-18 2015-04-15 ローム株式会社 Touch panel input device
JP5702546B2 (en) * 2010-03-19 2015-04-15 ローム株式会社 Touch panel input device
US9250800B2 (en) 2010-02-18 2016-02-02 Rohm Co., Ltd. Touch-panel input device
JP2011197848A (en) * 2010-03-18 2011-10-06 Rohm Co Ltd Touch-panel input device
JP5502584B2 (en) * 2010-04-26 2014-05-28 ローム株式会社 Digital camera
JP2011227703A (en) * 2010-04-20 2011-11-10 Rohm Co Ltd Touch panel input device capable of two-point detection
JP5657269B2 (en) * 2010-04-26 2015-01-21 シャープ株式会社 Image processing apparatus, display apparatus, image processing method, image processing program, and recording medium
US20120030624A1 (en) * 2010-07-30 2012-02-02 Migos Charles J Device, Method, and Graphical User Interface for Displaying Menus
WO2012019350A1 (en) * 2010-08-12 2012-02-16 Google Inc. Finger identification on a touchscreen
CN102375652A (en) * 2010-08-16 2012-03-14 中国移动通信集团公司 Mobile terminal user interface regulation system and method
KR101763263B1 (en) * 2010-12-24 2017-07-31 삼성전자주식회사 3d display terminal apparatus and operating method
US8593421B2 (en) * 2011-03-22 2013-11-26 Adobe Systems Incorporated Local coordinate frame user interface for multitouch-enabled devices
JP2013041348A (en) * 2011-08-12 2013-02-28 Kyocera Corp Portable terminal, auxiliary information display program, and auxiliary information display method
WO2013044467A1 (en) * 2011-09-28 2013-04-04 宇龙计算机通信科技(深圳)有限公司 Terminal and menu display method
KR101880653B1 (en) * 2011-10-27 2018-08-20 삼성전자 주식회사 Device and method for determinating a touch input of terminal having a touch panel
CN104137038B (en) * 2012-01-09 2017-08-25 谷歌公司 The Intelligent touch screen keyboard differentiated with finger
US9563295B2 (en) * 2012-03-06 2017-02-07 Lenovo (Beijing) Co., Ltd. Method of identifying a to-be-identified object and an electronic device of the same
KR102040857B1 (en) * 2012-07-17 2019-11-06 삼성전자주식회사 Function Operation Method For Electronic Device including a Pen recognition panel And Electronic Device supporting the same
KR20140051719A (en) * 2012-10-23 2014-05-02 엘지전자 주식회사 Mobile terminal and control method thereof
CN103984495B (en) * 2013-02-07 2016-12-28 纬创资通股份有限公司 Operational approach and electronic installation
CN103164160A (en) * 2013-03-20 2013-06-19 华为技术有限公司 Left hand and right hand interaction device and method
JP6029638B2 (en) * 2014-02-12 2016-11-24 ソフトバンク株式会社 Character input device and character input program
KR101901234B1 (en) * 2018-01-18 2018-09-27 삼성전자 주식회사 Method and device for inputting of mobile terminal using a pen
KR102257614B1 (en) * 2020-03-31 2021-05-28 김은아 System and method for converting input signal from touch sensitive input device into variabl tactile effect

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06289984A (en) 1993-03-31 1994-10-18 Toshiba Corp Document preparing editing device
EP0622722A2 (en) 1993-04-30 1994-11-02 Rank Xerox Limited Interactive copying system
JP2000163031A (en) 1998-11-25 2000-06-16 Seiko Epson Corp Portable information equipment and information storage medium
JP2000311040A (en) 1998-10-19 2000-11-07 Toshihiko Okabe Device and method for data delivery and recording medium recording data delivery program
US20040150668A1 (en) 2003-01-31 2004-08-05 Xerox Corporation Secondary touch contextual sub-menu navigation for touch screen interface
US20050264540A1 (en) * 2004-06-01 2005-12-01 Souhei Niwa Data processing device, data processing method, and electronic device
US7002557B2 (en) * 2002-01-30 2006-02-21 Casio Computer Co., Ltd. Portable electronic apparatus and a display control method
US20060097991A1 (en) * 2004-05-06 2006-05-11 Apple Computer, Inc. Multipoint touchscreen
US20060110203A1 (en) * 2004-05-17 2006-05-25 Grafton Charlene H Dual numerical keyboard based on dominance
KR100672605B1 (en) 2006-03-30 2007-01-24 엘지전자 주식회사 Method for selecting items and terminal therefor
AU2007100826A4 (en) 2007-01-05 2007-09-27 Apple Inc. Multimedia communication device with touch screen responsive to gestures for controlling, manipulating, and editing of media files
US20070229474A1 (en) * 2006-03-29 2007-10-04 Yamaha Corporation Parameter editor and signal processor
US20070257890A1 (en) * 2006-05-02 2007-11-08 Apple Computer, Inc. Multipoint touch surface controller
KR100783553B1 (en) 2007-01-22 2007-12-07 삼성전자주식회사 Mobile device, method for generating group picture of phonebook in the same and method of executing communication event using the group picture
US20080309632A1 (en) * 2007-06-13 2008-12-18 Apple Inc. Pinch-throw and translation gestures
US20090138800A1 (en) * 2007-11-23 2009-05-28 Mckesson Financial Holdings Limited Apparatus, method and computer-readable storage medium for directing operation of a software application via a touch-sensitive surface
US7667692B2 (en) * 2003-10-31 2010-02-23 Zeemote, Inc. Human interface system
US7770136B2 (en) * 2007-01-24 2010-08-03 Microsoft Corporation Gesture recognition interactive feedback
US7883460B2 (en) * 2004-04-02 2011-02-08 Olympus Corporation Endoscope
US20110055703A1 (en) * 2009-09-03 2011-03-03 Niklas Lundback Spatial Apportioning of Audio in a Large Scale Multi-User, Multi-Touch System
US20110060986A1 (en) * 2009-09-10 2011-03-10 Chao-Kuang Yang Method for Controlling the Display of a Touch Screen, User Interface of the Touch Screen, and an Electronic Device using The Same
US20110057907A1 (en) * 2009-09-10 2011-03-10 Samsung Electronics Co., Ltd. Apparatus and method for determining user input pattern in portable terminal
US20110141043A1 (en) * 2009-12-11 2011-06-16 Dassault Systemes Method and sytem for duplicating an object using a touch-sensitive display

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101180599A (en) * 2005-03-28 2008-05-14 松下电器产业株式会社 User interface system

Patent Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06289984A (en) 1993-03-31 1994-10-18 Toshiba Corp Document preparing editing device
EP0622722A2 (en) 1993-04-30 1994-11-02 Rank Xerox Limited Interactive copying system
JP2000311040A (en) 1998-10-19 2000-11-07 Toshihiko Okabe Device and method for data delivery and recording medium recording data delivery program
JP2000163031A (en) 1998-11-25 2000-06-16 Seiko Epson Corp Portable information equipment and information storage medium
US7002557B2 (en) * 2002-01-30 2006-02-21 Casio Computer Co., Ltd. Portable electronic apparatus and a display control method
US20040150668A1 (en) 2003-01-31 2004-08-05 Xerox Corporation Secondary touch contextual sub-menu navigation for touch screen interface
US7667692B2 (en) * 2003-10-31 2010-02-23 Zeemote, Inc. Human interface system
US7883460B2 (en) * 2004-04-02 2011-02-08 Olympus Corporation Endoscope
US7663607B2 (en) * 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
US20060097991A1 (en) * 2004-05-06 2006-05-11 Apple Computer, Inc. Multipoint touchscreen
US20060110203A1 (en) * 2004-05-17 2006-05-25 Grafton Charlene H Dual numerical keyboard based on dominance
US7705799B2 (en) * 2004-06-01 2010-04-27 Nec Corporation Data processing device, data processing method, and electronic device
US20050264540A1 (en) * 2004-06-01 2005-12-01 Souhei Niwa Data processing device, data processing method, and electronic device
US20070229474A1 (en) * 2006-03-29 2007-10-04 Yamaha Corporation Parameter editor and signal processor
KR100672605B1 (en) 2006-03-30 2007-01-24 엘지전자 주식회사 Method for selecting items and terminal therefor
US20070257890A1 (en) * 2006-05-02 2007-11-08 Apple Computer, Inc. Multipoint touch surface controller
AU2007100826A4 (en) 2007-01-05 2007-09-27 Apple Inc. Multimedia communication device with touch screen responsive to gestures for controlling, manipulating, and editing of media files
KR100783553B1 (en) 2007-01-22 2007-12-07 삼성전자주식회사 Mobile device, method for generating group picture of phonebook in the same and method of executing communication event using the group picture
US7770136B2 (en) * 2007-01-24 2010-08-03 Microsoft Corporation Gesture recognition interactive feedback
US20080309632A1 (en) * 2007-06-13 2008-12-18 Apple Inc. Pinch-throw and translation gestures
US20090138800A1 (en) * 2007-11-23 2009-05-28 Mckesson Financial Holdings Limited Apparatus, method and computer-readable storage medium for directing operation of a software application via a touch-sensitive surface
US20110055703A1 (en) * 2009-09-03 2011-03-03 Niklas Lundback Spatial Apportioning of Audio in a Large Scale Multi-User, Multi-Touch System
US20110060986A1 (en) * 2009-09-10 2011-03-10 Chao-Kuang Yang Method for Controlling the Display of a Touch Screen, User Interface of the Touch Screen, and an Electronic Device using The Same
US20110057907A1 (en) * 2009-09-10 2011-03-10 Samsung Electronics Co., Ltd. Apparatus and method for determining user input pattern in portable terminal
US20110141043A1 (en) * 2009-12-11 2011-06-16 Dassault Systemes Method and sytem for duplicating an object using a touch-sensitive display

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
European Search Report issued on Jun. 1, 2012, in counterpart European Application No. 09 15 9997 (7 pages).
Office Action issued Apr. 17, 2012, in Japanese Patent Application No. 2008-126304 filed May 13, 2008 (with English translation).
Office Action issued Dec. 23, 2010, in Korean Patent Application No. 10-2009-0041764 (with English translation).
Office Action issued Sep. 30, 2011, in Korean Patent Application No. 10-2009-0041764 (with English translation).
Rekimoto, Jun, "SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces", CHI Conference on Human Factors in Computers, Minneapolis, Minnesota, Apr. 20-25, 2002, CHI 2002 Conference Proceedings, pp. 113-120; XP-001099406.

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110141043A1 (en) * 2009-12-11 2011-06-16 Dassault Systemes Method and sytem for duplicating an object using a touch-sensitive display
US8896549B2 (en) * 2009-12-11 2014-11-25 Dassault Systemes Method and system for duplicating an object using a touch-sensitive display
US9830067B1 (en) 2010-11-18 2017-11-28 Google Inc. Control of display of content with dragging inputs on a touch input surface
US10671268B2 (en) 2010-11-18 2020-06-02 Google Llc Orthogonal dragging on scroll bars
US11036382B2 (en) 2010-11-18 2021-06-15 Google Llc Control of display of content with dragging inputs on a touch input surface
US8934109B2 (en) * 2011-08-03 2015-01-13 Sharp Kabushiki Kaisha Image forming apparatus, image editing method and non-transitory computer-readable recording medium for forming an image on a recording medium based on an image displayed on a display section
US10701222B2 (en) * 2011-08-03 2020-06-30 Sharp Kabushiki Kaisha Image forming apparatus, image editing method and non-transitory computer-readable recording medium for forming an image on a recording medium based on an image displayed on a display section
US10051140B2 (en) 2011-08-03 2018-08-14 Sharp Kabushiki Kaisha Image editing method for modifying an object image with respect to a medium image
US20190268489A1 (en) * 2011-08-03 2019-08-29 Sharp Kabushiki Kaisha Image forming apparatus, image editing method and non-transitory computer-readable recording medium for forming an image on a recording medium based on an image displayed on a display section
US10341510B2 (en) 2011-08-03 2019-07-02 Sharp Kabushiki Kaisha Image forming apparatus, image editing method and non-transitory computer-readable recording medium for forming an image on a recording medium based on an image displayed on a display section
US9432533B2 (en) 2011-08-03 2016-08-30 Sharp Kabushiki Kaisha Image forming apparatus, image editing method and non-transitory computer-readable recording medium for forming an image on a recording medium based on an image displayed on a display section
US9692919B2 (en) * 2011-08-03 2017-06-27 Sharp Kabushiki Kaisha Image forming apparatus, image editing method and non-transitory computer-readable recording medium for forming an image on a recording medium based on an image displayed on a display section
US20130033717A1 (en) * 2011-08-03 2013-02-07 Sharp Kabushiki Kaisha Image forming apparatus, image editing method and non-transitory computer-readable recording medium
US9001063B2 (en) 2012-04-27 2015-04-07 Kabushiki Kaisha Toshiba Electronic apparatus, touch input control method, and storage medium
US9557846B2 (en) 2012-10-04 2017-01-31 Corning Incorporated Pressure-sensing touch system utilizing optical and capacitive systems
USD752048S1 (en) * 2013-01-29 2016-03-22 Aquifi, Inc. Display device with cameras
USD753658S1 (en) * 2013-01-29 2016-04-12 Aquifi, Inc. Display device with cameras
USD753657S1 (en) * 2013-01-29 2016-04-12 Aquifi, Inc. Display device with cameras
USD753656S1 (en) * 2013-01-29 2016-04-12 Aquifi, Inc. Display device with cameras
USD753655S1 (en) * 2013-01-29 2016-04-12 Aquifi, Inc. Display device with cameras
USD752585S1 (en) * 2013-01-29 2016-03-29 Aquifi, Inc. Display device with cameras

Also Published As

Publication number Publication date
EP2120131B1 (en) 2015-10-21
CN101582008B (en) 2012-06-13
EP2120131A3 (en) 2012-07-04
KR20090118872A (en) 2009-11-18
JP5203797B2 (en) 2013-06-05
CN101582008A (en) 2009-11-18
US20090287999A1 (en) 2009-11-19
KR101150321B1 (en) 2012-06-08
EP2120131A2 (en) 2009-11-18
JP2009276926A (en) 2009-11-26

Similar Documents

Publication Publication Date Title
US8266529B2 (en) Information processing device and display information editing method of information processing device
US10444989B2 (en) Information processing apparatus, and input control method and program of information processing apparatus
US10671280B2 (en) User input apparatus, computer connected to user input apparatus, and control method for computer connected to user input apparatus, and storage medium
US9671893B2 (en) Information processing device having touch screen with varying sensitivity regions
US8739053B2 (en) Electronic device capable of transferring object between two display units and controlling method thereof
US8381118B2 (en) Methods and devices that resize touch selection zones while selected on a touch sensitive display
JP4372188B2 (en) Information processing apparatus and display control method
US20070126711A1 (en) Input device
WO2011101940A1 (en) Mobile terminal and control method thereof
US20110169760A1 (en) Device for control of electronic apparatus by manipulation of graphical objects on a multicontact touch screen
WO2004010276A1 (en) Information display input device and information display input method, and information processing device
US20130063385A1 (en) Portable information terminal and method for controlling same
JP5713180B2 (en) Touch panel device that operates as if the detection area is smaller than the display area of the display.
US10048726B2 (en) Display control apparatus, control method therefor, and storage medium storing control program therefor
US10564844B2 (en) Touch-control devices and methods for determining keys of a virtual keyboard
JPWO2009031213A1 (en) Portable terminal device and display control method
US20130215037A1 (en) Multi-touch surface keyboard with multi-key zones on an adaptable home line and method of inputting to same
WO2016208099A1 (en) Information processing device, input control method for controlling input upon information processing device, and program for causing information processing device to execute input control method
KR20140033726A (en) Method and apparatus for distinguishing five fingers in electronic device including touch screen
JP5995171B2 (en) Electronic device, information processing method, and information processing program
US20220342530A1 (en) Touch sensor, touch pad, method for identifying inadvertent touch event and computer device
JP2010231480A (en) Handwriting processing apparatus, program, and method
JP2018170048A (en) Information processing apparatus, input method, and program
TWI522895B (en) Interface operating method and portable electronic apparatus using the same
US11822743B2 (en) Touch panel information terminal apparatus and information input processing method implemented with dual input devices arranged on two surfaces

Legal Events

Date Code Title Description
AS Assignment

Owner name: NTT DOCOMO, INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OOI, NORIKO;MURAKAMI, KEIICHI;ENDO, KENTARO;REEL/FRAME:022865/0066

Effective date: 20090518

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20200911