CN105338192A - Mobile terminal and operation processing method thereof - Google Patents
- Publication number: CN105338192A
- Application number: CN201510833716.4A
- Authority: CN (China)
- Legal status: Pending (an assumption; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Classifications: User Interface of Digital Computer; Telephone Function
Abstract
The invention discloses a mobile terminal and an operation processing method thereof. The mobile terminal comprises: a display unit, configured to load a graphical interface in the display area of the display unit; a detection unit, configured to collect data characterizing a user's eye motion; and a control unit, configured to analyze the data characterizing the user's eye motion, determine feature information of the user's eye motion, determine the focal object in the graphical interface presented by the display unit that corresponds to the user's line of sight, control the display unit to update the display state of the focal object, and, according to the instruction corresponding to the user's eye motion, control the display unit to display the result of the focal object responding to the user's eye motion. By implementing this mobile terminal, the user is supported in operating the mobile terminal by combining line of sight with eye motions, making operation efficient and convenient.
Description
Technical field
The present invention relates to mobile terminal operating technologies in the communications field, and in particular to a mobile terminal and an operation processing method thereof.
Background
Daily life now depends more and more on mobile terminals such as smartphones and tablet computers, and applications based on mobile terminals have penetrated every aspect of daily life.
Correspondingly, users' requirements on the functions and hardware of mobile terminals keep rising. One of the most prominent trends is that the display units of mobile terminals are made larger and larger, providing users with high-resolution screens and ample application space, together with higher-performance integrated chips and longer battery life.
However, a mobile terminal with a large display unit (e.g. larger than 5 inches) also has its own shortcomings and brings certain inconvenience to the user's operation: in particular, when the user holds the terminal with one hand or has both hands occupied with other things, the terminal is difficult to operate, and the user experience is poor.
Summary of the invention
The embodiments of the present invention provide a mobile terminal and an operation processing method thereof, which support the user in operating the mobile terminal by combining the line of sight of the eyes with eye motions, making operation efficient and convenient.
The technical solutions of the embodiments of the present invention are implemented as follows:
An embodiment of the present invention provides a mobile terminal, the mobile terminal comprising:
a display unit, configured to load a graphical interface in the display area of the display unit;
a detection unit, configured to collect data characterizing a user's eye motion; and
a control unit, configured to analyze the data characterizing the user's eye motion and determine feature information of the user's eye motion;
the control unit being further configured to determine the focal object in the graphical interface presented by the display unit that corresponds to the user's line of sight, and to control the display unit to update the display state of the focal object; and, based on the instruction corresponding to the user's eye motion, to control the display unit to display the result of the focal object responding to the user's eye motion.
An embodiment of the present invention further provides an operation processing method, the method comprising:
loading a graphical interface in the display area of a display unit;
collecting data characterizing a user's eye motion, analyzing the data characterizing the user's eye motion, and determining feature information of the user's eye motion; and
determining the focal object in the graphical interface presented by the display unit that corresponds to the user's line of sight, and updating the display state of the focal object; and, based on the instruction corresponding to the user's eye motion, displaying the result of the focal object responding to the user's eye motion.
With the embodiments of the present invention, the user can operate the mobile terminal without using the hands: by gazing at the position of an object on the display unit of the mobile terminal, the user selects the object to operate on (an entry of any type, such as a message or a contact); by a specific eye motion such as moving the eyes up or down, the user switches the content displayed on the display unit (e.g. switching by scrolling or by page turning); and by another specific eye motion, such as glancing left or right, the user operates on the focal object (e.g. deleting or forwarding a contact or message, or sending a short message to or dialing a contact entry in the address book). Operating efficiency is thereby improved.
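The operation flow described above — gaze selects and highlights a focal object, then specific eye motions issue scroll, delete, or forward commands — can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; all class, method, and action names (`EyeDrivenList`, `gaze_at`, `glance_left`, etc.) are assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Entry:
    label: str              # e.g. a message or contact list item
    highlighted: bool = False

@dataclass
class EyeDrivenList:
    entries: List[Entry]
    top: int = 0                    # index of the first visible entry
    visible: int = 3                # entries shown at once
    focus: Optional[int] = None     # index of the focal object, if any

    def gaze_at(self, row: int) -> None:
        """Map a gaze drop point (a visible row) to a focal object and
        update its display state (highlight), as the control unit does."""
        idx = self.top + row
        if 0 <= idx < len(self.entries):
            if self.focus is not None:
                self.entries[self.focus].highlighted = False
            self.focus = idx
            self.entries[idx].highlighted = True

    def eye_motion(self, action: str) -> Optional[str]:
        """Translate a specific eye motion into its corresponding instruction."""
        if action == "look_down":       # switch displayed content (scroll)
            self.top = min(self.top + 1, max(0, len(self.entries) - self.visible))
            return "scrolled"
        if action == "look_up":
            self.top = max(self.top - 1, 0)
            return "scrolled"
        if action == "glance_left" and self.focus is not None:   # delete
            removed = self.entries.pop(self.focus)
            self.focus = None
            return f"deleted {removed.label}"
        if action == "glance_right" and self.focus is not None:  # forward
            return f"forwarded {self.entries[self.focus].label}"
        return None
```

For instance, gazing at the second visible row focuses "message 2", and a subsequent left glance deletes it — mirroring the message-list scenario in the embodiments.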
Brief description of the drawings
Fig. 1 is a schematic diagram of an optional hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of the wireless communication system of the mobile terminal shown in Fig. 1;
Fig. 3 is a schematic diagram of another optional hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 4 is a schematic diagram of yet another optional hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 5 is a first schematic diagram of an operation processing scenario in an embodiment of the present invention;
Fig. 6 is a second schematic diagram of an operation processing scenario in an embodiment of the present invention;
Fig. 7 is a third schematic diagram of an operation processing scenario in an embodiment of the present invention;
Fig. 8 is a fourth schematic diagram of an operation processing scenario in an embodiment of the present invention;
Fig. 9 is a fifth schematic diagram of an operation processing scenario in an embodiment of the present invention;
Fig. 10 is a first flowchart of operation processing in an embodiment of the present invention;
Fig. 11 is a second flowchart of operation processing in an embodiment of the present invention.
Detailed description
The present invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
The mobile terminal implementing the embodiments of the present invention is now described with reference to the drawings. In the following description, suffixes such as "module", "part", or "unit" used to denote elements merely facilitate the explanation of the present invention and carry no specific meaning in themselves; "module" and "part" may therefore be used interchangeably.
A mobile terminal may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smartphones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following, the terminal is assumed to be a mobile terminal; however, those skilled in the art will appreciate that, except for elements used specifically for mobile purposes, the structures according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention. As shown in Fig. 1, the mobile terminal 100 may comprise a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are discussed in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a position information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signals may include TV broadcast signals, radio broadcast signals, data broadcast signals, and so on, and may further include broadcast signals combined with TV or radio broadcast signals. The broadcast-related information may also be provided via a mobile communication network, in which case it can be received by the mobile communication module 112. The broadcast signals may exist in various forms; for example, they may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H). The broadcast receiving module 111 can receive signal broadcasts using various types of broadcast systems; in particular, it can receive digital broadcasts using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO® forward link media data broadcasting system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 can be constructed to be suitable for the various broadcast systems providing broadcast signals as well as the above digital broadcasting systems. The broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g. an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal. This module can be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and so on.
The position information module 115 is a module for checking or obtaining the position information of the mobile terminal; a typical example of the position information module is the GPS (global positioning system) module. According to the current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current position information in terms of longitude, latitude, and altitude. At present, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information with one further satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current position information in real time.
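The multi-satellite position calculation described above can be illustrated with a simplified trilateration sketch: squaring the range equations |x − sᵢ|² = dᵢ² and subtracting the first satellite's equation yields a linear system in the receiver position, which four satellites make solvable. This is an illustrative simplification — a real GPS solver also estimates the receiver clock bias as a fourth unknown, which is omitted here.

```python
from typing import List, Sequence

def trilaterate(sats: Sequence[Sequence[float]], dists: Sequence[float]) -> List[float]:
    """Recover a 3D position from ranges to 4 satellites by linearizing
    |x - s_i|^2 = d_i^2 against the first satellite and solving the
    resulting 3x3 linear system (clock bias ignored for simplicity)."""
    s0, d0 = sats[0], dists[0]
    A, b = [], []
    for s, d in zip(sats[1:4], dists[1:4]):
        # 2*(s0 - s_i) . x = d_i^2 - d0^2 - |s_i|^2 + |s0|^2
        A.append([2 * (s0[k] - s[k]) for k in range(3)])
        b.append(d ** 2 - d0 ** 2 - sum(c * c for c in s) + sum(c * c for c in s0))
    # Gaussian elimination with partial pivoting on the 3x3 system
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):   # back-substitution
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, 3))) / A[r][r]
    return x
```

Given exact ranges from four satellites at known positions, the function recovers the receiver position; with noisy ranges, the extra satellite mentioned in the text is what allows error correction.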
The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes the image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 (e.g. a front camera and a rear camera) may be provided according to the structure of the mobile terminal. The microphone 122 can receive sound (audio) in operating modes such as a phone call mode, a recording mode, and a voice recognition mode, and process such sound into audio data. In the phone call mode, the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise-cancelling (or noise-suppressing) algorithms to cancel (or suppress) the noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (e.g. a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by being touched), a jog wheel, a jog switch, and so on. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (e.g. the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of the user's contact with the mobile terminal 100 (i.e. touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141, which will be described below in connection with the touch screen, and may also include an infrared sensor.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for authenticating the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, the device with the identification module (hereinafter referred to as the "identification device") may take the form of a smart card, so the identification device can be connected with the mobile terminal 100 via a port or other connecting device. The interface unit 170 can be used to receive input (e.g. data information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal. The various command signals or power input from the cradle can serve as signals for recognizing whether the mobile terminal is correctly mounted on the cradle. The output unit 150 is constructed to provide output signals (e.g. audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible, and/or tactile manner. The output unit 150 may include the display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (e.g. text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in the video call mode or image capture mode, the display unit 151 can display captured and/or received images, or a UI or GUI showing the video or images and related functions.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film-transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and so on. Some of these displays can be constructed to be transparent to allow the user to view through them from the outside; these can be called transparent displays, a typical example being a TOLED (transparent organic light-emitting diode) display. According to the particular intended implementation, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect the touch input pressure as well as the touch input position and touch input area.
The audio output module 152 can, when the mobile terminal is in modes such as the call signal receiving mode, call mode, recording mode, voice recognition mode, and broadcast receiving mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into audio signals and output them as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (e.g. a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the user of the occurrence of an event of the mobile terminal 100. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration: when a call, message, or some other incoming communication is received, the alarm unit 153 can provide tactile output (i.e. vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations performed by the controller 180, or temporarily store data that has been output or is to be output (e.g. a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data on the various forms of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, a card-type memory (e.g. SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disc, and so on. Moreover, the mobile terminal 100 can cooperate over a network connection with a network storage device that performs the storage function of the memory 160.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing or playing back multimedia data; the multimedia module 181 may be configured within the controller 180 or configured separately from the controller 180. The controller 180 can perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external or internal power under the control of the controller 180 and provides the appropriate power required for operating the elements and components.
The various embodiments described herein can be implemented using, for example, computer software, hardware, or any combination thereof in a computer-readable medium. For hardware implementation, the embodiments described herein can be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments can be implemented in the controller 180. For software implementation, embodiments such as processes or functions can be implemented with separate software modules that allow at least one function or operation to be performed. The software code can be implemented by a software application (or program) written in any suitable programming language, and the software code can be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. Below, for the sake of brevity, a slide-type mobile terminal is taken as an example among the various types of mobile terminals such as folding, bar, swing, and slide types. Nonetheless, the present invention can be applied to a mobile terminal of any type and is not limited to the slide type.
The mobile terminal 100 shown in Fig. 1 can be constructed to operate with wired and wireless communication systems, as well as satellite-based communication systems, that transmit data via frames or packets.
The communication systems in which the mobile terminal according to the present invention can operate are now described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), universal mobile telecommunications system (UMTS) (in particular, long-term evolution (LTE)), global system for mobile communications (GSM), and so on. As a non-limiting example, the following description concerns a CDMA communication system, but such teaching applies equally to systems of other types.
With reference to Fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is constructed to form an interface with the public switched telephone network (PSTN) 290. The MSC 280 is also constructed to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links can be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system shown in Fig. 2 can include a plurality of BSCs 275.
Each BS 270 can serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each sector can be covered by two or more antennas for diversity reception. Each BS 270 can be constructed to support multiple frequency assignments, each frequency assignment having a particular spectrum (e.g. 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment can be called a CDMA channel. The BS 270 can also be called a base transceiver subsystem (BTS) or another equivalent term. In such a case, the term "base station" can be used to broadly denote a single BSC 275 and at least one BS 270. A base station can also be called a "cell site"; alternatively, the individual sectors of a particular BS 270 can be called cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 transmits broadcast signals to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signals transmitted by the BT 295. In Fig. 2, several global positioning system (GPS) satellites 300 are shown; the satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically constructed to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking technology, other technologies capable of tracking the position of the mobile terminal can be used. In addition, at least one GPS satellite 300 can selectively or additionally handle satellite DMB transmission.
As a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communication. Each reverse link signal received by a particular BS 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including the coordination of soft handoff processes between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to transmit forward link signals to the mobile terminals 100.
Embodiment one
Referring to Fig. 3, the mobile terminal 100 described in this embodiment comprises:
an interface unit 170, a power supply unit 190, and a display unit 151; for the functional description of the interface unit 170, the power supply unit 190, and the display unit 151, see the foregoing explanations, which are not repeated here;
the mobile terminal 100 further comprising a detection unit 400 and a control unit 500.
The display unit 151 is configured to load a graphical interface in the display area of the display unit 151. The loaded graphical interface can be any functional interface of the operating system run by the mobile terminal, such as the graphical interface of the desktop launcher (including the icons of the applications installed in the mobile terminal) or a settings interface (including function options of the mobile terminal such as network settings and contact grouping); it can also be the interface of an application installed in the mobile terminal (e.g. an address book application or a social application).
The detection unit 400 is configured to collect the data characterizing the user's eye motion, as described below with examples:
Example 1) Referring to Fig. 3, the detection unit 400 can be implemented by the camera 121 (e.g. the front camera) in the mobile terminal together with the infrared sensor 142 and an infrared light source (which can be integrated into the camera 121 or arranged in the terminal independently of the camera 121). The camera 121 collects the infrared light reflected from different regions of the user's eye: when the infrared light source illuminates the user's eye, a reflection image is formed on the corneal surface of the eye, and the relative positional relationship between the reflection image formed by the infrared light source and the iris and pupil is obtained (the data characterizing the eye motion).
Example 2) Referring to Fig. 4, the detection unit 400 can be implemented by an infrared light source 401 in the mobile terminal and the camera 121 (which can detect reflected infrared light). The infrared light source 401 illuminates the boundary between the iris and the sclera of the eye; when the eyeball moves to one side, the camera detects the light reflected from the two sides of the eyes (the data characterizing the eye motion).
Example 3) The detection unit 400 can be implemented by the camera 121 in the mobile terminal (which can detect images including the eye contour or the pupil position).
Control unit 500, for the data of the eye motion of analysis and characterization user, determines the characteristic information of the eye motion of user; Control unit 500 can be realized by aforesaid controller 180;
Determine the characteristic information of eye motion, realized by following several mode:
1) in aforementioned exemplary 1) in, position due to infrared light supply is changeless, and the eye of user (eyeball) is approximate spheroid, when Rotation of eyeball, will be there is corresponding change in the position of iris and pupil, thus the relative position relation (data of eye motion) of the reflection image that formed at iris by infrared light supply of controller and pupil judges the direction of user's sight line;
2) in aforementioned exemplary 2) in, when eyes are to when moving, the reverberation that control unit 500 analyzes from one side of eyes is fewer than the reverberation of the another side from eyes, thus can judge the direction of user's sight line.
3) in aforementioned exemplary 2) in, based on following principle: large when the profile ratio of user's eye when looking squarely or look up is overlooked; Thus control unit 500 goes out the size of eye socket according to the eye graphics calculations that camera 121 gathers, then utilize user to look up peaceful apparent time, superior orbit, to the difference of the ratio of the distance of eye center, distinguishes and looks up and look squarely; Calculate all pupils to the quadratic sum of the distance of eye socket left hand edge and all pupils to the ratio of the quadratic sum of the distance of eye socket right hand edge, the difference based on ratio distinguishes the position of looking up the left, center, right of peaceful apparent time sight line; Image pre-filtering is implemented to the image that camera 121 gathers, after the qualified figure of filtration is carried out equilibrium, binaryzation, extracts the image of canthus and pupil, thus the direction of visual lines that judgement is directed one's eyes downward when seeing.
4) In the aforementioned example 3), the following principle is used: when the user's eyes gaze in different directions, the position of the pupil differs. Based on the images collected by the camera, the control unit 500 determines the position of the pupil when the user looks straight ahead; from the difference between the eye images in subsequently captured frames and the frontal-gaze image, it obtains the angle through which the eyeball has rotated and thereby calculates the user's sight-line direction.
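As a minimal sketch of way 4), the rotation angle can be derived from the pupil's pixel displacement relative to a calibrated frontal-gaze position. The function names, the pixels-per-degree calibration constant, and the dead-zone value below are hypothetical illustrations, not values from the patent:

```python
def estimate_gaze(frontal_pupil, current_pupil, pixels_per_degree=8.0):
    """Estimate gaze angles (degrees) from pupil displacement in the image.

    frontal_pupil / current_pupil: (x, y) pupil centers in pixels.
    pixels_per_degree: hypothetical calibration constant mapping pixel
    displacement to eyeball rotation; a real system would calibrate this.
    """
    dx = current_pupil[0] - frontal_pupil[0]
    dy = current_pupil[1] - frontal_pupil[1]
    yaw = dx / pixels_per_degree     # left/right rotation
    pitch = -dy / pixels_per_degree  # up/down rotation (image y grows downward)
    return yaw, pitch

def gaze_label(yaw, pitch, dead_zone=2.0):
    """Map angles to a coarse direction, ignoring jitter inside dead_zone."""
    if abs(yaw) < dead_zone and abs(pitch) < dead_zone:
        return "center"
    if abs(yaw) >= abs(pitch):
        return "right" if yaw > 0 else "left"
    return "up" if pitch > 0 else "down"
```

A real implementation would obtain the pupil centers from the camera images (e.g. by the binarization and extraction steps of way 3) before applying this mapping.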
The control unit 500 also determines, in the graphical interface presented by the display unit 151, the focal object corresponding to the user's sight line, and controls the display unit 151 to update the display state of the focal object. The focal object is the object in the graphical interface at which the user's sight line lands; it may be, for example, an application icon in an application list, a list item corresponding to a contact in the contact list of a social application, or a message in the message list of a social application. Once the focal object of the user's sight line is determined, its display state is updated, i.e. changed from the display state it had before the user focused on it (for example by adding a marking frame or a special effect to the focal object, to distinguish it from objects the user is not gazing at). The corresponding instruction is then determined according to the eye action subsequently performed by the user, and the display unit 151 is controlled to display the result of the focal object responding to the user's eye motion.
This is described below with reference to examples.
Referring to Fig. 5 and Fig. 6, the graphical interface of a social application contains a message list. The user gazes at a certain message (assumed to be message 2). Based on the data characterizing the user's eye motion collected by the detection unit 400, the control unit 500 judges that the focal object of the user's sight line is message 2 and highlights message 2 with a marking frame. Subsequently, the user issues instructions operating on message 2 by performing specific eye motions:
When the user gazes at message 2 and moves the sight line to the left, the control unit 500 is triggered to recognize an instruction to reply to message 2, and controls the display unit 151 to present a reply interface for message 2;
When the user gazes at message 2 and moves the sight line to the right, the control unit 500 is triggered to recognize an instruction to delete message 2, and controls the display unit 151 to present the message list with message 2 deleted.
Referring to Fig. 7 and Fig. 8, the graphical interface of a social application contains a contact list. The user gazes at a certain contact (assumed to be contact 2). Based on the data characterizing the user's eye motion collected by the detection unit 400, the control unit 500 judges that the focal object of the user's sight line is contact 2 and highlights contact 2 with a marking frame. Subsequently, the user issues instructions operating on contact 2 by performing specific eye motions:
When the user gazes at contact 2 and moves the sight line to the left, the control unit 500 is triggered to recognize an instruction to dial contact 2, and controls the display unit 151 to present the graphical interface for dialing contact 2;
When the user gazes at contact 2 and moves the sight line to the right, the control unit 500 is triggered to recognize an instruction to send a message to contact 2, and controls the display unit 151 to present a message input interface for contact 2; a speech recognition function may also be enabled so that the user can input the message to contact 2 by voice.
It should be noted that the above associations between eye motions and instructions are only examples. In actual implementation, associations between different eye motions and operation instructions can be preset in different applications; a setting function can be provided so that the user can associate instructions with different eye motions in each application according to the user's own habits, or configure the instructions corresponding to different eye motions globally in the terminal.
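The configurable association between eye motions and operation instructions described above can be sketched as a small dispatch table with per-user overrides. The object types, gesture names, and command strings below are hypothetical illustrations, not identifiers from the patent:

```python
# Hypothetical default association of (object type, eye gesture) with commands,
# mirroring the message-list and contact-list examples.
DEFAULT_COMMANDS = {
    ("message", "left"):  "reply",
    ("message", "right"): "delete",
    ("contact", "left"):  "dial",
    ("contact", "right"): "send_message",
}

def make_dispatcher(user_overrides=None):
    """Build a gesture dispatcher; `user_overrides` models the setting
    function that lets a user rebind gestures per application or globally."""
    table = dict(DEFAULT_COMMANDS)
    table.update(user_overrides or {})
    def dispatch(object_type, gesture):
        """Return the command for a gesture on the focal object, or None."""
        return table.get((object_type, gesture))
    return dispatch
```

The override dictionary corresponds to the user-level or terminal-level settings mentioned above; an unbound gesture simply yields no command.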
This embodiment achieves the following beneficial effects: without manually operating the mobile terminal, the user can select the object to operate on (which may be an entry of any type, such as a message or a contact) by gazing at its position on the display unit 151; the content shown by the display unit 151 can be switched by specific eye actions such as moving the eyes up and down (e.g. switching by scrolling or by page turning); and the focal object can be operated on by specific eye actions such as moving the sight line to the left or right (e.g. deleting or replying to the corresponding message, or sending a short message to or dialing the corresponding contact), which improves operating efficiency.
Embodiment two
The mobile terminal recorded in this embodiment may adopt the hardware structure shown in either of the hardware configuration diagrams of Fig. 3 and Fig. 4 of the foregoing embodiment; for the hardware configuration, refer to the description of the foregoing embodiment, which is not repeated here. This embodiment describes the ways of determining the focal object in the foregoing embodiment one.
Mode 1) Based on the sight-line direction of the user characterized by the characteristic information of the user's eye motion, the control unit 500 determines the position (for example in the form of coordinates) corresponding to the user's sight-line direction on the display unit 151; it determines a focal object screening area based on this position and a preset distance threshold (the focal object screening area may be set as a circle centered at the position with the distance threshold as its radius); and, according to a focal object recognition strategy, it identifies as the focal object the object, among the objects contained in the graphical interface loaded in the display area, that lies within the focal object screening area.
For example, let the position corresponding to the user's sight line on the display unit 151 be (x, y), and let the position of a certain object in the graphical interface loaded by the display unit 151 be (x0, y0). With a preset distance threshold TH, whether the user's sight line focuses on the object is judged by the following condition:

sqrt((x - x0)^2 + (y - y0)^2) <= TH (1)

If this condition is met, the user is considered to be gazing at the object at (x0, y0).
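The screening test of condition (1) can be sketched as follows; the record format `(name, (x0, y0))` and the function name are hypothetical, and the Euclidean-distance condition is as reconstructed above:

```python
import math

def find_focal_object(gaze, objects, threshold):
    """Return the first object whose position lies within `threshold`
    of the gaze point, or None if no object qualifies.

    gaze: (x, y) position of the user's sight line on the display.
    objects: iterable of (name, (x0, y0)) records (hypothetical format).
    threshold: the preset distance threshold TH of condition (1).
    """
    gx, gy = gaze
    for name, (ox, oy) in objects:
        # Condition (1): Euclidean distance from gaze point to object.
        if math.hypot(gx - ox, gy - oy) <= threshold:
            return name
    return None
```

In a real interface the objects would be list entries with extents rather than points, but the point-plus-radius form matches the circular screening area described above.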
Mode 2) The control unit 500 detects the duration for which the user's sight line stays within the focal object area and judges whether this duration is greater than a preset time threshold; if so, the object within the focal object area, among the objects contained in the graphical interface loaded in the display area, is identified as the focal object; otherwise, it continues to detect the duration for which the user's sight line stays within the focal object area, until the duration is greater than the preset time threshold or the user's sight line is detected to be outside the display area of the display unit 151.
For example, the dwell time of the sight line within a screen region is detected; if it exceeds 2 s (the preset time threshold), the object at this position is considered to be the object the user wishes to gaze at, i.e. the focal object.
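The dwell-time confirmation of mode 2) can be sketched as a small accumulator fed with timestamped gaze samples; the 2 s threshold follows the example above, while the class name and sample format are hypothetical:

```python
class DwellDetector:
    """Confirm a focal object once the gaze stays on it long enough."""

    def __init__(self, threshold_s=2.0):
        self.threshold_s = threshold_s  # preset time threshold (2 s above)
        self.current = None             # object currently under the gaze
        self.start_t = None             # time the gaze entered it

    def feed(self, obj, t):
        """Feed one gaze sample: `obj` is the object under the gaze
        (or None when the gaze is elsewhere) at time `t` in seconds.
        Returns the confirmed focal object, or None while still waiting."""
        if obj != self.current:
            # Gaze moved to a different object (or off-screen): restart timing.
            self.current, self.start_t = obj, t
            return None
        if obj is not None and t - self.start_t >= self.threshold_s:
            return obj
        return None
```

Restarting the timer whenever the gaze moves is what filters out brief rests on objects the user does not intend to focus on, as noted in the beneficial effects below.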
In practical application, the above two modes can also be combined to identify the focal object: when the position of the user's sight line on the display unit 151 corresponds to the position of an object (i.e. the condition of formula (1) above is met), it is further judged whether the time for which the user's sight line dwells on this object exceeds the preset time threshold; if so, this object is determined to be the object the user intends to gaze at.
This embodiment achieves the following beneficial effects: the focal object is judged by at least one of the focal object screening area and whether the dwell duration exceeds the focus time threshold. The screening-area mode is tolerant of movement, so the focal object can still be judged accurately even if the user's sight line deviates slightly; the dwell-duration mode avoids misjudging the focal object when the user's sight line briefly rests on an object the user does not intend to focus on, improving the precision of focal object detection.
Embodiment three
The mobile terminal recorded in this embodiment may adopt the hardware structure shown in either of the hardware configuration diagrams of Fig. 3 and Fig. 4 of the foregoing embodiment; for the hardware configuration, refer to the description of the foregoing embodiment, which is not repeated here. After gazing at an object, the user can switch the focal object, or operate on the focal object through specific eye motions; this embodiment describes the processing of the control unit 500 for these situations.
The control unit 500 switches the focal object based on the characteristic information of the user's eye motion, or displays the result of the focal object responding to the change of the user's eye motion.
For example, when the sight line characterized by the characteristic information of the user's eye motion (i.e. its landing point on the graphical interface) departs from the focal object by a preset distance along a first reference direction (the direction in which the objects are arranged, e.g. the y-axis direction of a two-dimensional coordinate system), the control unit 500 controls the display unit 151 to switch the displayed focal object based on the parameters (including speed and distance) of the change of the user's sight line;
Referring to Fig. 9, when the user's sight line moves near an edge of the display unit 151 (near the top edge or the bottom edge; for the top edge, when the distance of the sight line from the edge falls within a preset fraction of the entry height h, and likewise for the bottom edge), the control unit 500 automatically scrolls up (or down) to display new objects; the scrolling speed can be adjusted according to actual conditions, and a new focal object is then determined according to the characteristic information of the user's eye motion.
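The edge-triggered auto-scroll can be sketched as follows. The exact fraction of the entry height h used as the edge margin is not legible in the source, so the default of h/2 below is an assumption, as are the function and parameter names:

```python
def scroll_direction(gaze_y, screen_height, entry_height, margin_fraction=0.5):
    """Decide whether to auto-scroll based on gaze proximity to an edge.

    gaze_y: vertical gaze coordinate (0 at the top of the screen).
    entry_height: height h of one list entry.
    margin_fraction: fraction of h used as the edge margin; 0.5 is an
    assumption, not a value from the source.
    Returns "up", "down", or None.
    """
    margin = margin_fraction * entry_height
    if gaze_y <= margin:
        return "up"      # near top edge: scroll to reveal earlier items
    if gaze_y >= screen_height - margin:
        return "down"    # near bottom edge: scroll to reveal later items
    return None
```

The scrolling speed mentioned above would be a separate tunable parameter applied while this function keeps returning a non-None direction.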
When the change of the user's sight line, as characterized by the characteristic information of the user's eye motion, along a second reference direction (e.g. perpendicular to the direction in which the objects are arranged, such as the x-axis direction of a two-dimensional coordinate system) meets a preset condition (movement to the left or right), the control unit 500 controls the display unit 151 to display prompt information about the operation instruction for the focal object, for example the instruction about to be executed on the object (such as deleting or replying to a short message, or sending a short message to or dialing a contact); and when the eye motion characterized by the characteristic information of the user's eye motion (such as blinking) meets a preset number of times, it determines that the user confirms the instruction about to be executed on the focal object, and controls the display unit 151 to display the result of the focal object responding to the operation instruction.
This embodiment achieves the following beneficial effects: the focal object can be switched according to the user's eye motion, which is convenient when the graphical interface loaded by the display unit 151 does not contain the object the user intends to operate on; furthermore, when an instruction of the user's eye motion for the focal object is recognized, prompt information about the instruction is presented, which avoids misoperation of the focal object and improves the accuracy of operation.
Embodiment four
This embodiment records an operation processing apparatus, which can be arranged in a mobile terminal to support the user in operating the mobile terminal efficiently and conveniently through eye motion, and which comprises:
a display unit 151 for loading a graphical interface in the display area of the display unit 151;
a detection unit 400 for collecting data characterizing the user's eye motion;
a control unit 500 for analyzing the data characterizing the user's eye motion and determining the characteristic information of the user's eye motion.
The control unit 500 also determines, in the graphical interface presented by the display unit 151, the focal object corresponding to the user's sight line, and controls the display unit 151 to update the display state of the focal object; and, based on the instruction corresponding to the user's eye motion, controls the display unit 151 to display the result of the focal object responding to the user's eye motion.
The control unit 500 also determines, based on the sight-line direction of the user characterized by the characteristic information of the user's eye motion, the position corresponding to the user's sight-line direction on the display unit 151; determines a focal object screening area based on the position and a preset distance threshold; and, according to a focal object recognition strategy, identifies as the focal object the object within the focal object screening area among the objects contained in the graphical interface loaded in the display area.
The control unit 500 also detects the duration for which the user's sight line stays within the focal object area, and judges whether this duration is greater than a preset time threshold; if so, the object within the focal object area, among the objects contained in the graphical interface loaded in the display area, is identified as the focal object; otherwise, it continues to detect the duration for which the user's sight line stays within the focal object area, until the duration is greater than the preset time threshold or the user's sight line is detected to be outside the display area of the display unit 151.
The control unit 500 also switches the focal object based on the characteristic information of the user's eye motion, or displays the result of the focal object responding to the change of the user's eye motion.
The control unit 500 also, when the sight line characterized by the characteristic information of the user's eye motion (its landing point on the graphical interface) departs from the focal object by a preset distance along the first reference direction, controls the display unit 151 to switch the displayed focal object based on the parameters of the change of the user's sight line;
and, when the change of the user's sight line characterized by the characteristic information of the user's eye motion along the second reference direction meets a preset condition, controls the display unit 151 to display prompt information about the operation instruction for the focal object, and, when the eye motion characterized by the characteristic information of the user's eye motion meets a preset number of times, controls the display unit 151 to display the result of the focal object responding to the operation instruction.
This embodiment achieves the following beneficial effects: without manually operating the mobile terminal, the user can select the object to operate on (which may be an entry of any type, such as a message or a contact) by gazing at its position on the display unit 151; the content shown by the display unit 151 can be switched by specific eye actions such as moving the eyes up and down (e.g. switching by scrolling or by page turning); and the focal object can be operated on by specific eye actions such as moving the sight line to the left or right (e.g. deleting or replying to the corresponding message, or sending a short message to or dialing the corresponding contact), which improves operating efficiency.
Embodiment five
This embodiment records an operation processing method; referring to Figure 10, the method comprises the following steps:
Step 101: load a graphical interface in the display area of the display unit.
Step 102: collect data characterizing the user's eye motion, analyze the data, and determine the characteristic information of the user's eye motion.
Step 103: determine, in the graphical interface presented by the display unit, the focal object corresponding to the user's sight line.
Step 1031: based on the sight-line direction of the user characterized by the characteristic information of the user's eye motion, determine the position corresponding to the user's sight-line direction on the display unit.
Step 1032: determine a focal object screening area based on the position and a preset distance threshold.
Step 1033: according to a focal object recognition strategy, identify as the focal object the object within the focal object screening area among the objects contained in the graphical interface loaded in the display area.
Alternatively, detect the duration for which the user's sight line stays within the focal object area, and judge whether this duration is greater than a preset time threshold; if so, identify as the focal object the object within the focal object area among the objects contained in the graphical interface loaded in the display area; otherwise, continue to detect the duration for which the user's sight line stays within the focal object area, until the duration is greater than the preset time threshold or the user's sight line is detected to be outside the display area of the display unit.
Step 104: update the display state of the focal object.
Step 105: based on the instruction corresponding to the user's eye motion, display the result of the focal object responding to the user's eye motion.
The focal object is switched based on the characteristic information of the user's eye motion, or the result of the focal object responding to the change of the user's eye motion is displayed.
For example, when the sight line characterized by the characteristic information of the user's eye motion departs from the focal object by a preset distance along the first reference direction, the displayed focal object is switched based on the parameters of the change of the user's sight line;
when the change of the user's sight line characterized by the characteristic information of the user's eye motion along the second reference direction meets a preset condition, prompt information about the operation instruction for the focal object is displayed, and when the eye motion characterized by the characteristic information of the user's eye motion meets a preset number of times, the result of the focal object responding to the operation instruction is displayed.
The operation of the mobile terminal is further described with reference to a specific example; referring to Figure 11, it comprises the following steps:
Step 201: the mobile terminal enables the sight-line operation function.
Step 202: detect the sight-line direction of the user.
Step 203: detect whether the position of the user's sight-line direction on the display unit of the mobile terminal rests within the region of a certain object; if so, perform step 204; otherwise, return to step 202.
Step 204: detect whether the dwell time of the user's sight line within the region of this object exceeds 2 s (the preset time threshold); if so, perform step 205; otherwise, return to step 202.
Step 205: identify this object as the focal object and display the focal object with a marking frame.
Step 206: based on the user's eye motion, judge whether the focal object is the object the user intends to operate on; if so, perform step 208; otherwise, perform step 207 and return to step 206.
Step 207: start scrolling the screen and load new objects in the graphical interface of the display unit.
Step 208: if the user's sight line shifts to the left, perform the deletion operation or the dialing operation on the focal object.
When the display unit has loaded the message list of a social application, perform the deletion operation on the focal object; when the display unit has loaded the contact list of a social application, perform the dialing operation on the focal object.
Step 209: if the user's sight line shifts to the right, perform the forwarding operation on the focal object or the message sending operation.
When the display unit has loaded the message list of a social application, perform the forwarding operation on the focal object; when the display unit has loaded the contact list of a social application, perform the short message sending operation on the focal object.
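Steps 201 to 209 can be sketched as a single loop over timestamped gaze samples; the sample format, the action strings, and the 2 s threshold as in the example above are all hypothetical illustrations of the flow, not part of the patent:

```python
def process_samples(samples, dwell_s=2.0):
    """Run the flow of steps 201-209 over timestamped gaze samples.

    samples: list of (t, obj, gesture) where `obj` is the object under
    the gaze (or None) at time t seconds, and `gesture` is "left",
    "right", or None.
    Returns a list of (object, action) pairs that were executed.
    """
    actions = []
    focal, current, start_t = None, None, None
    for t, obj, gesture in samples:
        if obj != current:                   # steps 202-203: track the gaze
            current, start_t = obj, t
        if current is not None and t - start_t >= dwell_s:
            focal = current                  # steps 204-205: confirm focus
        if focal is not None and gesture == "left":
            actions.append((focal, "delete_or_dial"))   # step 208
        elif focal is not None and gesture == "right":
            actions.append((focal, "forward_or_send"))  # step 209
    return actions
```

A gesture arriving before the dwell threshold is reached is ignored, matching the requirement that an object must first be confirmed as the focal object.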
In summary, without manually operating the mobile terminal, the user can select the object to operate on (which may be an entry of any type, such as a message or a contact) by gazing at its position on the display unit 151; the content shown by the display unit 151 can be switched by specific eye actions such as moving the eyes up and down (e.g. switching by scrolling or by page turning); and the focal object can be operated on by specific eye actions such as moving the sight line to the left or right (e.g. deleting or replying to the corresponding message, or sending a short message to or dialing the corresponding contact), which improves operating efficiency.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments can be implemented by hardware related to program instructions; the program can be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments; the storage medium includes various media that can store program code, such as a removable storage device, random access memory (RAM), read-only memory (ROM), a magnetic disk, or an optical disc.
Alternatively, if the above integrated unit of the present invention is realized in the form of a software function module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the embodiments of the present invention, in essence or in the part contributing to the related art, can be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the methods described in the embodiments of the present invention. The storage medium includes various media that can store program code, such as a removable storage device, RAM, ROM, a magnetic disk, or an optical disc.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto; any changes or replacements that can easily occur to those skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A mobile terminal, characterized in that the mobile terminal comprises:
a display unit for loading a graphical interface in the display area of the display unit;
a detection unit for collecting data characterizing the user's eye motion;
a control unit for analyzing the data characterizing the user's eye motion and determining the characteristic information of the user's eye motion;
wherein the control unit also determines, in the graphical interface presented by the display unit, the focal object corresponding to the user's sight line, and controls the display unit to update the display state of the focal object; and, based on the instruction corresponding to the user's eye motion, controls the display unit to display the result of the focal object responding to the user's eye motion.
2. The mobile terminal as claimed in claim 1, characterized in that
the control unit also determines, based on the sight-line direction of the user characterized by the characteristic information of the user's eye motion, the position corresponding to the user's sight-line direction on the display unit; determines a focal object screening area based on the position and a preset distance threshold; and, according to a focal object recognition strategy, identifies as the focal object the object within the focal object screening area among the objects contained in the graphical interface loaded in the display area.
3. The mobile terminal as claimed in claim 2, characterized in that
the control unit also detects the duration for which the user's sight line stays within the focal object area, and judges whether the duration is greater than a preset time threshold; if so, the object within the focal object area, among the objects contained in the graphical interface loaded in the display area, is identified as the focal object; otherwise, the control unit continues to detect the duration for which the user's sight line stays within the focal object area, until the duration is greater than the preset time threshold or the user's sight line is detected to be outside the display area of the display unit.
4. The mobile terminal as claimed in claim 1, characterized in that
the control unit also switches the focal object based on the characteristic information of the user's eye motion, or displays the result of the focal object responding to the change of the user's eye motion.
5. The mobile terminal as claimed in claim 4, characterized in that
the control unit also, when the sight line characterized by the characteristic information of the user's eye motion departs from the focal object by a preset distance along a first reference direction, controls the display unit to switch the displayed focal object based on the parameters of the change of the user's sight line;
and the control unit also, when the change of the user's sight line characterized by the characteristic information of the user's eye motion along a second reference direction meets a preset condition, controls the display unit to display prompt information about the operation instruction for the focal object, and, when the eye motion characterized by the characteristic information of the user's eye motion meets a preset number of times, controls the display unit to display the result of the focal object responding to the operation instruction.
6. An operation processing method, characterized in that the method comprises:
loading a graphical interface in the display area of a display unit;
collecting data characterizing the user's eye motion, analyzing the data, and determining the characteristic information of the user's eye motion;
determining, in the graphical interface presented by the display unit, the focal object corresponding to the user's sight line, and updating the display state of the focal object; and, based on the instruction corresponding to the user's eye motion, displaying the result of the focal object responding to the user's eye motion.
7. The operation processing method as claimed in claim 6, characterized in that determining, in the graphical interface presented by the display unit, the focal object corresponding to the user's sight line comprises:
determining, based on the sight-line direction of the user characterized by the characteristic information of the user's eye motion, the position corresponding to the user's sight-line direction on the display unit;
determining a focal object screening area based on the position and a preset distance threshold;
identifying, according to a focal object recognition strategy, the object within the focal object screening area among the objects contained in the graphical interface loaded in the display area as the focal object.
8. The operation processing method as claimed in claim 7, characterized in that identifying, according to the focal object recognition strategy, the object within the focal object screening area among the objects contained in the graphical interface loaded in the display area as the focal object comprises:
detecting the duration for which the user's sight line stays within the focal object area, and judging whether the duration is greater than a preset time threshold; if so, identifying the object within the focal object area among the objects contained in the graphical interface loaded in the display area as the focal object; otherwise, continuing to detect the duration for which the user's sight line stays within the focal object area, until the duration is greater than the preset time threshold or the user's sight line is detected to be outside the display area of the display unit.
9. The operation processing method as claimed in claim 6, characterized in that, based on the instruction corresponding to the user's eye motion, displaying the result of the focal object responding to the user's eye motion comprises:
switching the focal object based on the characteristic information of the user's eye motion, or displaying the result of the focal object responding to the change of the user's eye motion.
10. The operation processing method as claimed in claim 9, characterized in that switching the focal object based on the characteristic information of the user's eye motion, or displaying the result of the focal object responding to the change of the user's eye motion, comprises:
when the sight line characterized by the characteristic information of the user's eye motion departs from the focal object by a preset distance along a first reference direction, switching the displayed focal object based on the parameters of the change of the user's sight line;
when the change of the user's sight line characterized by the characteristic information of the user's eye motion along a second reference direction meets a preset condition, displaying prompt information about the operation instruction for the focal object, and, when the eye motion characterized by the characteristic information of the user's eye motion meets a preset number of times, displaying the result of the focal object responding to the operation instruction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510833716.4A CN105338192A (en) | 2015-11-25 | 2015-11-25 | Mobile terminal and operation processing method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105338192A true CN105338192A (en) | 2016-02-17 |
Family
ID=55288455
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510833716.4A Pending CN105338192A (en) | 2015-11-25 | 2015-11-25 | Mobile terminal and operation processing method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105338192A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110175932A1 (en) * | 2010-01-21 | 2011-07-21 | Tobii Technology Ab | Eye tracker based contextual action |
CN103605422A (en) * | 2013-10-23 | 2014-02-26 | 苏州安可信通信技术有限公司 | Touch screen operation method and device |
CN103699210A (en) * | 2012-09-27 | 2014-04-02 | 北京三星通信技术研究有限公司 | Mobile terminal and control method thereof |
CN103885589A (en) * | 2014-03-06 | 2014-06-25 | 华为技术有限公司 | Eye movement tracking method and device |
CN104348938A (en) * | 2013-07-23 | 2015-02-11 | 深圳市赛格导航科技股份有限公司 | Vehicle-mounted terminal eyeball identification automatic dialing system and method |
US20150095844A1 (en) * | 2013-09-30 | 2015-04-02 | Lg Electronics Inc. | Method of recognizing multi-gaze and apparatus therefor |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109074382A (en) * | 2016-04-12 | 2018-12-21 | 皇家飞利浦有限公司 | Database query creation |
CN105976843A (en) * | 2016-05-18 | 2016-09-28 | 乐视控股(北京)有限公司 | In-vehicle music control method, device, and automobile |
CN107688385A (en) * | 2016-08-03 | 2018-02-13 | 北京搜狗科技发展有限公司 | Control method and device |
CN107566643A (en) * | 2017-08-31 | 2018-01-09 | 广东欧珀移动通信有限公司 | Information processing method, device, storage medium and electronic equipment |
CN107608514A (en) * | 2017-09-20 | 2018-01-19 | 维沃移动通信有限公司 | Information processing method and mobile terminal |
CN107704919A (en) * | 2017-09-30 | 2018-02-16 | 广东欧珀移动通信有限公司 | Control method, device and the storage medium and mobile terminal of mobile terminal |
CN107704919B (en) * | 2017-09-30 | 2021-12-07 | Oppo广东移动通信有限公司 | Control method and device of mobile terminal, storage medium and mobile terminal |
CN107908347A (en) * | 2017-11-27 | 2018-04-13 | 上海爱优威软件开发有限公司 | Photo-based message sending method and terminal device |
CN108108017A (en) * | 2017-12-11 | 2018-06-01 | 维沃移动通信有限公司 | Search information processing method and mobile terminal |
CN108803867A (en) * | 2018-04-12 | 2018-11-13 | 珠海市魅族科技有限公司 | Information processing method and device |
CN113994659A (en) * | 2019-06-17 | 2022-01-28 | 佳能株式会社 | Electronic device and control method thereof |
CN113994659B (en) * | 2019-06-17 | 2023-09-26 | 佳能株式会社 | Electronic device, control method therefor, program, and storage medium |
US11910081B2 (en) | 2019-06-17 | 2024-02-20 | Canon Kabushiki Kaisha | Electronic apparatus, method for controlling the same, and storage medium |
CN113836973A (en) * | 2020-06-23 | 2021-12-24 | 中兴通讯股份有限公司 | Terminal control method, device, terminal and storage medium |
CN113342223A (en) * | 2021-05-28 | 2021-09-03 | 维沃移动通信(杭州)有限公司 | Unread message identifier display method and device and electronic equipment |
CN113282364A (en) * | 2021-06-07 | 2021-08-20 | 维沃移动通信(杭州)有限公司 | Display method, display device and electronic equipment |
CN113282364B (en) * | 2021-06-07 | 2023-10-10 | 维沃移动通信(杭州)有限公司 | Display method, display device and electronic equipment |
WO2023222130A1 (en) * | 2022-05-20 | 2023-11-23 | 荣耀终端有限公司 | Display method and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105338192A (en) | Mobile terminal and operation processing method thereof | |
CN104750420A (en) | Screen capturing method and device | |
CN105302904A (en) | Information pushing method and apparatus | |
CN104808944A (en) | Touch operation induction method and device | |
CN105100401A (en) | Method and device for implementing return to original interface after interface switching | |
CN104811532A (en) | Terminal screen display parameter adjustment method and device | |
CN105159594A (en) | Touch photographing device and method based on pressure sensor, and mobile terminal | |
CN104731480A (en) | Image display method and device based on touch screen | |
CN105389110A (en) | Fast touch apparatus and method | |
CN105162960A (en) | Photographing device and method of frameless mobile terminal | |
CN104731340A (en) | Cursor position determining method and terminal device | |
CN105045502A (en) | Image processing method and image processing device | |
CN104767889A (en) | Screen state control method and device | |
CN104915099A (en) | Icon sorting method and terminal equipment | |
CN105100603A (en) | Photographing triggering device embedded in intelligent terminal and method of triggering photographing device | |
CN104915139A (en) | Automatic adjusting method and device for brightness of display interface | |
CN105681582A (en) | Control color adjusting method and terminal | |
CN105243126A (en) | Cross-screen screen capture method and apparatus | |
CN104935044A (en) | Charging method and charging device | |
CN104916270A (en) | Method for adjusting screen brightness and device thereof | |
CN105263226A (en) | Method for controlling lighting device and mobile terminal | |
CN104902098A (en) | Method and system for switching sidebar of mobile terminal | |
CN105786384A (en) | Device and method for adjusting focal point | |
CN105511856A (en) | Device and method for checking messages | |
CN104853088A (en) | Method for rapidly focusing a photographing mobile terminal and mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20160217 |