CN114567696A - Application control method, application control device and storage medium - Google Patents


Info

Publication number
CN114567696A
Authority
CN
China
Prior art keywords
screen
detected
touch event
terminal
screen touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210085806.XA
Other languages
Chinese (zh)
Other versions
CN114567696B (en)
Inventor
范永康
刘苏月
禹星
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202210085806.XA
Publication of CN114567696A
Application granted
Publication of CN114567696B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present disclosure relates to an application control method, an application control apparatus, and a storage medium. The application control method is applied to a terminal and includes the following steps: detecting a screen touch event while the terminal is in a screen-off state; and, in response to determining that a screen touch event is detected, starting a preset application based on the screen touch event. With the method and apparatus, a preset application can be started in the screen-off state, improving the user experience.

Description

Application control method, application control device and storage medium
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an application control method, an application control apparatus, and a storage medium.
Background
With the development and popularization of intelligent terminals, users are increasingly accustomed to using the various functions provided by their terminals anytime and anywhere. Generally, a user needs to unlock the terminal, search for the application corresponding to the desired function, open the application, and then start the function. However, in real life a user sometimes needs the terminal to start certain functions quickly; for example, when an accident occurs, the user may need to quickly take a picture to record the scene. In such cases, following the procedure above to start the desired function often means missing the best opportunity to use it.
Currently, with the progress of intelligent terminal technology, a variety of shortcut gestures have been created and put into use for opening application functions, for example, launching an application by tapping the screen with a finger joint (knuckle). However, triggering application start through a shortcut gesture is at present only applicable in the screen-on state; an application cannot be started through a screen operation in the screen-off state.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an application control method, an application control apparatus, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided an application control method, applied to a terminal, including:
when the terminal is in a screen-off state, detecting a screen touch event; in response to determining that a screen touch event is detected, starting a preset application based on the screen touch event.
In one embodiment, the determining that a screen touch event is detected includes:
when the terminal is in a screen-off state, it is determined that a first screen touch event triggering the terminal to light up the screen is detected, and a second screen touch event is detected after the screen is lit up.
In one embodiment, the determining that a first screen touch event triggering the terminal to light a screen is detected when the terminal is in a screen-off state and a second screen touch event is detected after the screen is lit comprises:
if at least two finger joint tapping events are detected when the terminal is in a screen-off state, determining that a first screen touch event triggering the terminal to light up the screen is detected; and determining that a second screen touch event is detected if a screen touch event is detected within a first time threshold after the first screen touch event is detected and the detected touch event is identified as a finger joint tapping event.
In one embodiment, the detecting at least two finger joint tap events while the terminal is in the off-screen state includes:
recording the time difference and the pixel position difference of two continuous screen touch events detected when the terminal is in a screen-off state, and determining the touch force of each detected screen touch event; and if the time difference is smaller than a second time threshold, the pixel position difference is smaller than a pixel threshold, and the touch force is larger than a force threshold, determining that two continuous finger joint tapping events are detected.
In one embodiment, the application control method further includes:
lighting up the screen and controlling the terminal to enter a screen locking state after the first screen touch event is detected; if no screen touch event is detected within a first time threshold after the first screen touch event is detected, maintaining the screen locking state and continuing to detect screen touch events; if a screen touch event is detected within a third time threshold, maintaining the screen locking state and identifying whether the detected touch event is a finger joint tapping event; and if no screen touch event is detected within the third time threshold, controlling the terminal to enter a screen-off state.
In one embodiment, the determining that a screen touch event is detected includes:
recording the time difference and the pixel position difference of two consecutive screen touch events detected when the terminal is in a screen-off state; if the time difference is smaller than a second time threshold, the pixel position difference is smaller than a pixel threshold, and the two consecutive screen touch events are recognized as finger joint tapping events based on the first model, determining that a screen touch event is detected; wherein the input of the first model is a capacitance change value generated by a screen touch event, and the output of the first model is a confidence level characterizing whether the screen touch event is a finger joint tapping event.
In one embodiment, the confidence threshold for the first model to identify a finger joint tap event is less than the confidence threshold for the second model to identify a finger joint tap event; the second model is used for recognizing the finger joint tapping event in the terminal screen-on state.
According to a second aspect of the embodiments of the present disclosure, there is provided an application control apparatus, applied to a terminal, including:
a detection unit configured to detect a screen touch event when the terminal is in a screen-off state; and a processing unit configured to start a preset application based on the screen touch event when the detection unit determines that a screen touch event is detected.
In one embodiment, the detection unit determines that a screen touch event is detected by:
determining that a first screen touch event triggering the terminal to light a screen is detected when the terminal is in a screen-off state, and detecting a second screen touch event after the screen is lighted.
In one embodiment, the detection unit determines that a first screen touch event triggering the terminal to light up the screen is detected when the terminal is in the screen-off state, and detects a second screen touch event after the screen is lit up, in the following manner:
if at least two finger joint tapping events are detected when the terminal is in a screen-off state, determining that a first screen touch event triggering the terminal to light up the screen is detected; and determining that a second screen touch event is detected if a screen touch event is detected within a first time threshold after the first screen touch event is detected and the detected touch event is identified as a finger joint tapping event.
In one embodiment, the detection unit detects at least two finger joint tapping events when the terminal is in the screen-off state in the following manner:
recording the time difference and the pixel position difference of two continuous screen touch events detected when the terminal is in a screen-off state, and determining the touch force of each detected screen touch event; and if the time difference is smaller than a second time threshold, the pixel position difference is smaller than a pixel threshold, and the touch force is larger than a force threshold, determining that two continuous finger joint tapping events are detected.
In one embodiment, the processing unit is further configured to:
after the detection unit detects the first screen touch event, lighting up the screen and controlling the terminal to enter a screen locking state; if the detection unit does not detect a screen touch event within a first time threshold after the first screen touch event is detected, keeping the screen locking state and continuing to detect screen touch events; if the detection unit detects a screen touch event within a third time threshold, keeping the screen locking state and identifying whether the detected touch event is a finger joint tapping event; and if the detection unit does not detect a screen touch event within the third time threshold, controlling the terminal to enter a screen-off state.
In one embodiment, the detection unit determines that a screen touch event is detected by:
recording the time difference and the pixel position difference of two consecutive screen touch events detected when the terminal is in a screen-off state; if the time difference is smaller than a second time threshold, the pixel position difference is smaller than a pixel threshold, and the two consecutive screen touch events are recognized as finger joint tapping events based on the first model, determining that a screen touch event is detected; wherein the input of the first model is a capacitance change value generated by a screen touch event, and the output of the first model is a confidence level characterizing whether the screen touch event is a finger joint tapping event.
In one embodiment, the confidence threshold for the first model to identify a finger joint tap event is less than the confidence threshold for the second model to identify a finger joint tap event; the second model is used for recognizing the finger joint tapping event in the terminal screen-on state.
According to a third aspect of the embodiments of the present disclosure, there is provided an application control apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: performing the method of the first aspect or any one of the embodiments of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having instructions stored therein, which when executed by a processor of a terminal, enable the terminal to perform the method of the first aspect or any one of the implementation manners of the first aspect.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: when the terminal is in the screen-off state, application start is controlled based on detection of a screen touch event, so that an application can be started based on a screen touch event in the screen-off state. This shortens the time and the number of steps needed to wake an application while the terminal screen is off, and improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating an application control method according to an exemplary embodiment.
FIG. 2 is a flow chart illustrating an application control method according to an exemplary embodiment.
Fig. 3 is an operational diagram illustrating the quick launch of an application in a screen-off state triggered by three finger joint tap events according to an exemplary embodiment of the present disclosure.
FIG. 4 is a flowchart illustrating an application control method according to an exemplary embodiment.
Fig. 5 is a schematic flowchart illustrating a process of starting a camera application in a mobile phone off-screen state according to an exemplary embodiment of the disclosure.
Fig. 6 is an operational diagram illustrating the quick launch of an application in a screen-off state triggered by successive finger joint tap events, according to an exemplary embodiment of the present disclosure.
Fig. 7 is a schematic flowchart illustrating a process of starting a camera application in a mobile phone screen-off state according to an exemplary embodiment of the disclosure.
FIG. 8 is a block diagram illustrating an application control device according to an exemplary embodiment.
FIG. 9 is a block diagram illustrating an apparatus for application control in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure.
The application control method provided by the present disclosure is applied to scenarios in which an application is started on a terminal. The related art supports quickly starting an application through a shortcut gesture when the terminal screen is on. For example, tapping twice with a finger joint can invoke a screen-capture shortcut or open the camera, improving both the opening speed of applications and the user experience. However, in the screen-off state, if a user needs to open an application, the user must first trigger the terminal to light up the screen; if the screen is locked, the user must also unlock the terminal, enter the terminal display interface, and start the desired application from that interface.
In view of this, the present disclosure provides a method for controlling application start based on detecting a screen touch event when a terminal is in a screen-off state, so as to start an application in the screen-off state.
Fig. 1 is a flowchart illustrating an application control method according to an exemplary embodiment. As shown in Fig. 1, the application control method is used in a terminal and includes the following steps.
In step S11, a screen touch event is detected while the terminal is in the off-screen state.
In step S12, in response to determining that the screen touch event is detected, the preset application is started based on the screen touch event.
In the present disclosure, when the terminal is in the screen-off state, if a screen touch event is detected, the preset application is started based on the screen touch event, so that the preset application is started in the screen-off state.
The preset application referred to in the present disclosure may be a preconfigured application associated with a particular touch event. Upon detecting a screen touch event, the application associated with the detected screen touch event is launched.
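This association can be sketched as a simple lookup table. The gesture names and application identifiers below are illustrative assumptions, not values given in the disclosure:

```python
from typing import Optional

# Hypothetical mapping from a recognized screen touch gesture to the
# preset application associated with it. Names are placeholders.
PRESET_APP_FOR_GESTURE = {
    "triple_knuckle_tap": "com.example.camera",
    "double_knuckle_tap_then_hold": "com.example.flashlight",
}

def launch_for_gesture(gesture: str) -> Optional[str]:
    """Return the application associated with a gesture, or None if the
    gesture has no preset application and nothing should be launched."""
    return PRESET_APP_FOR_GESTURE.get(gesture)
```

A gesture with no entry in the table simply yields no launch, which matches the requirement that only preset, associated applications are started.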
In order to avoid mis-operations that start an application in the screen-off state, the conditions for determining that a screen touch event is detected can be made stricter.
In practical applications, a user who wants to open an application usually operates it with the screen lit. Therefore, in an embodiment of the present disclosure, the screen may be lit up after a screen touch event is detected in the screen-off state; if a screen touch event continues to be detected after the screen is lit up, it may be determined that a screen touch event for triggering the start of the preset application is detected.
For convenience of description in this disclosure, a screen touch event that is detected in the screen-off state and used to trigger the terminal to light up the screen is referred to as a first screen touch event, and a screen touch event that is subsequently detected in the screen-on state and used to trigger the start of the preset application is referred to as a second screen touch event.
Fig. 2 is a flowchart illustrating a method of determining that a screen touch event is detected, according to an exemplary embodiment, and as shown in fig. 2, the application control method is applied to a terminal, including the following steps.
In step S21, a first screen touch event that triggers the terminal to light the screen is detected while the terminal is in the off-screen state.
In step S22, if it is determined that the first screen touch event triggering the terminal to light the screen is detected while the terminal is in the off-screen state, the screen is lit, and a second screen touch event is detected.
In step S23, if a second screen touch event is detected after the screen is lit, it is determined that the screen touch event for triggering the start of the preset application is detected.
To further improve the accuracy of starting an application based on a screen touch event in the screen-off state, the screen touch event may be restricted to finger joint (knuckle) tapping events, so that the application is started based on knuckle taps detected in the screen-off state.
In the present disclosure, the screen touch event that triggers the terminal to light up the screen in the screen-off state may be defined as at least two consecutive finger joint tap events. For example, two consecutive finger joint tap events trigger the screen to light up. In other words, if at least two finger joint tap events are detected while the terminal is in the screen-off state, it is determined that a first screen touch event triggering the terminal to light up the screen is detected.
Further, in the present disclosure, the second screen touch event detected after the screen is lit up may also be defined as a finger joint tap event, so that the user can quickly start the application through a series of consecutive finger joint taps. In one example, it is determined that a second screen touch event is detected if a screen touch event is detected within a first time threshold after the first screen touch event and the detected touch event is identified as a finger joint tap event.
In the present disclosure, in order to avoid false touches, it may be required that at least two finger joint tap events are detected in the screen-off state. In one example, in the terminal screen-off state, application start may be triggered by three finger joint tap events.
Based on the particular characteristics of human finger joints, the related art supports, in the screen-on state, judging whether the user used an ordinary finger touch or a finger joint tap according to the capacitance data corresponding to a touch event detected by the screen sensor. However, in the screen-off state the screen sensor is in a low-power state and its accuracy is low; if the terminal's sensor does not support acquiring capacitance values, finger joint tap events in the screen-off state cannot be identified through the algorithm model. Thus, in the screen-off state the user cannot quickly invoke applications with a double knuckle tap as in the screen-on state.
In view of this, the present disclosure provides a method for quickly starting an application by finger joint taps when capacitance values cannot be obtained in the screen-off state. The following embodiments describe an implementation for recognizing finger joint tap events in the screen-off state.
In one embodiment of the present disclosure, when a screen touch event is detected, the touch force of the detected screen touch event may be determined, and whether the event is a finger joint tap event is determined based on that touch force. If the touch force is greater than a force threshold, the detected touch event is determined to be a finger joint tap event.
In one embodiment of the present disclosure, the time difference and the position difference between two successive touch events may be calculated based on the detected position, time, and force of each screen touch event, and the continuity of the two touch events is determined from these differences. If the time difference between the two touch events is smaller than a second time threshold and the position difference between them is smaller than a position threshold, the two touch events are determined to be consecutive. The second time threshold may be set to 0.5 seconds, and the position threshold may be set to 20 pixels.
In one embodiment of the present disclosure, the time difference and the pixel position difference of two successive screen touch events detected while the terminal is in the screen-off state are recorded, and the touch force of each detected screen touch event is determined. If the time difference between the two successive screen touch events is smaller than the second time threshold, the pixel position difference is smaller than the pixel threshold, and the touch force of both events is larger than the force threshold, it is determined that two consecutive finger joint tap events are detected.
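The consecutive-tap check described here can be sketched as follows. This is a minimal illustration: the 0.5-second and 20-pixel values follow the embodiment described later, while the force threshold and the event structure are assumptions:

```python
import math
from dataclasses import dataclass

@dataclass
class TouchEvent:
    t: float      # timestamp in seconds
    x: int        # touch position in pixels
    y: int
    force: float  # touch force reported by the screen sensor

# Illustrative thresholds; 0.5 s and 20 px follow the example embodiment,
# the force threshold is a placeholder in arbitrary sensor units.
SECOND_TIME_THRESHOLD_S = 0.5
PIXEL_THRESHOLD = 20
FORCE_THRESHOLD = 1.0

def is_consecutive_knuckle_pair(a: TouchEvent, b: TouchEvent) -> bool:
    """True if two successive screen touch events qualify as two
    consecutive finger joint (knuckle) tap events: close in time,
    close in position, and both forceful enough."""
    time_diff = b.t - a.t
    pos_diff = math.hypot(b.x - a.x, b.y - a.y)
    return (time_diff < SECOND_TIME_THRESHOLD_S
            and pos_diff < PIXEL_THRESHOLD
            and a.force > FORCE_THRESHOLD
            and b.force > FORCE_THRESHOLD)
```

Events that are too far apart in time or position, or too light to be knuckle taps, fail the check and are treated as ordinary touches or false contacts.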
In the present disclosure, after two consecutive finger joint tap events are detected, it is determined that a first screen touch event triggering the terminal to light up the screen is detected; the screen is lit up and the terminal enters the screen-on state. If a screen touch event is detected within a first time threshold after the first screen touch event, whether the touch event is a finger joint tap event is identified using the second model, i.e., the recognition model used in the screen-on state. If the touch event is determined to be a finger joint tap event, the application is opened.
Fig. 3 is an operational diagram illustrating the quick launch of an application in the screen-off state triggered by three finger joint tap events, according to an exemplary embodiment of the present disclosure. Referring to Fig. 3, in the screen-off state the user performs two consecutive finger joint taps; after the terminal detects the two finger joint tap events, the terminal screen is lit up and the terminal is in the lock screen state. The user then taps with the finger joint once more, and the terminal detects the finger joint tap event and starts the application.
In the present disclosure, after the terminal detects at least two consecutive finger joint tap events and lights up the screen, it continues to determine whether a screen touch event is detected; if no screen touch event for triggering application start is detected, the terminal may be controlled to enter the screen-off state.
Fig. 4 is a flowchart illustrating an application control method according to an exemplary embodiment. As shown in Fig. 4, the application control method is used in a terminal and includes the following steps.
In step S31, the screen is lit and the terminal is controlled to enter a lock screen state after the first screen touch event is detected.
In step S32, if the screen touch event is not detected within a first time threshold after the first screen touch event is detected, the lock screen state is maintained, and the screen touch event continues to be detected.
In step S33, if a screen touch event is detected within a third time threshold, the lock screen state is maintained and whether the detected touch event is a finger joint tap event is identified.
In step S34, if no screen touch event is detected within the third time threshold, the terminal is controlled to enter the screen-off state.
In the present disclosure, after the first screen touch event is detected, if no screen touch event is detected within the first time threshold, the lock screen state is maintained and screen touch events continue to be detected. If a screen touch event is then detected within the third time threshold, the lock screen state is maintained and whether the detected touch event is a finger joint tap event is identified. If no screen touch event is detected within the third time threshold, the terminal is controlled to enter the screen-off state.
In one embodiment of the present disclosure, the terminal is a mobile phone and the application to be started is a camera application; the first time threshold and the second time threshold are set to 0.5 seconds, the position threshold is set to 20 pixels, and the third time threshold is set to 10 seconds. Fig. 5 is a schematic flowchart illustrating the process of starting a camera application in the mobile phone screen-off state according to an exemplary embodiment of the disclosure. When the mobile phone is in the screen-off state, the time and touch position of each screen touch event are recorded. It is then determined whether the time difference between two consecutive screen touch events is less than 0.5 seconds and whether the position difference is less than 20 pixels. If the time difference is less than 0.5 seconds and the position difference is less than 20 pixels, the screen is lit up and the mobile phone enters the lock screen state. If the time difference is greater than or equal to 0.5 seconds and/or the position difference is greater than or equal to 20 pixels, no operation is performed. After the mobile phone lights up the screen, it detects whether a new screen touch event occurs within 0.5 seconds and whether the position difference between the new event and the most recent touch event is less than 20 pixels. If a new screen touch event is detected within 0.5 seconds and the position difference is less than 20 pixels, whether the current screen touch event is a finger joint tap event is identified using the finger joint tap recognition model of the screen-on state. If a finger joint tap event is identified, the shortcut is triggered and the camera is opened.
If no new screen touch event is detected within 0.5 seconds, or the position of a detected new screen touch event differs from that of the most recent touch event by 20 pixels or more, the mobile phone remains in the screen lock state. Likewise, if a new screen touch event is detected within 0.5 seconds with a position difference of less than 20 pixels, but the event is not a finger joint tap event, the screen lock state is maintained. In the screen lock state, the mobile phone continues to detect new screen touch events: if a new screen touch event is detected within 10 seconds, the screen lock state is kept; if none is detected within 10 seconds, the mobile phone enters the screen-off state.
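The double-tap-to-wake flow above can be sketched as a small state machine. This is an illustrative Python sketch only: the function names, the event representation as `{"t", "x", "y"}` dictionaries, and the `is_knuckle_tap` callable standing in for the recognition model are all assumptions, not part of the disclosure; the threshold values are those of the embodiment.

```python
# Thresholds taken from the embodiment: 0.5 s windows, 20 px, 10 s idle timeout.
FIRST_TIME_THRESHOLD = 0.5    # window for a follow-up touch after the screen lights up
SECOND_TIME_THRESHOLD = 0.5   # max gap between the two taps that wake the screen
POSITION_THRESHOLD = 20       # max pixel distance between consecutive touches
THIRD_TIME_THRESHOLD = 10.0   # lock-screen idle timeout before turning the screen off

def is_consecutive(prev, curr):
    """Two touch events are consecutive when close in both time and position."""
    dt = curr["t"] - prev["t"]
    dpos = max(abs(curr["x"] - prev["x"]), abs(curr["y"] - prev["y"]))
    return dt < SECOND_TIME_THRESHOLD and dpos < POSITION_THRESHOLD

def handle_events(events, is_knuckle_tap):
    """Replay time-ordered touch events {"t", "x", "y"} through the Fig. 5 flow.

    `is_knuckle_tap` stands in for the recognition model.  Returns the final
    state: "OFF", "LOCKED", or "CAMERA_LAUNCHED".
    """
    state, prev, wake_time = "OFF", None, None
    for ev in events:
        if state == "OFF":
            if prev is not None and is_consecutive(prev, ev):
                state, wake_time = "LOCKED", ev["t"]   # light screen, enter lock state
        elif state == "LOCKED":
            if ev["t"] - wake_time >= THIRD_TIME_THRESHOLD:
                state = "OFF"                          # idle timeout: back to screen-off
            elif (ev["t"] - wake_time < FIRST_TIME_THRESHOLD
                  and is_consecutive(prev, ev)
                  and is_knuckle_tap(ev)):
                return "CAMERA_LAUNCHED"               # shortcut launch
        prev = ev
    return state
```

For example, two close taps followed within 0.5 seconds by a recognized knuckle tap reach `"CAMERA_LAUNCHED"`, while two taps with no follow-up leave the phone in `"LOCKED"`.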
In another embodiment of the present disclosure, if the terminal is equipped with a sensor that supports collecting capacitance data in the screen-off state, it can be determined from the capacitance data of a touch event detected by the screen sensor whether the user performed an ordinary finger touch or a finger joint tap while the screen was off. Because the screen sampling rate is reduced in the screen-off state, the numerical accuracy of the capacitance data is lower. Therefore, in the present disclosure a neural network model for recognizing finger joint tap events in the screen-off state, hereinafter referred to as the first model, is retrained. The input to the first model is the capacitance change value produced by a screen touch event, and its output is a confidence level characterizing whether the screen touch event is a finger joint tap event.
Considering that fewer touch events need to be processed in the screen-off state, a relatively large neural network, such as AlexNet, may be adopted, and the confidence threshold for determining that a touch event is a finger joint tap event may be appropriately reduced compared with the model that recognizes finger joint tap events in the screen-on state, hereinafter referred to as the second model. That is, the confidence threshold at which the first model identifies a finger joint tap event is less than the confidence threshold at which the second model identifies one. For example, the second model's confidence threshold is 0.5 and the first model's is 0.4.
In the present disclosure, in the screen-off state, after a touch event changes the screen capacitance, the screen capacitance values are collected and input to the first model, which outputs the confidence that the touch event is a finger joint tap event. If the confidence exceeds the threshold of 0.4, the detected touch event is determined to be a finger joint tap event; otherwise, it is determined to be an ordinary touch event.
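The two-threshold decision can be sketched as follows. This is a hedged sketch: `classify_touch` and the `model` callable (standing in for the trained neural network mapping a capacitance frame to a confidence in [0, 1]) are illustrative names; only the 0.4/0.5 thresholds come from the embodiment.

```python
# Confidence thresholds from the embodiment: the screen-off ("first") model uses
# a lower threshold than the screen-on ("second") model, compensating for the
# reduced capacitance sampling accuracy while the screen is off.
FIRST_MODEL_THRESHOLD = 0.4   # screen-off recognition
SECOND_MODEL_THRESHOLD = 0.5  # screen-on recognition

def classify_touch(capacitance_frame, model, screen_on):
    """Return True if the touch is classified as a finger joint tap.

    `model` stands in for the trained network: it maps a frame of
    capacitance change values to a confidence level in [0, 1].
    """
    confidence = model(capacitance_frame)
    threshold = SECOND_MODEL_THRESHOLD if screen_on else FIRST_MODEL_THRESHOLD
    return confidence > threshold
```

A confidence of, say, 0.45 would thus count as a finger joint tap in the screen-off state but as an ordinary touch in the screen-on state.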
In the present disclosure, in order to avoid false touches, it may be required that at least two finger joint tap events are detected in the screen-off state before the application is started.
Fig. 6 is an operational diagram illustrating the quick launch of an application in the screen-off state, triggered by consecutive finger joint tap events, according to an exemplary embodiment of the present disclosure. Referring to Fig. 6, in the screen-off state the user performs two consecutive finger joint taps, and the terminal starts the application after detecting the two finger joint tap events.
In one embodiment of the present disclosure, the time difference and position difference between two consecutive touch events may be calculated from the detected position and time of each screen touch event, and the continuity of the two touch events determined from them: if the time difference is less than the second time threshold and the position difference is less than the position threshold, the two touch events are determined to be consecutive.
In one embodiment of the present disclosure, the time difference and pixel position difference of two consecutive screen touch events detected while the terminal is in the screen-off state are recorded. If the time difference is less than the second time threshold, the pixel position difference is less than the pixel threshold, and the touch events are identified as finger joint tap events by the first model, it is determined that two consecutive finger joint tap events have been detected. Here the second time threshold is set to 0.5 seconds and the position threshold to 20 pixels.
In this embodiment of the present disclosure, the terminal is a mobile phone, the launched application is a camera application, the first and second time thresholds are set to 0.5 seconds, and the position threshold is set to 20 pixels. Fig. 7 is a schematic flowchart illustrating the launch of a camera application from the mobile phone's screen-off state according to an exemplary embodiment of the present disclosure. Referring to Fig. 7, the mobile phone records the time and position of each screen touch event. When a screen touch event is detected, it is first judged whether it is a finger joint tap event. If it is, it is calculated, from the recorded positions and times, whether the time difference between this event and the previous screen touch event is less than 0.5 seconds and whether the position difference is less than 20 pixels. If the time difference is greater than or equal to 0.5 seconds and/or the position difference is greater than or equal to 20 pixels, no operation is performed. If the time difference is less than 0.5 seconds and the position difference is less than 20 pixels, it is judged, using the first model and its confidence threshold, whether the second screen touch event is also a finger joint tap event. If it is, the shortcut is triggered and the camera application is opened; if not, no operation is performed.
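The double-knuckle-tap detection of Figs. 6 and 7 can be sketched as below. The function name, the event dictionaries, and the `is_knuckle_tap` callable (standing in for the first model plus its confidence threshold) are illustrative assumptions; resetting the sequence on an ordinary touch is one plausible reading of the flow, not stated explicitly in the disclosure.

```python
def detect_double_knuckle_tap(events, is_knuckle_tap,
                              time_threshold=0.5, position_threshold=20):
    """Return True once two consecutive finger joint taps are detected.

    Each event is a dict {"t": seconds, "x": px, "y": px}; `is_knuckle_tap`
    stands in for the first model plus its confidence threshold.
    """
    prev = None
    for ev in events:
        if not is_knuckle_tap(ev):
            prev = None               # an ordinary touch breaks the sequence
            continue
        if prev is not None:
            dt = ev["t"] - prev["t"]
            close = (abs(ev["x"] - prev["x"]) < position_threshold and
                     abs(ev["y"] - prev["y"]) < position_threshold)
            if dt < time_threshold and close:
                return True           # two consecutive knuckle taps: launch
        prev = ev
    return False
```

Two knuckle taps 0.3 seconds apart at nearly the same position would trigger the launch; the same taps a full second apart, or taps not recognized as knuckle events, would not.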
According to the application control method provided by the embodiments of the present disclosure, an application can be quickly launched via finger joint tap events while the terminal is in the screen-off state, improving the user experience.
It should be noted that, as those skilled in the art will understand, the various embodiments/examples of the present disclosure may be used in combination with the foregoing embodiments or independently; in either case the implementation principles are similar. Some examples in this disclosure are described in the combined mode; of course, those skilled in the art will appreciate that this is not a limitation of the disclosed embodiments.
Based on the same concept, an embodiment of the present disclosure further provides an application control device.
It is understood that, in order to implement the above functions, the application control device provided in the embodiments of the present disclosure includes hardware structures and/or software modules corresponding to the respective functions. In combination with the exemplary units and algorithm steps disclosed herein, the embodiments of the present disclosure can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
FIG. 8 is a block diagram illustrating an application control device according to an exemplary embodiment. Referring to fig. 8, the apparatus 100 may be provided as the terminal according to the above embodiment, and includes a detection unit 101 and a processing unit 102.
The detection unit 101 is configured to detect a screen touch event when the terminal is in the screen-off state. The processing unit 102 is configured to start a preset application based on the screen touch event when the detection unit determines that a screen touch event is detected.
In one embodiment, the detection unit 101 determines that a screen touch event is detected in the following manner:
determining that a first screen touch event triggering the terminal to light a screen is detected when the terminal is in a screen-off state, and detecting a second screen touch event after the screen is lighted.
In one embodiment, the detecting unit 101 determines that a first screen touch event triggering the terminal to light the screen is detected when the terminal is in the screen-off state, and detects a second screen touch event after the screen is lit up, by:
if at least two finger joint tapping events are detected when the terminal is in a screen-off state, determining that a first screen touch event triggering the terminal to light up a screen is detected; if a screen touch event is detected within a first time threshold after the first screen touch event is detected and the detected touch event is identified as a finger joint tap event, then it is determined that a second screen touch event is detected.
In one embodiment, the detection unit 101 detects at least two finger joint tap events when the terminal is in the off-screen state as follows:
recording the time difference and pixel position difference of two consecutive screen touch events detected while the terminal is in the screen-off state, and determining the touch force of each detected screen touch event; and if the time difference is less than a second time threshold, the pixel position difference is less than a pixel threshold, and the touch force is greater than a force threshold, determining that two consecutive finger joint tap events are detected.
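This triple condition (time, position, and force all within thresholds) can be sketched as a single predicate. The function name and the `"force"` field are illustrative, and the force threshold value is an assumption: the disclosure names a force threshold but does not give a number.

```python
def is_double_knuckle_tap(e1, e2,
                          time_threshold=0.5,
                          position_threshold=20,
                          force_threshold=1.0):
    """Two touch events count as consecutive finger joint taps when their
    time difference and pixel position difference are below thresholds and
    each event's touch force exceeds the force threshold.  The force
    threshold value (1.0) is an assumed placeholder, not from the disclosure.
    """
    dt = e2["t"] - e1["t"]
    dpos = max(abs(e2["x"] - e1["x"]), abs(e2["y"] - e1["y"]))
    return (dt < time_threshold
            and dpos < position_threshold
            and e1["force"] > force_threshold
            and e2["force"] > force_threshold)
```

The force condition is what distinguishes this embodiment from the model-based one: a light double touch close in time and space is still rejected if either tap is below the force threshold.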
In one embodiment, the processing unit 102 is further configured to:
the method comprises the steps of lighting up a screen and controlling the terminal to enter a screen locking state after a first screen touch event is detected by a detection unit 101; if the detection unit 101 does not detect the screen touch event within a first time threshold after detecting the first screen touch event, the screen locking state is maintained, and the screen touch event continues to be detected; if the detection unit 101 detects a screen touch event within a third time threshold, maintaining the screen locking state and identifying whether the detected touch event is a finger joint tapping event; if the detection unit 101 does not detect the screen touch event within the third time threshold, the control terminal enters a screen-off state.
In one embodiment, the detection unit 101 determines that a screen touch event is detected in the following manner:
recording the time difference and pixel position difference of two consecutive screen touch events detected while the terminal is in the screen-off state; if the time difference is less than a second time threshold, the pixel position difference is less than a pixel threshold, and the two consecutive screen touch events are identified as finger joint tap events based on the first model, determining that a screen touch event is detected; wherein the input of the first model is the capacitance change value produced by a screen touch event, and the output of the first model is a confidence level characterizing whether the screen touch event is a finger joint tap event.
In one embodiment, the confidence threshold for the first model to identify a finger joint tap event is less than the confidence threshold for the second model to identify a finger joint tap event; the second model is used for recognizing the finger joint tapping event in the terminal screen-on state.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
FIG. 9 is a block diagram illustrating an apparatus 200 for application control according to an exemplary embodiment. The apparatus 200 may be provided as the terminal of the above embodiments, for example a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, or a personal digital assistant.
Referring to fig. 9, the apparatus 200 may include one or more of the following components: a processing component 202, a memory 204, a power component 206, a multimedia component 208, an audio component 210, an input/output (I/O) interface 212, a sensor component 214, and a communication component 216.
The processing component 202 generally controls overall operation of the device 200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 202 may include one or more processors 220 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 202 can include one or more modules that facilitate interaction between the processing component 202 and other components. For example, the processing component 202 can include a multimedia module to facilitate interaction between the multimedia component 208 and the processing component 202.
The memory 204 is configured to store various types of data to support operations at the apparatus 200. Examples of such data include instructions for any application or method operating on the device 200, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 204 may be implemented by any type or combination of volatile and non-volatile storage devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 206 provide power to the various components of device 200. Power components 206 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 200.
The multimedia component 208 includes a screen that provides an output interface between the device 200 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 208 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 200 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 210 is configured to output and/or input audio signals. For example, audio component 210 includes a Microphone (MIC) configured to receive external audio signals when apparatus 200 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 204 or transmitted via the communication component 216. In some embodiments, audio component 210 also includes a speaker for outputting audio signals.
The I/O interface 212 provides an interface between the processing component 202 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 214 includes one or more sensors for providing various aspects of status assessment for the device 200. For example, the sensor assembly 214 may detect the open/closed state of the device 200 and the relative positioning of components, such as the display and keypad of the device 200; it may also detect a change in the position of the device 200 or of a component of the device 200, the presence or absence of user contact with the device 200, the orientation or acceleration/deceleration of the device 200, and changes in the temperature of the device 200. The sensor assembly 214 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 216 is configured to facilitate wired or wireless communication between the apparatus 200 and other devices. The device 200 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 216 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 216 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as memory 204, comprising instructions executable by processor 220 of device 200 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the scope of the appended claims.

Claims (16)

1. An application control method is applied to a terminal, and comprises the following steps:
detecting a screen touch event when the terminal is in a screen-off state;
in response to determining that a screen touch event is detected, starting a preset application based on the screen touch event.
2. The application control method of claim 1, wherein the determining that a screen touch event is detected comprises:
when the terminal is in a screen-off state, a first screen touch event triggering the terminal to light a screen is determined to be detected, and a second screen touch event is detected after the screen is lighted.
3. The application control method according to claim 2, wherein the determining that the first screen touch event triggering the terminal to light the screen is detected when the terminal is in the screen-off state and the second screen touch event is detected after the screen is lit up comprises:
if at least two finger joint tapping events are detected when the terminal is in a screen-off state, determining that a first screen touch event triggering the terminal to light up a screen is detected;
determining that a second screen touch event is detected if a screen touch event is detected within a first time threshold after the first screen touch event is detected and the detected touch event is identified as a finger joint tapping event.
4. The application control method according to claim 3, wherein the detecting at least two finger joint tap events while the terminal is in the off-screen state comprises:
recording the time difference and the pixel position difference of two continuous screen touch events detected when the terminal is in a screen-off state, and determining the touch force of each detected screen touch event;
and if the time difference is smaller than a second time threshold, the pixel position difference is smaller than a pixel threshold, and the touch force is larger than a force threshold, determining that two continuous finger joint tapping events are detected.
5. The method according to any one of claims 2 to 4, further comprising:
lighting up a screen and controlling the terminal to enter a screen locking state after the first screen touch event is detected;
if the screen touch event is not detected within a first time threshold value after the first screen touch event is detected, maintaining the screen locking state, and continuously detecting the screen touch event;
if a screen touch event is detected within a third time threshold, maintaining the screen locking state and identifying whether the detected touch event is a finger joint tapping event;
and if the screen touch event is not detected within the third time threshold, controlling the terminal to enter a screen-off state.
6. The application control method of claim 1, wherein the determining that a screen touch event is detected comprises:
recording the time difference and the pixel position difference of two continuous screen touch events detected when the terminal is in a screen-off state;
if the time difference is smaller than a second time threshold, the pixel position difference is smaller than a pixel threshold, and the two continuous screen touch events are recognized to be finger joint tapping events based on a first model, determining that the screen touch event is detected;
the input of the first model is a capacitance change value generated by a screen touch event, and the output of the first model is a confidence coefficient for representing that the screen touch event is a finger joint tapping event.
7. The application control method according to claim 6, wherein the confidence threshold for the first model to identify a finger joint tap event is less than the confidence threshold for the second model to identify a finger joint tap event;
the second model is used for recognizing the finger joint tapping event in the terminal screen-on state.
8. An application control apparatus, comprising:
the terminal comprises a detection unit, a display unit and a control unit, wherein the detection unit is used for detecting a screen touch event when the terminal is in a screen-off state;
and the processing unit is used for starting a preset application based on the screen touch event under the condition that the detection unit determines that the screen touch event is detected.
9. The application control device according to claim 8, wherein the detection unit determines that a screen touch event is detected by:
determining that a first screen touch event triggering the terminal to light a screen is detected when the terminal is in a screen-off state, and detecting a second screen touch event after the screen is lighted.
10. The application control device according to claim 9, wherein the detecting unit determines that a first screen touch event triggering the terminal to light the screen is detected when the terminal is in the screen-off state, and that a second screen touch event is detected after the screen is lit, in the following manner:
if at least two finger joint tapping events are detected when the terminal is in a screen-off state, determining that a first screen touch event triggering the terminal to light up a screen is detected;
determining that a second screen touch event is detected if a screen touch event is detected within a first time threshold after the first screen touch event is detected and the detected touch event is identified as a finger joint tapping event.
11. The application control device according to claim 10, wherein the detection unit determines that at least two finger joint tap events are detected while the terminal is in the off-screen state by:
recording the time difference and the pixel position difference of two continuous screen touch events detected when the terminal is in a screen-off state, and determining the touch force of each detected screen touch event;
and if the time difference is smaller than a second time threshold, the pixel position difference is smaller than a pixel threshold, and the touch force is larger than a force threshold, determining that two continuous finger joint tapping events are detected.
12. The apparatus according to any one of claims 9 to 11, wherein the processing unit is further configured to:
after the detection unit detects the first screen touch event, lighting up a screen and controlling the terminal to enter a screen locking state;
if the screen touch event is not detected within a first time threshold value after the first screen touch event is detected by the detection unit, keeping the screen locking state and continuously detecting the screen touch event;
if the detection unit detects a screen touch event within a third time threshold, keeping the screen locking state and identifying whether the detected touch event is a finger joint tapping event;
and if the detection unit does not detect the screen touch event within the third time threshold, controlling the terminal to enter a screen-off state.
13. The application control device according to claim 8, wherein the detection unit determines that a screen touch event is detected by:
recording the time difference and the pixel position difference of two continuous screen touch events detected when the terminal is in a screen-off state;
if the time difference is smaller than a second time threshold, the pixel position difference is smaller than a pixel threshold, and the two continuous screen touch events are recognized to be finger joint tapping events based on a first model, determining that the screen touch event is detected;
the input of the first model is a capacitance change value generated by a screen touch event, and the output of the first model is confidence coefficient for representing that the screen touch event is a finger joint tapping event.
14. The application control device of claim 13, wherein the confidence threshold for the first model to identify a finger joint tap event is less than the confidence threshold for the second model to identify a finger joint tap event;
the second model is used for recognizing the finger joint tapping event in the terminal screen-on state.
15. An application control apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: performing the method of any one of claims 1 to 7.
16. A computer-readable storage medium having stored therein instructions, which when executed by a processor of a mobile terminal, enable the mobile terminal to perform the method of any one of claims 1 to 7.
CN202210085806.XA 2022-01-25 2022-01-25 Application control method, application control device and storage medium Active CN114567696B (en)

Priority application: CN202210085806.XA, filed 2022-01-25 — Application control method, application control device and storage medium.

Publications: CN114567696A, published 2022-05-31; CN114567696B (granted), published 2024-07-30. Family ID: 81713384. Country: CN.

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117389454A (en) * 2022-07-01 2024-01-12 荣耀终端有限公司 Finger joint operation identification method and electronic equipment

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013065289A (en) * 2011-08-29 2013-04-11 Kyocera Corp Device, method, and program
CN103064606A (en) * 2012-12-24 2013-04-24 Tianjin Samsung Optoelectronics Co., Ltd. Screen unlocking method for mobile terminal
WO2013166878A1 (en) * 2012-05-07 2013-11-14 Huizhou TCL Mobile Communication Co., Ltd. Touch screen device based application starting method and corresponding touch screen device
WO2014131268A1 (en) * 2013-02-27 2014-09-04 TCL Communication (Ningbo) Co., Ltd. Method and system for unlocking screen of mobile phone
CN104238916A (en) * 2014-09-16 2014-12-24 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Application or application function starting method of mobile terminal, and mobile terminal
CN104991789A (en) * 2015-06-02 2015-10-21 Xiaomi Inc. Method and apparatus for starting application
CN105094565A (en) * 2015-08-14 2015-11-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method for quickly checking time and mobile terminal
CN105549781A (en) * 2015-12-11 2016-05-04 Meizu Technology (China) Co., Ltd. Application starting method and terminal device
CN105867818A (en) * 2016-03-30 2016-08-17 LeTV Holding (Beijing) Co., Ltd. Terminal interaction control device
CN106569717A (en) * 2016-11-03 2017-04-19 Nubia Technology Co., Ltd. Mobile terminal and application starting method
CN106775364A (en) * 2015-11-20 2017-05-31 Beijing Qihoo Technology Co., Ltd. Method and terminal for quickly starting a camera
CN106843672A (en) * 2016-12-20 2017-06-13 Nubia Technology Co., Ltd. Terminal screen-locking operation device and method
CN107977158A (en) * 2017-12-28 2018-05-01 TCL Mobile Communication Technology (Ningbo) Co., Ltd. Mobile terminal, touch-screen double-click processing method, and storage medium
US20180367656A1 (en) * 2017-06-15 2018-12-20 LG Electronics Inc. Mobile terminal and method for controlling the same
CN109814799A (en) * 2019-01-25 2019-05-28 Vivo Mobile Communication Co., Ltd. Screen response control method and terminal device
CN110554815A (en) * 2018-05-30 2019-12-10 Beijing Xiaomi Mobile Software Co., Ltd. Icon awakening method, electronic device and storage medium
CN111258659A (en) * 2018-11-30 2020-06-09 Beijing Xiaomi Mobile Software Co., Ltd. Screen-off control method and device and storage medium
CN111475792A (en) * 2020-04-15 2020-07-31 Guangzhou Samsung Telecommunication Technology Research Co., Ltd. Method and device for quickly accessing lock screen application
CN112445410A (en) * 2020-12-07 2021-03-05 Beijing Xiaomi Mobile Software Co., Ltd. Touch event identification method and device and computer readable storage medium
WO2021068097A1 (en) * 2019-10-08 2021-04-15 Shenzhen Heytap Technology Co., Ltd. Black screen gesture processing method and apparatus, electronic device, and storage medium
WO2021218761A1 (en) * 2020-04-29 2021-11-04 Vivo Mobile Communication Co., Ltd. Information display method and electronic device
WO2021233284A1 (en) * 2020-05-20 2021-11-25 Vivo Mobile Communication Co., Ltd. Detection method and device, and electronic apparatus

Also Published As

Publication number Publication date
CN114567696B (en) 2024-07-30

Similar Documents

Publication Publication Date Title
CN106797416B (en) Screen control method and device
CN106357934B (en) Screen locking control method and device
CN106572299A (en) Camera switching-on method and device
CN111880757A (en) Screen projection method, screen projection device and storage medium
US10318069B2 (en) Method for controlling state of touch screen, and electronic device and medium for implementing the same
CN111813062A (en) Intelligent household equipment control method and device and storage medium
CN107566750A (en) Control method, device and the storage medium of flash lamp
CN114567696B (en) Application control method, application control device and storage medium
CN109905543B (en) Method and device for preventing misoperation of terminal
CN107734303B (en) Video identification method and device
CN106128415B (en) Screen luminance adjustment method and device
CN114185444A (en) Method and device for preventing mistaken touch of touch screen and storage medium
CN107948876B (en) Method, device and medium for controlling sound box equipment
CN111629104B (en) Distance determination method, distance determination device, and computer storage medium
CN112532789B (en) Ring tone processing method and device, terminal and storage medium
CN112905027A (en) Timing method and device, mobile terminal and storage medium
CN109408271B (en) Response method and device for sliding cover operation and storage medium
CN112462963A (en) Non-contact gesture control method and device and storage medium
US11402994B2 (en) Method and apparatus for calling shortcut function, and storage medium
CN111353427A (en) Fingerprint unlocking signal processing method, device and medium
CN108521503A Control method and device of terminal device
CN111694496B (en) Electronic equipment control method and device
CN110647281B (en) False touch prevention method and device and storage medium
CN109492365B (en) Fingerprint information identification method and device
WO2023245332A1 (en) Sensor module, electronic device, induction and recognition method, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant