CN117111726A - Equipment control method and device

Info

Publication number: CN117111726A
Application number: CN202310158268.7A
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 齐宁
Assignee (original and current): Honor Device Co Ltd
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Prior art keywords: preset, application, interface, information, eye movement

Classifications

    • G06F 3/013: Eye tracking input arrangements
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 9/543: User-generated data transfer, e.g. clipboards, dynamic data exchange [DDE], object linking and embedding [OLE]
    • H04L 51/04: Real-time or near real-time messaging, e.g. instant messaging [IM]

Abstract

The present application provides a device control method and apparatus, relates to the field of terminal technologies, and can improve the convenience of controlling a device. The method is applied to an electronic device and includes: the electronic device detects a preset eye movement operation through a camera; in response to the preset eye movement operation, the display mode of first preset information on a display screen is changed. Changing the display mode of the first preset information on the display screen includes one or more of the following: changing the size of the interface where the first preset information is located, changing the shape of that interface, moving that interface, and rotating that interface.

Description

Equipment control method and device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a device control method and apparatus.
Background
With the popularization and development of intelligent terminals, mobile phone software touching every aspect of daily life is continuously being developed and has brought great convenience: people spend a great deal of time on their phones every day for work communication, social interaction, entertainment, shopping, and so on. A user usually operates a mobile phone by touching its screen. However, this mode requires the user to perform complicated finger operations; that is, the related art provides only a single control mode for the mobile phone, resulting in a poor user experience.
Disclosure of Invention
The present application provides a device control method and apparatus that can control a device with the assistance of eye movement. This enriches the interactive control modes between the user and the device and improves the flexibility and convenience of using the device, thereby improving the user experience.
In a first aspect, the present application provides a device control method, including: an electronic device detects a preset eye movement operation through a camera; in response to the preset eye movement operation, the display mode of first preset information on a display screen is changed. Changing the display mode of the first preset information on the display screen includes one or more of the following: changing the size of the interface where the first preset information is located, changing the shape of that interface, moving that interface, and rotating that interface.
Based on the method provided by the present application, the display mode of the first preset information on the display screen can be changed under eye movement control, for example by resizing, reshaping, moving, or rotating the interface where the first preset information is located, so that the electronic device can accurately display the information the user actually wants to browse. This simplifies the user's operations and improves the convenience of using the mobile phone.
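As a minimal sketch of the first aspect (not the patent's implementation; the change names, the `InterfaceState` fields, and the transform parameters are assumptions), the four claimed display-mode changes can be modeled as transforms on the geometry of the interface holding the first preset information:

```python
from dataclasses import dataclass

@dataclass
class InterfaceState:
    """Illustrative geometry of the interface where the first preset information is located."""
    width: int
    height: int
    x: int = 0
    y: int = 0
    rotation_deg: int = 0

def apply_display_change(state: InterfaceState, change: str, **p) -> InterfaceState:
    """Apply one of the claimed display-mode changes in place."""
    if change == "resize":            # change the size of the interface
        state.width, state.height = p["width"], p["height"]
    elif change == "reshape":         # change the shape, here by swapping the aspect
        state.width, state.height = state.height, state.width
    elif change == "move":            # move the interface on the screen
        state.x, state.y = state.x + p["dx"], state.y + p["dy"]
    elif change == "rotate":          # rotate the interface
        state.rotation_deg = (state.rotation_deg + p["deg"]) % 360
    else:
        raise ValueError(f"unknown change: {change}")
    return state
```

A recognized preset eye movement operation would then dispatch to `apply_display_change` with the change configured for that operation.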
In a possible implementation of the first aspect, the first preset information is replaced with second preset information in response to the preset eye movement operation, where the second preset information includes part or all of the first preset information. On this basis, while the display mode of the first preset information on the display screen is being changed, the first preset information can be replaced with the second preset information, so that the information included on the interface can be dynamically adapted to the size, shape, and so on of the interface, enriching the ways information is displayed on the interface.
In another possible implementation of the first aspect, the preset eye movement operation includes one or more of the following: gazing continuously at the region where first sub-preset information is located for longer than a first preset duration; gazing continuously at the region where the camera is located for longer than a second preset duration; gazing continuously at the region where the camera is located while performing a preset manual operation; and gazing continuously at the region where the first sub-preset information is located while performing a preset manual operation. The first sub-preset information is information included on the interface where the first preset information is located. On this basis, the preset eye movement operation may target the camera or the first preset information, and may be combined with a preset manual operation to change the display mode of the first preset information, improving the flexibility of changing the interface where the first preset information is located through eye movement control.
In a possible implementation of the first aspect, the first sub-preset information is the first preset information itself, or the first sub-preset information is part of the content of the first preset information. On this basis, the gaze may be directed at the first preset information or at part of its content, increasing the diversity of eye movement operations.
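The gaze operations above are all dwell-time conditions: the gaze must remain inside one region for longer than a preset duration. A hedged sketch of such a detector (the sample format, region representation, and thresholds are assumptions, not taken from the patent):

```python
def detect_dwell(samples, regions, threshold_s):
    """samples: iterable of (timestamp_s, x, y) gaze points, in time order.
    regions: dict mapping a region name to its bounding box (x0, y0, x1, y1).
    Return the first region gazed at continuously for at least threshold_s, else None."""
    current, entered = None, None
    for t, x, y in samples:
        hit = next((name for name, (x0, y0, x1, y1) in regions.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit != current:                  # gaze moved to a different region: restart the timer
            current, entered = hit, t
        elif hit is not None and t - entered >= threshold_s:
            return hit                      # dwell threshold met
    return None
```

For example, `detect_dwell` returns `"camera"` once the track has stayed inside the camera region past the threshold, and `None` if the gaze keeps moving between regions.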
In a possible implementation of the first aspect, the first preset information includes any one of the following: incoming call notification information, short message notification information, alarm clock reminder information, video call information, and voice call information. On this basis, the display mode of various kinds of information on the display screen can be changed through eye movement, so that the information the user actually wants to browse can be accurately displayed, improving the convenience of the user's operations.
In a possible implementation of the first aspect, a first electronic device detects a first preset eye movement operation through a first camera, where the first electronic device includes the first camera and a first display screen; in response to the first preset eye movement operation, the first electronic device performs a target preset interaction action, where the first preset eye movement operation targets a first preset application on the first electronic device or preset information related to the first preset application.
Based on the method provided by the present application, the first electronic device can perform the target preset interaction action in response to the first preset eye movement operation targeting the first preset application or the preset information related to it. That is, eye movement control can assist manual control in completing preset interaction actions between the user and an application, which improves the convenience and enjoyment of the interaction between the user and the application as well as the interaction efficiency.
In a possible implementation of the first aspect, the first preset eye movement operation includes one or more of the following: gazing continuously at a target area for longer than the first preset duration; gazing continuously at the area where the first camera is located for longer than the second preset duration; gazing continuously at the area where the first camera is located while performing a preset manual operation on the first display screen; and gazing continuously at the target area while performing a preset manual operation on the first display screen. On this basis, the preset eye movement operation may target the camera or the target area, and may be combined with a preset manual operation to perform the target preset interaction action, improving the flexibility of triggering the target preset interaction action through eye movement control.
In a possible implementation of the first aspect, the target area is the area where the interface of the first preset application is located; or the area where part of the content on the interface of the first preset application is located; or the area where the preset information related to the first preset application is located. On this basis, the gaze may be directed at the interface of the first preset application, at part of the content on that interface, or at the preset information related to the first preset application, increasing the diversity of eye movement operations.
In a possible implementation of the first aspect, the first preset application is a video application, and the target preset interaction action includes one or more of the following: switching from the currently playing video to the next video, pausing the currently playing video, and changing the playback speed of the currently playing video. Alternatively, the first preset application is a music application, and the target preset interaction action includes one or more of the following: switching from the currently playing piece of music to the next piece, and pausing the currently playing piece of music. On this basis, interactions with different applications can be performed through eye movement operations, improving the convenience and flexibility of using applications and the efficiency with which the user uses different applications.
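As an illustration of the video-application case (the action names and the toy player model are assumptions made for this sketch, not the patent's API), each recognized gaze gesture maps to one media-control action:

```python
class GazeControlledPlayer:
    """Toy media player whose controls are driven by recognized gaze actions."""
    def __init__(self, playlist):
        self.playlist = list(playlist)
        self.index = 0
        self.paused = False
        self.speed = 1.0

    @property
    def current(self):
        return self.playlist[self.index]

    def on_gaze_action(self, action):
        if action == "next":           # switch to the next video or track
            self.index = min(self.index + 1, len(self.playlist) - 1)
        elif action == "pause":        # pause the current item
            self.paused = True
        elif action == "speed_up":     # change playback speed (video case)
            self.speed = min(self.speed * 2, 4.0)
```

The music-application case is the same mapping restricted to the `next` and `pause` actions.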
In a possible implementation of the first aspect, the preset information related to the first preset application is download progress information, and the target preset interaction action includes: when the download progress information indicates that the download of the first preset application is complete, opening the first preset application. Alternatively, the preset information related to the first preset application is incoming call notification information, and the target preset interaction action includes: answering the incoming call, and declining the incoming call. Alternatively, the preset information related to the first preset application is short message notification information, and the target preset interaction action includes: opening the short message notification information, and copying verification code information included in the short message notification information. On this basis, interactions with different applications can be performed through eye movement operations, improving the convenience and flexibility of using applications and the efficiency with which the user uses different applications.
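For the short-message case, copying the verification code amounts to extracting a digit run from the notification text and placing it on the clipboard. A sketch, assuming codes are 4 to 8 consecutive digits (the patent does not specify a code format, and a real implementation would call the platform clipboard service rather than a dict):

```python
import re

def extract_verification_code(sms_text: str):
    """Return the first 4-8 digit run in the SMS body, or None if absent."""
    m = re.search(r"(?<!\d)(\d{4,8})(?!\d)", sms_text)
    return m.group(1) if m else None

def on_sms_gaze(sms_text: str, clipboard: dict):
    """On the triggering gaze operation, copy the code to a toy clipboard."""
    code = extract_verification_code(sms_text)
    if code is not None:
        clipboard["text"] = code
    return code
```

The lookaround assertions keep the match from starting or ending inside a longer digit run, so a phone number would not yield a spurious 8-digit "code".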
In a possible implementation of the first aspect, when the first electronic device is in the off-screen (always-on) display state, the first electronic device automatically lights up the first display screen in response to the user continuously gazing at the area where the first camera is located for longer than a third preset duration. On this basis, the electronic device can automatically light up the first display screen in response to the user's gaze, improving the efficiency of switching the screen state of the electronic device.
In a possible implementation of the first aspect, before the electronic device automatically lights up the display screen, the method further includes: when the first electronic device is in a black (fully off) screen state, the first electronic device automatically enters the off-screen display state in response to the user continuously gazing at the area where the first camera is located for longer than a fourth preset duration. On this basis, the electronic device can automatically enter the off-screen display state in response to the user's gaze, improving the efficiency of switching the screen state of the electronic device.
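The two implementations above form a small state machine over screen states, black screen to off-screen display to lit, each transition gated by a gaze-dwell threshold. A sketch with assumed state names and default thresholds (the patent only requires that a third and a fourth preset duration exist):

```python
def next_screen_state(state: str, gaze_s: float,
                      third_preset_s: float = 1.0,
                      fourth_preset_s: float = 0.5) -> str:
    """Advance the screen state given how long the user has been gazing
    continuously at the area where the front camera is located."""
    if state == "black" and gaze_s > fourth_preset_s:
        return "off_screen_display"      # black screen -> always-on display
    if state == "off_screen_display" and gaze_s > third_preset_s:
        return "lit"                     # always-on display -> screen lit
    return state                         # threshold not met: no change
```

Feeding the function successive dwell measurements walks the device from black through the off-screen display state to a lit screen.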
In a possible implementation of the first aspect, when the first camera detects the first preset eye movement operation, the method further includes: the first electronic device detects a first preset manual operation; in response to the first preset eye movement operation and the first preset manual operation, the first electronic device performs a first preset action on a first preset file included on the interface of the first preset application displayed on the first display screen, where the first preset manual operation targets the first preset file. On this basis, the electronic device can transfer files within one application or across applications in response to an eye movement operation and a manual operation, improving the flexibility and convenience of in-application and cross-application transfer and the user's operating experience.
In a possible implementation of the first aspect, before the first camera detects the first preset eye movement operation, the method further includes: the first electronic device detects a second preset eye movement operation through the first camera; in response to the second preset eye movement operation and the first preset eye movement operation, the first electronic device performs a first preset action on a first preset file included on the interface of the first preset application displayed on the first display screen, where the second preset eye movement operation targets the first preset file. On this basis, the electronic device can transfer files within one application or across applications in response to eye movement operations alone, improving the flexibility and convenience of in-application and cross-application transfer and the user's operating experience.
In a possible implementation of the first aspect, the first preset action includes either of the following: copying the first preset file, or cutting the first preset file. On this basis, the electronic device can copy or cut files within one application or across applications in response to an eye movement operation and/or a manual operation, improving the flexibility and convenience of in-application and cross-application transfer and the user's operating experience.
In a possible implementation manner of the first aspect, the first preset eye movement operation includes any one of the following: the area where part of the content on the interface of the continuous gazing first preset application is located is larger than the fifth preset duration, the area where the interface of the continuous gazing second preset application displayed on the first display screen is located is larger than the sixth preset duration, and the area where part of the content on the interface of the continuous gazing second preset application is located is larger than the seventh preset duration. Based on the above, the first preset eye movement operation can be aimed at the camera, the interface of the first preset application or the interface of the second preset application, and the electronic device can respond to the eye movement operation and/or the manual operation to realize file copying or file cutting between the same application or different applications, so that flexibility and convenience of cross-application transmission or transmission on the same application are improved, and operation experience of a user is improved.
In a possible implementation of the first aspect, the second preset eye movement operation includes either of the following: gazing continuously at the area where the interface of the first preset file is located for longer than an eighth preset duration; or gazing continuously at the area where the icon of the first preset file is located for longer than a ninth preset duration. On this basis, the electronic device can copy or cut files within one application or across applications in response to eye movement operations targeting the first preset file and the first preset application, improving the flexibility and convenience of in-application and cross-application transfer and the user's operating experience.
In a possible implementation of the first aspect, the first preset eye movement operation is gazing continuously at the area where the interface of the second preset application displayed on the first display screen is located for longer than the sixth preset duration, and the first electronic device performing the first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen includes: the first electronic device copies or cuts the first preset file to the interface of the second preset application. On this basis, the electronic device can copy or cut files across applications in response to an eye movement operation and/or a manual operation, improving the flexibility and convenience of cross-application transfer and the user's operating experience.
In a possible implementation of the first aspect, the first preset eye movement operation is gazing continuously at the area where part of the content on the interface of the first preset application is located for longer than the fifth preset duration, and the first electronic device performing the first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen includes: the first electronic device copies or cuts the first preset file to the area where that part of the content is located on the interface of the first preset application. On this basis, the electronic device can copy or cut files within the same application in response to an eye movement operation and/or a manual operation, improving the flexibility and convenience of in-application transfer and the user's operating experience.
In a possible implementation of the first aspect, the first preset application is an album application, the second preset application is a chat application, the first preset manual operation is pressing a preset picture on the interface of the album application, and the first preset eye movement operation is gazing continuously at the area where the interface of the chat application is located for longer than the sixth preset duration; the first electronic device performing the first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen includes: the first electronic device copies the preset picture to the interface of the chat application. On this basis, the electronic device can copy or cut files across applications in response to an eye movement operation and a manual operation, improving the flexibility and convenience of cross-application transfer and the user's operating experience.
In a possible implementation of the first aspect, the first preset application is an album application, the second preset application is a chat application, the first preset manual operation is pressing a preset picture on the interface of the album application, and the first preset eye movement operation is gazing continuously at a preset chat list on the interface of the chat application for longer than the sixth preset duration; the first electronic device performing the first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen includes: the first electronic device copies the preset picture to the preset chat list on the interface of the chat application. On this basis, the electronic device can precisely copy the first preset file in response to the eye movement operation and the manual operation.
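The album-to-chat example above combines a long-press (which selects the source picture) with a dwell gaze (which selects the destination, either the chat interface as a whole or a specific chat list). A sketch with assumed destination names:

```python
def long_press_plus_gaze_copy(pressed_file, gaze_target, gaze_s,
                              sixth_preset_s, destinations):
    """Copy the long-pressed file into the gazed-at destination (an app
    interface or a specific chat list) once the dwell exceeds the threshold.
    destinations: dict mapping a destination name to its received files."""
    if gaze_s > sixth_preset_s and gaze_target in destinations:
        destinations[gaze_target].append(pressed_file)
        return True
    return False                         # dwell too short or unknown target
```

Gazing at `"chat_list:alice"` past the threshold places the picture into that specific chat, while a dwell that is too short leaves every destination untouched.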
In a possible implementation of the first aspect, when the first camera detects the first preset eye movement operation, the method further includes: a second electronic device detects a second preset manual operation, where the second preset manual operation targets a second preset file included on the interface of a third preset application displayed on a second display screen, and the second electronic device includes the second display screen and a second camera; in response to the second preset manual operation and the first preset eye movement operation, the second electronic device performs a second preset action on the second preset file, a communication connection having been established between the second electronic device and the first electronic device.
Based on the method provided by the present application, the second electronic device and the first electronic device, between which a communication connection has been established, can perform the second preset action on the second preset file in response to an eye movement operation and/or a manual operation, improving the flexibility and convenience of cross-device file transfer and the user's operating experience.
In a possible implementation of the first aspect, before the first camera detects the first preset eye movement operation, the method further includes: the second electronic device detects a third preset eye movement operation through the second camera, where the second electronic device includes the second camera and the second display screen, and the third preset eye movement operation targets a second preset file included on the interface of a third preset application displayed on the second display screen; in response to the third preset eye movement operation and the first preset eye movement operation, the second electronic device performs a second preset action on the second preset file, a communication connection having been established between the second electronic device and the first electronic device. Based on the method provided by the present application, the connected second and first electronic devices can perform the second preset action on the second preset file in response to eye movement operations alone, improving the flexibility and convenience of cross-device file transfer and the user's operating experience.
In a possible implementation of the first aspect, the second preset action includes either of the following: copying or cutting the second preset file to the interface of a fourth preset application displayed on the first display screen; or copying or cutting the second preset file to the area where part of the content on the interface of the fourth preset application displayed on the first display screen is located. Based on the method provided by the present application, the connected second and first electronic devices can copy or cut the second preset file in response to eye movement operations, improving the flexibility and convenience of cross-device copy and cut operations and the user's operating experience.
In a possible implementation manner of the first aspect, the first preset eye movement operation includes any one of the following: the area where the interface of the fourth preset application is continuously watched is larger than the tenth preset time period, and the area where the part of the content on the interface of the fourth preset application is continuously watched is larger than the eleventh preset time period. Based on the above, the first preset eye movement operation can aim at the area where the interface of the fourth preset application is located or the area where part of the content on the interface of the fourth preset application is located, so that the flexibility of executing the target preset interaction action by the eye movement control is improved.
In a possible implementation of the first aspect, the third preset eye movement operation includes either of the following: gazing continuously at the area where the interface of the second preset file is located for longer than a twelfth preset duration; or gazing continuously at the area where the icon of the second preset file is located for longer than a thirteenth preset duration. On this basis, the third preset eye movement operation may target the area where the interface of the second preset file is located or the area where its icon is located, improving the flexibility of triggering the target preset interaction action through eye movement control.
In a possible implementation of the first aspect, before the second electronic device performs the second preset action on the second preset file, the method further includes: the second electronic device receives transfer confirmation information, where the transfer confirmation information is used to confirm that the second electronic device is to perform the second preset action on the second preset file. On this basis, the electronic device does not perform the second preset action on the second preset file immediately in response to the eye movement operation and/or the manual operation, but only after receiving the transfer confirmation information, which reduces mistaken transfers of the second preset file and protects the user's privacy.
In a possible implementation of the first aspect, the third preset application is a first desktop application, the fourth preset application is a second desktop application, the second preset manual operation is pressing a preset document on the interface of the first desktop application, and the first preset eye movement operation is gazing continuously at the area where the interface of the second desktop application is located for longer than the tenth preset duration; after the second electronic device receives the transfer confirmation information, the second electronic device performing the second preset action on the second preset file includes: the second electronic device copies the preset document to the interface of the second desktop application. On this basis, after receiving the manual operation, the eye movement operation, and the transfer confirmation information, the second electronic device can transfer the preset document across devices in response to these operations, improving the efficiency, flexibility, and convenience of cross-device transfer.
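The confirmation step above can be sketched as a staged transfer that refuses to execute until transfer confirmation information arrives (the class and method names are assumptions; a real implementation would run over the established communication connection between the two devices):

```python
class CrossDeviceTransfer:
    """Stage a cross-device transfer triggered by eye/manual operations;
    execute it only after transfer confirmation information is received."""
    def __init__(self, file, destination):
        self.file = file
        self.destination = destination   # e.g. interface of the fourth preset app
        self.confirmed = False
        self.delivered = False

    def confirm(self):
        """Record receipt of the transfer confirmation information."""
        self.confirmed = True

    def execute(self):
        """Perform the second preset action; refuse without confirmation."""
        if not self.confirmed:
            return False                 # guards against mistaken transfers
        self.delivered = True
        return True
```

Because `execute` is a no-op before `confirm`, an accidental dwell gaze alone cannot move the file off the device, which is the privacy property the implementation claims.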
In a second aspect, the present application provides an electronic device comprising: a wireless communication module, a memory, and one or more processors. The wireless communication module, memory, and processor are coupled. Wherein the memory is for storing computer program code, the computer program code comprising computer instructions. The computer instructions, when executed by a processor, cause an electronic device to perform a method as in the first aspect and any of its possible implementations.
In a third aspect, the present application provides a computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method as in the first aspect and any of its possible implementations.
In a fourth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the method as in the first aspect and any of its possible implementations. The computer may be the electronic device described above.
In a fifth aspect, the present application provides a chip system comprising one or more interface circuits and one or more processors. The interface circuits and the processors are interconnected by lines. The chip system is applied to an electronic device comprising a communication module and a memory. The interface circuit is configured to receive signals from the memory and transmit the signals to the processor, the signals including computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device performs the method as in the first aspect and any of its possible implementations.
It will be appreciated that, for the beneficial effects of the electronic device of the second aspect and any possible implementation manner thereof, the computer storage medium of the third aspect, the computer program product of the fourth aspect, and the chip system of the fifth aspect, reference may be made to the beneficial effects of the first aspect and any possible implementation manner thereof, and details are not described herein again.
Drawings
Fig. 1 is a schematic display diagram of a mobile phone according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a hardware architecture of an electronic device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a software architecture according to an embodiment of the present application;
Fig. 4 is a schematic diagram of another mobile phone according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 8 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 10 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 11 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 12 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 13 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 14 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 15 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 16 is a schematic diagram of a display of another mobile phone according to an embodiment of the present application;
Fig. 17 is a schematic diagram of file transfer between different electronic devices according to an embodiment of the present application;
Fig. 18 is a schematic diagram of file transfer between different electronic devices according to an embodiment of the present application;
Fig. 19 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
For clarity and conciseness in the description of the embodiments below and to facilitate easy understanding by those skilled in the art, a brief introduction to related terms, concepts or technologies is first presented.
Off-screen display (Always On Display, AOD): refers to displaying information (e.g., time, weather, temperature) in part of the screen area after the intelligent terminal enters standby, while the rest of the screen remains off.
In some embodiments, the user may touch the screen of the electronic device with a finger, e.g., control the electronic device by pressing the screen with a finger, sliding a finger over the screen, etc. However, this manual control method is not only limited in variety, but may also cause the electronic device to display information that the user does not want to browse.
In order to enrich the interactive control modes between the user and the device and improve the flexibility and convenience of using the device, the embodiment of the application provides a device control method. By way of example, the user may expand the information corresponding to a preset area into a larger area by continuously gazing at the preset area on the screen of the electronic device for several seconds, so that the mobile phone displays the information that the user really wants to browse; or, after a split screen is opened, the user may transfer a file through a combination of eye movement and manual operation; or, after the user opens a certain application, preset interactions between the user and the application can be assisted by eye movement control.
In a specific implementation, taking the electronic device being a mobile phone as an example, as shown in (a) in Fig. 1, when the user receives an incoming call from "XX", area 101 on the screen of the mobile phone displays thumbnail information about the incoming call from "XX", and the user can select "reject" or "accept". However, because area 101 is generally small, the distance between the "reject" button and the "accept" button in area 101 is relatively short, so the user may easily make a false touch. For example, when the user intends to select "reject" but the finger accidentally presses "accept", the occurrence of such a false touch event worsens the user experience. In one possible design, as shown in (b) in Fig. 1, after the user continuously gazes at area 101 for a period greater than (or equal to) a preset duration (e.g., 3 seconds), the incoming call thumbnail information is automatically expanded on the screen of the mobile phone; that is, the thumbnail information is no longer displayed, and instead detailed incoming call information is displayed in area 102, which is larger than area 101. Both the detailed information and the thumbnail information are incoming call information about "XX", but their specific content may differ. Therefore, by redrawing software controls according to eye movement, false touches by the user can be reduced, and the mobile phone can accurately display the information that the user really wants to browse.
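The gaze-and-expand behavior in this example can be sketched as a small dwell timer: a stream of gaze samples is hit-tested against the thumbnail region, and the expanded view is triggered once the gaze has stayed inside the region for the preset duration. The region coordinates, sample format, and class name below are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the dwell-to-expand logic described above: if the
# tracked gaze point stays inside the thumbnail region (area 101) for longer
# than a preset duration (e.g. 3 s), the UI switches to the expanded view
# (area 102). Leaving the region resets the dwell timer.

DWELL_THRESHOLD_S = 3.0  # preset duration from the example

class GazeExpander:
    def __init__(self, region, threshold_s=DWELL_THRESHOLD_S):
        self.region = region          # (x, y, width, height) of the thumbnail area
        self.threshold_s = threshold_s
        self.dwell_start = None       # timestamp when gaze entered the region
        self.expanded = False

    def _inside(self, x, y):
        rx, ry, rw, rh = self.region
        return rx <= x < rx + rw and ry <= y < ry + rh

    def on_gaze_sample(self, x, y, t):
        """Feed one gaze sample (screen coords + timestamp in seconds)."""
        if not self._inside(x, y):
            self.dwell_start = None   # gaze left the region: reset the timer
            return self.expanded
        if self.dwell_start is None:
            self.dwell_start = t
        if not self.expanded and t - self.dwell_start >= self.threshold_s:
            self.expanded = True      # expand thumbnail into the detail view
        return self.expanded
```

Resetting the timer when the gaze leaves the region is what distinguishes a deliberate dwell from a glance that merely passes through the area.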
The device control method provided by the embodiment of the application can be applied to electronic devices. The electronic device may be, for example, a mobile phone (including a folding screen mobile phone and a tablet mobile phone, which is not limited by the embodiment of the present application), a tablet computer, a desktop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, etc., and the embodiment of the present application does not limit the specific form and function of the electronic device.
Fig. 2 is a schematic structural diagram of an electronic device 200 according to an embodiment of the present application. As shown in fig. 2, the electronic device 200 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (universal serial bus, USB) interface 230, a charge management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a user identification module (subscriber identification module, SIM) card interface 295, and the like.
The sensor module 280 may include, among other things, a pressure sensor 280A, a gyroscope sensor 280B, a barometric sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and the like. The pressure sensor 280A may be configured to sense a pressure signal and convert the pressure signal into a usable output electrical signal according to a certain rule. Touch sensor 280K may be used to capture and record physical touches on electronic device 200.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic apparatus 200. In other embodiments, the electronic device 200 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units such as, for example: the processor 210 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The charge management module 240 is configured to receive a charge input from a charger. The power management module 241 is used for connecting the battery 242, and the charge management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and provides power to the processor 210, the internal memory 221, the external memory, the display 294, the camera 293, the wireless communication module 260, and the like.
The wireless communication function of the electronic device 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 200 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network.
The mobile communication module 250 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied on the electronic device 200. The mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 250 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 250 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speaker 270A, receiver 270B, etc.), or displays images or video through display screen 294.
The wireless communication module 260 may provide solutions for wireless communication including WLAN (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied on the electronic device 200. The wireless communication module 260 may be one or more devices that integrate at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 250 of electronic device 200 are coupled, and antenna 2 and wireless communication module 260 are coupled, such that electronic device 200 may communicate with a network and other devices via wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 200 implements display functions through a GPU, a display screen 294, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display 294 is used to display images, videos, and the like. The display 294 includes a display panel. In an embodiment of the present application, the display 294 may be used to display notification information on the electronic device 200, application interfaces of application programs, and other related interfaces, etc., such as a music playing interface of a music application program, incoming call information, etc. Optionally, the display 294 may also be used to display expanded incoming call information, expanded audio play information, etc. on the electronic device 200.
The display panel may employ a liquid crystal display (liquid crystal display, LCD), a light-emitting diode (light-emitting diode, LED), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like.
The electronic device 200 may implement a photographing function through an ISP, a camera 293, a video codec, a GPU, a display 294, an application processor, and the like. The ISP is used to process the data fed back by the camera 293. The camera 293 is used to capture still images or video. The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. Video codecs are used to compress or decompress digital video. The electronic device 200 may support one or more video codecs. In this way, the electronic device 200 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The camera 293 may include 1 to N cameras. Each camera includes a photosensitive element (e.g., a charge-coupled device (charge-coupled device, CCD) or complementary metal oxide semiconductor (complementary metal oxide semiconductor, CMOS) sensor), through which the electronic device 200 can sense light, collect photons, and convert the photons into an electrical charge.
For example, the electronic device may include 2 front cameras and 3 rear cameras. The front cameras may include a front main camera, a front sub camera, and a time of flight (time of flight, TOF) camera. The TOF camera may include a TX, which may be used to transmit an optical signal (infrared light or laser pulses), and an RX, which may be used to receive the reflected optical signal for imaging. The TX may be, for example, an infrared light transmitter. The RX may be, for example, a complementary metal oxide semiconductor (complementary metal oxide semiconductor, CMOS) or charge coupled device (charge coupled device, CCD) image sensor. Optionally, the front cameras may further include other types of cameras, for example, a depth camera module, a black and white camera module, a macro camera module, and the like, which is not limited by the present application. Optionally, in the embodiment of the present application, the positions of the cameras are not limited; for example, the front cameras may be distributed at a plurality of positions on the electronic device 200.
In the present application, the camera 293 may be used to track the movement of the user's eye. In some embodiments of the present application, the camera 293 may perform face recognition detection, that is, recognize whether there is a face currently, and determine whether the face position is frontal. Under the condition that the face is recognized to exist currently and the face position is the front, eyeball recognition detection can be performed, namely the eyeball position, the size of the lens and the positions of upper and lower eyelid are detected, and at least four points including the upper eyelid, the left edge of the lens, the right edge of the lens and the pupil are locked.
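The detection order described here, face recognition first and eyeball landmark locking only once a frontal face is found, can be sketched as a two-stage pipeline. The detector callables and landmark names below are stand-ins for the real camera and vision components and are assumptions, not the patent's implementation.

```python
# Minimal sketch of the two-stage detection described above: eyeball detection
# runs only after a frontal face has been recognized, and it locks at least
# four points (upper eyelid, left lens edge, right lens edge, pupil).

REQUIRED_LANDMARKS = ("upper_eyelid", "lens_left_edge", "lens_right_edge", "pupil")

def detect_eye_landmarks(frame, detect_face, detect_landmarks):
    """Return the locked landmark dict, or None if the face gate fails."""
    face = detect_face(frame)
    if face is None or not face.get("frontal"):
        return None                       # no face, or face not frontal: skip eye stage
    landmarks = detect_landmarks(frame, face)
    if all(k in landmarks for k in REQUIRED_LANDMARKS):
        return {k: landmarks[k] for k in REQUIRED_LANDMARKS}
    return None                           # could not lock all four points
```

Gating the (more expensive) eye stage on a frontal face keeps tracking from running on frames where gaze cannot be reliably estimated.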
The external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 200. The external memory card communicates with the processor 210 through an external memory interface 220 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card. Internal memory 221 may be used to store computer executable program code that includes instructions. The processor 210 executes various functional applications of the electronic device 200 and data processing by executing instructions stored in the internal memory 221. For example, in an embodiment of the present application, the processor 210 may include a memory program area and a memory data area by executing instructions stored in the internal memory 221. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 200 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 200 may implement audio functions through an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an ear-headphone interface 270D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 270 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. Speaker 270A may be used to convert audio electrical signals into sound signals. Receiver 270B may be used to convert the audio electrical signal into a sound signal. Microphone 270C may be used to convert sound signals into electrical signals. The earphone interface 270D is for connecting a wired earphone.
Keys 290 include a power on key, a volume key, etc. The motor 291 may generate a vibration alert. The indicator 292 may be an indicator light that may be used to indicate a state of charge, a change in charge, etc. The SIM card interface 295 is for interfacing with a SIM card.
The methods in the following embodiments may be implemented in the electronic device 200 having the above-described hardware structure.
The software system of the electronic device 200 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. Illustratively, the electronic device 200 may run an Android system with a layered architecture. Optionally, the Android platform may be loaded with the MagicOS system. Optionally, the Android platform may also be integrated with Google Mobile Services (GMS) modules.
The embodiment of the application takes the Android system with a layered architecture as an example to illustrate the software architecture of the electronic device 200. The layered architecture divides the software into several layers, each with distinct roles and divisions of labor. The layers communicate via interfaces. In some embodiments, the Android system may include an application layer, an application framework layer, an Android runtime (Android Runtime) and system libraries, a hardware abstraction layer (hardware abstraction layer, HAL), and a kernel layer. It should be noted that although the embodiment of the application is described using the Android system as an example, the scheme of the present application can also be implemented in other operating systems, as long as the functions implemented by the respective functional modules are similar to those of the embodiment of the present application.
The application layer may include a series of application packages, among other things.
As shown in fig. 3, the application package may include applications for cameras, gallery, calendar, phone calls, map, navigation, wireless local area network (wireless local area networks, WLAN), bluetooth, music, video, short message, lock screen application, setup application, etc. Of course, the application layer may also include other application packages, such as a payment application, a shopping application, a banking application, a chat application, or a financial application, and the application is not limited thereto.
The camera application has shooting and video recording functions, and the electronic equipment can perform shooting or video recording in response to the operation of opening the camera application by a user. It will be appreciated that the photographing and video recording functions of the camera application may also be invoked by other applications. For example, the screen locking application may call a photographing function of the camera application, and perform portrait identification or face unlocking according to a photographed image.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. For example, an activity manager, a window manager, a content provider, a view system, a resource manager, a notification manager, a Camera Service (Camera Service), etc., to which embodiments of the present application are not limited in any way.
The Camera Service can be started in the starting-up stage of the electronic equipment and can be used for transmitting and storing relevant information of the Camera.
In the embodiment of the application, the Camera Service may also include an eye management module. The eye management module is used to identify the movement of the eyeball and control the electronic device 200 according to that movement. For example, when the user lights up the display screen 294, the eye management module may be triggered to track the movement of the eyes. When the eye management module detects that the eyes have gazed at a preset area for more than a preset duration (for example, 3 seconds), the information corresponding to the preset area (for example, incoming call information) can be expanded into a larger area.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D image engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
OpenGL ES is used to implement three-dimensional graphics drawing, image rendering, compositing, and layer processing, among others.
SGL is the drawing engine for 2D drawing.
Android Runtime includes a core library and virtual machines. Android Runtime is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in virtual machines. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The HAL layer is an encapsulation of the Linux kernel drivers; it provides interfaces upward and shields the implementation details of the low-level hardware.
The HAL layer may include Wi-Fi HAL, audio (Audio) HAL, camera HAL, and the like.
Wherein the Camera HAL is the core software framework of the Camera. A variety of image processing algorithms may be included in the camera HAL. For example, the camera HAL may include an image processing algorithm such as a face recognition algorithm, a skin care algorithm, an age recognition algorithm, a light supplementing algorithm, a face thinning algorithm, an eye brightening algorithm, an acne removing algorithm, a wrinkle removing algorithm, a filter algorithm, a makeup algorithm, a hairstyle changing algorithm, a mosaic algorithm, a contrast algorithm, a saturation algorithm, a sharpening algorithm, a background blurring algorithm, and a high dynamic range image algorithm.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises display drive, camera drive, audio drive, sensor drive and the like. The Camera driving is a driving layer of the Camera device and is mainly responsible for interaction with hardware.
In the embodiment of the application, the camera driving can comprise driving corresponding to a front main camera, driving corresponding to a wide-angle camera, driving corresponding to a far-focus camera, driving corresponding to a rear camera and the like. The driving corresponding to the front-facing camera may include driving corresponding to the front-facing main camera and driving corresponding to the TOF camera.
The hardware layer includes a display, a camera, etc. The cameras can comprise a front main camera, a wide-angle camera, a far-focus camera, a rear camera and the like. The front-facing camera can comprise a front-facing main camera, a TOF camera and the like.
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. In the description of the application, unless otherwise indicated, "at least one" means one or more, and "a plurality" means two or more. In addition, to facilitate a clear description of the technical solutions of the embodiments of the present application, the words "first", "second", etc. are used to distinguish between identical or similar items having substantially the same function and effect. It will be appreciated by those skilled in the art that the words "first", "second", and the like do not limit the quantity or order of execution, and that items described as "first" and "second" are not necessarily different.
For easy understanding, the device control method provided by the embodiment of the application is specifically described below with reference to the accompanying drawings.
In the present application, an eyeball tracking technique may be employed to track the gaze area of the eyeball. The eyeball tracking technique can track the gaze area according to feature changes of the eyeball and its periphery, track the gaze area according to changes in the iris angle, or actively project light beams such as infrared rays onto the iris and extract features to track the gaze area. In this way, the user can control the electronic device, for example operate the screen interface, without touching the screen or keys, which can improve the control efficiency of the electronic device, simplify the user's operations, and enrich the interaction between the user and the electronic device.
In some embodiments, the electronic device may further calibrate the user's eye movement control according to face information, eyeball information, and the like pre-recorded by the user, so as to obtain a more accurate tracking result. In one possible design, highlights may be displayed at multiple different locations on the screen of the electronic device, prompting the user to move the eyes to follow the location of each highlight. Calibration is completed when the electronic device recognizes that the movement of the user's eyes matches all the highlights displayed on the screen.
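The calibration flow above can be sketched as follows. This is a minimal illustration only: the highlight positions, the matching tolerance, and the data format are assumptions for the example, not values stated in the application.

```python
import math

# Hypothetical calibration sketch: the device shows a highlight at several
# screen positions and checks that the measured gaze point followed each one.
HIGHLIGHT_POSITIONS = [(100, 200), (540, 200), (100, 1800), (540, 1800), (320, 1000)]
TOLERANCE_PX = 60  # assumed matching radius (pixels), not from the source

def gaze_matches(gaze, target, tolerance=TOLERANCE_PX):
    """True if a measured gaze point lies within the tolerance of a highlight."""
    return math.dist(gaze, target) <= tolerance

def calibrate(measured_gazes):
    """Calibration succeeds only if the eye followed every displayed highlight."""
    if len(measured_gazes) != len(HIGHLIGHT_POSITIONS):
        return False
    return all(gaze_matches(g, t) for g, t in zip(measured_gazes, HIGHLIGHT_POSITIONS))
```

A real implementation would also fit a per-user mapping from camera features to screen coordinates; this sketch only captures the pass/fail check described in the text.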
It should be noted that, in the present application, tracking the movement of the eyeball may be performed by one camera or may be performed by a plurality of cameras, where the pixel resolutions of the plurality of cameras may be different. Preferably, a depth camera may be employed to track eye movement to improve accuracy in identifying eye movement.
It should be noted that, in the present application, the tracking of eyeball movement may be performed when the screen of the electronic device is on or when it is off. It will be appreciated that the always-on display (AOD) function allows an electronic device to display useful thumbnail information. Some information, such as a clock, calendar, notifications, and conversations, can be displayed in minimized form even when the mobile phone screen is off. The user can view this information in the most convenient way at any time and place, and can tap a notification to quickly access the application corresponding to it. Optionally, for an electronic device with the AOD function, the user may configure the information displayed in the AOD state; for example, it may include music playing information, video playing information, incoming call notification information, short message notification information, and the like. Optionally, the user may manually turn on the AOD function, or wake the AOD state by gazing at the electronic device while its screen is off. Optionally, an AOD sensor of the electronic device may continually detect whether the user is gazing at the electronic device.
It should be noted that, the camera can continuously track the movement of the eyeballs of the user when the screen is on; or when the user uses the preset application, the camera tracks the movement of the eyeballs of the user; or when the user performs some operation (for example, presses a preset area), the camera tracks the movement of the eyeball of the user; or when certain information appears on the interface of the screen of the electronic equipment, the camera tracks the movement of the eyeballs of the user, and the like.
It should be noted that, in the embodiments of the present application, "the first electronic device" and "the second electronic device" may include one or more electronic devices. The following describes a specific application scenario of the device control method provided by the embodiment of the present application, taking an electronic device as an example of a mobile phone. The specific application scenarios provided by the embodiment of the application can comprise the following three types: (1) Changing the display mode of first preset information on a display screen according to eye movement control; (2) Completing target preset interaction actions between a user and a preset application in an auxiliary mode according to eye movement control; (3) And completing the preset operation on the same mobile phone or among different electronic devices according to the eye movement control assistance.
An application scenario (1) (i.e., changing the display mode of the first preset information on the display screen according to the eye movement control) will be described below.
It will be appreciated that, in the related art, the display manner of various information on the display of the mobile phone is generally fixed. As a result, when the user wants to perform a certain operation (for example, tapping a "reject" button displayed on the display to reject an incoming call), it may be inconvenient for the user to do so, causing the user a certain degree of trouble. Typically, the user works around this by adjusting the hand grip, assisting with the other hand, turning the head, rotating the phone, and so on. Obviously, this makes using the phone more cumbersome and less flexible, which in turn degrades the user experience.
In order to simplify the operation of a user and improve the convenience of the user for using the mobile phone, in the embodiment of the application, the mobile phone can detect the preset eye movement operation through the camera; and responding to the preset eye movement operation, the mobile phone can change the display mode of the first preset information on the display screen. The mobile phone can comprise a display screen and a camera.
In one possible design of the application, the mobile phone can change the display mode of the first preset information on the display screen by redrawing the control corresponding to the first preset information on the interface. Changing the display mode of the first preset information on the display screen includes one or more of the following: changing the size of the interface where the first preset information is located, changing the shape of that interface (for example, from a rectangle to a circle), moving the interface, rotating the interface, changing the frame style of the interface, changing the filling style of the interface, and the like. The interface may include a lock screen interface of the mobile phone, a minus 1 screen interface of the mobile phone, an interface of an application on the mobile phone, and the like. The first preset information may include any one of the following: incoming call notification information, short message notification information, video call information, voice call information, alarm clock reminder information, application update information, application notification information, and other information that can be displayed on a mobile phone. Optionally, the mobile phone may rotate the interface where the first preset information is located to be parallel to the line between the user's eyes, to facilitate viewing.
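The geometry changes listed above (resize, move, rotate) can be sketched as pure transforms on a redrawn control's bounding box. The `InterfaceBox` type and field names below are illustrative assumptions, not part of the application:

```python
from dataclasses import dataclass, replace

# Illustrative sketch: the control hosting the first preset information is
# redrawn with new geometry when the preset eye movement operation is detected.
@dataclass(frozen=True)
class InterfaceBox:
    x: int
    y: int
    width: int
    height: int
    rotation_deg: float = 0.0
    shape: str = "rectangle"  # e.g. changed to "circle" per the text

def enlarge(box, factor):
    """Change the size of the interface where the information is located."""
    return replace(box, width=int(box.width * factor), height=int(box.height * factor))

def move(box, dx, dy):
    """Move the interface where the information is located."""
    return replace(box, x=box.x + dx, y=box.y + dy)

def rotate(box, deg):
    """Rotate the interface where the information is located."""
    return replace(box, rotation_deg=(box.rotation_deg + deg) % 360)
```

Because the transforms return new values rather than mutating state, they can be composed freely, e.g. `move(enlarge(box, 2), 0, 300)` for "enlarge and move downward".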
In some embodiments of the present application, the first preset information may be replaced with second preset information in response to a preset eye movement operation, wherein the second preset information includes part or all of the first preset information.
In one possible design of the present application, in response to the preset eye movement operation, the first preset information may be replaced with the second preset information while changing the size of the interface where the first preset information is located and moving the interface where the first preset information is located.
The preset eye movement operation may include, but is not limited to, the following manners and various reasonable combinations thereof: continuously gazing at the area where the first preset information is located for longer than (or no shorter than) a first preset duration (for example, 3 seconds); continuously gazing at the area where the camera is located for longer than (or no shorter than) a second preset duration (for example, 3 seconds); performing a preset manual operation while gazing at the area where the camera is located; performing a preset manual operation while gazing at the area where the first sub-preset information is located; blinking a preset number of consecutive times (for example, 1 or 2 times), for example blinking twice within 1 second or twice within 2 seconds; and rotating the eyeball in any direction.
It should be noted that the preset eye movement operation may further include one or more of the following: performing a preset limb operation while gazing at the area where the camera is located, making a preset expression while gazing at the area where the camera is located, issuing a voice instruction while gazing at the area where the camera is located, and the like. Optionally, gazing at the area where the camera is located may include: when the mobile phone has one camera, gazing at the area where that camera is located; when the mobile phone has a plurality of cameras, gazing at the area where any one or more of the cameras are located.
The preset manual operation may include any one of the following operations: box-selecting on the display screen, sliding on the display screen, clicking on the display screen, and pressing on the display screen. The preset manual operation may be performed by a user or by an automatic clicking device. The user may operate on the mobile phone screen with a finger or a stylus; the automatic clicking device is usually a stylus or a touch head whose path can be set by a program, so as to realize sliding along different paths or pressing (or clicking) at different positions. It can be understood that sliding on the display screen may mean that a finger or stylus slides in any direction from any point on the interface of the mobile phone, or that a plurality of fingers or styluses slide in any of a plurality of directions from any of a plurality of points on the interface.
The first sub-preset information may be information included in an interface where the first preset information is located. The first sub preset information may be first preset information; or the first sub-preset information may be a part of the content in the first preset information.
In other embodiments of the present application, in response to the preset eye movement operation, the first preset information may be replaced with second preset information after preset processing, where the second preset information may include part or all of the first preset information, and the second preset information includes preset sub-information. Specifically, when the second preset information includes preset sub-information, preset processing may be performed on the preset sub-information to obtain the second preset information after the preset processing. The preset sub-information may include, but is not limited to: language information different from the default language of the user's mobile phone, verification code information, link information, and the like. The preset processing may include, but is not limited to: copying, translating, opening in a browser, and the like. It should be noted that there is a correspondence between the preset processing and the preset sub-information; for example, when the preset sub-information is language information different from the default language of the user's mobile phone, the preset processing is translation, and when the preset sub-information is verification code information, the preset processing is copying.
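The correspondence between preset sub-information and preset processing can be sketched as a lookup table. The key and action names below are illustrative assumptions:

```python
# Sketch of the correspondence between preset sub-information and preset
# processing described above; key/action names are assumptions.
PRESET_PROCESSING = {
    "foreign_language_text": "translate",   # text differing from the default language
    "verification_code": "copy",            # e.g. copy the code to the clipboard
    "link": "open_in_browser",
}

def preset_processing_for(sub_info_type):
    """Return the preset processing bound to a type of preset sub-information,
    or None when no processing is defined for it."""
    return PRESET_PROCESSING.get(sub_info_type)
```

Keeping the correspondence in a table rather than branching code makes it easy to add further sub-information types later.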
For example, if the first preset information is incoming call notification information including the words "XX incoming call", the words "from Beijing", a "reject" button, and an "accept" button, then the second preset information may include: the words "XX incoming call", the "reject" button, and a contact picture. If the first preset information is short message notification information including the words "XX" and the words "your express has been put in …", then the second preset information may include: the words "XX" and the words "Hello, your express delivery has been placed at the entrance of The first Community; please take it away in time." The second preset information after the preset processing may include: the words "XX" and the words "Hello, your express delivery has been placed at the entrance of The first Community (first cell); please take it away in time." The form and content of the second preset information after the preset processing are not limited in the present application; for example, if the second preset information includes the words "XX Express here", the second preset information after the preset processing may include the words "XX Express (Express) here", or the words "XX express here".
In a specific implementation manner of the present application, as shown in fig. 4 (a), when a user receives an incoming call, a first preset message, that is, an incoming call notification message, may be displayed on an area 402 on a screen 401 of the mobile phone, where the incoming call notification message may include: the words "XX incoming call", a "reject" button 403, and an "accept" button 404. In the related art, the mobile phone may receive a click operation of the "reject" button 403 by the user and reject the incoming call from XX in response to the operation; or the mobile phone may receive a click operation of the "accept" button 404 by the user and answer the incoming call from XX in response to the operation. However, because the area 402 is generally small, the distance between the "reject" button 403 and the "accept" button 404 is relatively short, which makes a false touch more likely; for example, the user wants to reject the call from "XX" but, while attempting to press the "reject" button 403 with a finger, presses the "accept" button 404 instead, thereby answering the call from "XX". It will be appreciated that difficulty in making the desired selection can cause significant trouble for the user. In an embodiment of the present application, the user may use eye movement to assist manual control of the electronic device, and the marker 405 may be used to indicate the location of the user's gaze.
In one possible design of the present application, in response to the user continuously gazing at the area 402 where the incoming call notification information is located for more than a first preset duration (e.g., 3 seconds), the interface where the incoming call notification information is located may be enlarged and moved downward. Meanwhile, the incoming call notification information may be replaced with incoming call expansion information.
As shown in (b) of fig. 4, the area 406 may display incoming call expansion information, which may include: the words "XX incoming call", the words "from Beijing", a "text message" button, a "reminder" button, a "reject" button, an "accept" button, and the like. It can be appreciated that the area 406 is larger than the area 402 and the distance between the "reject" button and the "accept" button is greater, so the user can easily touch the "reject" button without adjusting the holding gesture or assisting with the other hand, the probability of a false touch is smaller, the mobile phone can accurately display the information the user really wants to browse, and the convenience of user operation is improved. It will be appreciated that, after the interface where the incoming call notification information is located is changed, the user may gaze at the area where the incoming call expansion information is located (for example, area 406); optionally, in response to the user's gaze moving away from the area 406, the display manner of the incoming call notification information as in (a) of fig. 4 may be restored.
In another possible design, in response to the user continuously gazing at the area where the camera is located for more than a second preset period of time (e.g., 3 seconds), the interface where the incoming notification information is located may be enlarged to full screen; meanwhile, the incoming call notification information may be replaced with another incoming call expansion information. Alternatively, in response to the user continuously gazing at the area 402 for more than a first preset duration, the area 402 of the first preset information may be zoomed in to full screen; meanwhile, the incoming call notification information may be replaced with still another incoming call expansion information. As shown in fig. 4 (c), the incoming call expansion displayed on the area 407 may include: the words "XX incoming call", the words "from Beijing", "text message" button ", the" reminder "button", the "reject" button ", the" accept "button", the head portrait of XX ", etc. It should be understood that 3 seconds is an example, and the preset duration in the embodiment of the present application may be other positive numbers.
In another specific implementation, taking the first preset information as short message notification information as an example, as shown in (a) of fig. 5, when the user receives short message notification information, it may be displayed on an area 502 on a screen 501 of the mobile phone, where the short message notification information includes: the words "XX express", the words "Hello, this is XX express; your parcel numbered 00 has been placed at …", a "reply" button, a "mark read" button, and the like. In one possible design of the present application, in response to the user gazing at the area 502 while a finger slides down on the screen, the interface where the short message notification information is located may be enlarged and moved downward. Meanwhile, the short message notification information may be replaced with short message expansion information. As shown in (b) of fig. 5, the area 503 may display the short message expansion information, which may include the words "Hello, this is XX express; your parcel numbered 00 has been placed in the security room at the gate of the community; please collect it in time. If there is any problem, you can contact me at any time." In another possible design, in response to the user continuously gazing at the area 502 while a finger slides down on the screen, as shown in (c) of fig. 5, the interface where the short message notification information is located may be enlarged to full screen, and the short message expansion information may be displayed on the area 504.
In still another specific implementation, as shown in (a) of fig. 6, when a user plays music with a mobile phone, music playing information may be displayed on an area 602 on a display 601 of the mobile phone, and the music playing information may include: the words "Song 1", the words "singer 1", a music playing progress bar, a "previous" button, a "next" button, a "pause" button, and the like. In one possible design of the present application, in response to the user blinking twice within 1 second while gazing at the area 602, the interface where the music playing information is located may be enlarged, rotated, and moved; meanwhile, the music playing information may be replaced with music expansion information, which may include the "previous" button, the "next" button, and the "pause" button. As shown in (b) of fig. 6, the music expansion information may be displayed on the area 603. In some embodiments of the present application, the area 603 may be parallel to the line connecting the eyes of the user, so that the mobile phone displays information in a way that is easy for the user to view and operate.
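One way to keep the rotated interface parallel to the line connecting the user's eyes is to compute the angle of that line from the two eye positions reported by the front camera. This is an assumption about how such a rotation could be derived, not the application's stated algorithm:

```python
import math

# Sketch (an assumption, not the patent's stated method): the rotation angle
# that makes an interface parallel to the line joining the user's eyes.
def eye_line_angle_deg(left_eye, right_eye):
    """Angle of the eye-to-eye line relative to the screen's x-axis, in degrees.
    Eye positions are (x, y) pixel coordinates in the camera/screen frame."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))
```

The result could be passed to a rotation transform on the redrawn control, so that tilting the head tilts the displayed information with it.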
An application scenario (2) (i.e., completing a target preset interaction between a user and a preset application according to an eye movement assistance control) is described below.
In the related art, a user may complete a target preset interaction with a first preset application in a manual touch manner when using the first preset application. For example, when using a short video application, a user typically slides down on the short video application interface with a finger to switch to the next short video; when the downloading progress of the application reaches 100%, the user clicks the downloading progress of the application by using a finger to open the application; the user presses the region where the information (e.g., the verification code) is located with a finger to copy the information; the user can click a closing button on an interface corresponding to the alarm clock application by a finger to close the alarm clock reminding; when playing music, the user can click the button of 'next' with a finger to switch to the next song, and can click the button of 'pause' with a finger to pause playing music; when a user reads an electronic book, the user can use a sliding finger to turn pages or click a button of 'next page' with the finger to turn pages; when receiving the incoming call notification information, the user can click a reject button with a finger to reject answering the incoming call; when receiving the incoming call notification information, the user can click an accept button with a finger to answer the incoming call.
It can be appreciated that such a manual-touch manner of interacting with an application is relatively inflexible, and when it is inconvenient for the user to perform a manual touch, it is difficult for the user to complete the interaction with the application. Based on this, it can be appreciated that the target preset interactions may include, but are not limited to: opening an application, closing an alarm clock reminder, switching from the current piece of music played in a music application to the next, answering an incoming call, rejecting an incoming call, and the like.
In the embodiment of the application, a mobile phone a (also referred to as a first electronic device) can detect a first preset eye movement operation through a first camera, wherein the mobile phone a can include the first camera and a first display screen; responding to a first preset eye movement operation, and executing target preset interaction action by the mobile phone A, wherein the first preset eye movement operation aims at a first preset application or preset information related to the first preset application on the mobile phone A.
In one possible design of the present application, when the first preset application corresponds to N preset interactions, one preset eye movement operation corresponds to M preset interactions, where M and N are both positive integers, and M is less than or equal to N; in case that M is greater than 1, M preset interactions may be determined as target preset interactions.
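The design above, in which one preset eye movement operation maps to M of an application's N preset interactions and all M become targets when M > 1, can be sketched as a simple mapping. The operation and action names below are illustrative assumptions:

```python
# Sketch of the M-of-N mapping described above; names are assumptions.
# Each preset eye movement operation maps to M of the application's N
# preset interactions (M <= N).
INTERACTIONS_BY_OPERATION = {
    "gaze_music_app_3s": ["next_song", "pause_playback"],  # M = 2
    "gaze_next_button_3s": ["next_song"],                  # M = 1
}

def target_interactions(operation):
    """All M mapped interactions become target preset interactions; when the
    operation is unknown, no interaction is triggered."""
    return INTERACTIONS_BY_OPERATION.get(operation, [])
```

This matches the music-application example that follows: one gaze at the application area triggers both "switch to the next song" and "pause playing".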
It should be noted that, when the target preset interactive action includes a plurality of interactive actions, the manner in which the user can complete each target preset interactive action through the eye movement assistance may be the same or different. Taking the first preset application as an example of the music application, the target preset interaction actions corresponding to the music application may include: switch to the next song and pause playing the song. The user may switch to the next song by continuously looking at the display area of the "next" button for more than a first preset duration (e.g., 3 seconds), and pause playing music by continuously looking at the display area of the "pause" button for more than the first preset duration; or the user may switch to the next song and pause playing the music by looking at the display area of the music application for more than a first preset period of time.
In one possible design of the application, the mobile phone can assist in completing the target preset interaction action between the user and the first preset application according to the eye movement control, thereby improving the convenience of interaction between the user and the first preset application and improving the use experience of the user. It should be noted that the first preset application may be displayed on the lock screen interface, the minus 1 screen interface, the main interface, etc. of the mobile phone a. The first preset eye movement operation may be directed to a first preset application on the mobile phone a or preset information related to the first preset application. For example, the first preset eye movement operation may be directed to a short message application on the mobile phone a, or directed to a short message notification message on the mobile phone a related to the short message application. For another example, the first preset eye movement operation may be for an alarm clock application on the mobile phone a, or for alarm clock notification information on the mobile phone a related to the alarm clock application.
For example, when the preset information related to the first preset application may be download progress information, the target preset interaction may include: when the downloading progress information indicates that the downloading of the first preset application is completed, opening the first preset application; alternatively, when the preset information related to the first preset application is the incoming call notification information, the target preset interaction may include: answering the incoming call and refusing to answer the incoming call; or, the preset information related to the first preset application is short message notification information, and the target preset interaction action comprises: opening the short message notification information and copying verification code information included in the short message notification information; or when the preset information related to the first preset application is alarm clock notification information, the target preset interaction action may include: and closing the alarm clock to remind.
For example, when the first preset application is a video application, the target preset interactions may include one or more of the following: switching to the next video of the current video played in the video application, and suspending the current video played in the video application; changing the playing speed of the current video played in the video application; alternatively, when the first preset application is a music application, the target preset interactions may include one or more of the following: switching to the next piece of music to the current piece of music played in the music application, and pausing the current piece of music played in the music application.
It should be noted that the first preset eye movement operation may include, but is not limited to, the following manners and various reasonable combinations thereof: continuously gazing at the target area for longer than (or no shorter than) a first preset duration (for example, 3 seconds); continuously gazing at the area where the first camera is located for longer than (or no shorter than) a second preset duration (for example, 3 seconds); performing a preset manual operation on the first display screen while gazing at the target area; blinking a preset number of consecutive times (for example, 1 or 2 times), for example blinking twice within 1 second or twice within 2 seconds; and rotating the eyeball in any direction.
It should be noted that the first preset eye movement operation may further include one or more of the following: and executing preset limb operation while watching the area where the first camera is located, making preset expressions while watching the area where the first camera is located, sending out voice instructions while watching the area where the first camera is located, and the like.
Optionally, gazing at the area where the first camera is located may include: when the first camera of the mobile phone includes one camera, gazing at the area where that camera is located; when the first camera of the mobile phone includes a plurality of cameras, gazing at the area where any one or more of the cameras are located.
The preset manual operation may include any one of the following operations: box-selecting on the first display screen, sliding on the first display screen, clicking on the first display screen, and pressing on the first display screen. The preset manual operation may be performed by a user or by an automatic clicking device; the user may operate on the first display screen with a finger or a stylus, and the automatic clicking device is usually a stylus or a touch head whose path can be set by a program to realize sliding along different paths. It can be understood that sliding on the first display screen may mean that a finger or stylus slides in any direction from any point on the interface of the mobile phone, or that a plurality of fingers or styluses slide in any of a plurality of directions from any of a plurality of points on the interface.
The target area may be an area where an interface of the first preset application is located; or the target area may be an area where a part of the content on the interface of the first preset application is located; or the target area may be an area in which the preset information related to the first preset application is located.
In a specific implementation of the present application, as shown in fig. 7 (a), when the user receives the sms notification information containing the verification code, the verification code 702 may be displayed on the area 701, and in response to the user continuously watching the area 701 for more than a preset period of time, the mobile phone a may copy the verification code 702 to the clipboard. Alternatively, as shown in fig. 7 (b), text 703 (i.e., "copied") may be displayed on the first display of handset a to alert the user that the verification code 702 has been copied to the clipboard.
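The verification-code example above can be sketched end to end: extract the code from the short message text and place it on a clipboard once the gaze threshold is met. The regular expression and the in-memory clipboard are assumptions for illustration:

```python
import re

# Illustrative sketch of the fig. 7 flow; the regex and the simulated
# clipboard are assumptions, not the application's implementation.
def extract_verification_code(sms_text):
    """Pull a 4-8 digit code out of a short message, or None if absent."""
    match = re.search(r"\b(\d{4,8})\b", sms_text)
    return match.group(1) if match else None

clipboard = []  # stands in for the system clipboard

def on_gaze_threshold_met(sms_text):
    """Called when the user has gazed at the code area past the threshold;
    returns the prompt text shown to the user, or None if no code was found."""
    code = extract_verification_code(sms_text)
    if code is not None:
        clipboard.append(code)
        return "Copied"  # cf. the "copied" prompt (text 703)
    return None
```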
In another specific implementation, as shown in fig. 8 (a), in response to a user triggering the operations of downloading and installing "game 1", the mobile phone a may start downloading the application "game 1". In the process of downloading and installing "game 1" in the mobile phone a, the mobile phone a may switch "game 1" to background downloading, as shown in (b) in fig. 8, and may display the download progress information 802 of "game 1" in the area 801; alternatively, as shown in fig. 8 (c), the download progress information 802 of "game 1" is displayed in the form of a hover sphere on the area 803. Illustratively, when the download progress information 802 indicates that the "game 1" download is complete, i.e., the download progress reaches 100%, the cell phone a may turn on "game 1" in response to the user continuously gazing at the area 801 for more than the first preset duration.
In yet another specific implementation, as shown in (a) of fig. 9, in response to a click operation of the application icon 901 of the "video" application by the user, the mobile phone a may open the "video" application; or in response to the user continuously watching the application icon 901 of the "video" application for more than a certain period of time, the mobile phone a may open the "video" application, as shown in (b) in fig. 9, the mobile phone a may display a certain interface in the "video" application, for example, the interface 902 may include a certain video created by "Zhang san" (also referred to as a current video), and in response to the user continuously watching the area 903 of the first camera for more than a second preset period of time, the mobile phone a may switch to a next video of the current video played in the "video" application. As shown in fig. 9 (c), the mobile phone a may display an interface 904, and the interface 904 may include a screen in the process of switching the current video to the next video. When the handover is completed, as shown in (d) of fig. 9, the mobile phone a may display an interface 905, and the interface 905 may include a video next to the current video, i.e., a video created by "li four".
In yet another specific implementation, as shown in fig. 10 (a), a "close once" button 1002, an alarm icon, the text "alarm", the text "8:05", the text "alarm about to ring", and so on may be displayed on an area 1001 of the lock screen interface of the mobile phone A. Illustratively, in response to the user continuously gazing at the "close once" button 1002 for more than a first preset duration, the mobile phone A may turn off the current alarm alert. Optionally, as shown in fig. 10 (b), the mobile phone A may display text 1003 (i.e. "8:05 alarm has been turned off") to alert the user that the 8:05 alarm has been turned off once.
It should be noted that, for a mobile phone with an always-on display (AOD) function, when the user turns on the low-power always-on display function, the mobile phone may be woken up by gazing at the first camera in the AOD state. It will be appreciated that the AOD function allows the mobile phone to display useful thumbnail information, such as a clock, calendar, notifications, conversations, and music, in a minimized form even when the first display screen is turned off. The user can view a notification at any time and in the most convenient way, and can click on the notification to quickly access the application corresponding to the notification.
Based on the above, when the mobile phone A is in the screen-off display state, the mobile phone A can automatically light the first display screen in response to the fact that the duration of the user continuously watching the area where the first camera is located is longer than the third preset duration. Before the electronic device automatically lights up the display screen, the method further comprises: when the mobile phone A is in the black screen state, the mobile phone A can automatically enter the off-screen display state in response to the fact that the duration of the user continuously watching the area where the first camera is located is longer than the fourth preset duration.
In some embodiments, when the user plays audio or video, the screen of the mobile phone may be in a closed state (also referred to as a black screen state) as shown in (a) of fig. 11, or in an always-on display state as shown in (b) of fig. 11, where audio playing information may be displayed in the area 1101 of the screen of the mobile phone. When the screen of the mobile phone is in the black screen state, in response to the user continuously gazing at the area where the first camera of the mobile phone is located for more than a fourth preset duration, the mobile phone may enter the AOD state.
When the electronic device is in the AOD state, in response to the user continuously gazing at the area 1102 where the first camera of the electronic device is located for more than a third preset duration, the mobile phone may automatically light up the first display screen (as shown in (c) of fig. 11), and the first camera of the mobile phone A may be started to track eyeball movement; at this time, audio playing information may be displayed in the area 1103 on the first display screen. In response to the user continuously gazing at the area where the audio playing information is located for more than the first preset duration while the mobile phone is in the bright screen state, the mobile phone may open the audio application corresponding to the audio playing information. As shown in (d) of fig. 11, in response to the user continuously gazing at the area of the audio playing information (1103 in (c) of fig. 11) for more than the first preset duration, an interface 1104 of the audio application corresponding to the audio playing information may be displayed on the first display screen.
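The black screen → AOD → bright screen chain described above is essentially a small state machine driven by continuous-gaze durations. The sketch below is a hypothetical illustration of that chain; the state names and default thresholds are assumptions made for the example.

```python
class ScreenStateMachine:
    """Screen states driven by continuous gaze on the camera area:
    "black" (screen off) -> "aod" (always-on display) -> "lit" (screen on)."""

    def __init__(self, third_preset=3.0, fourth_preset=3.0):
        self.state = "black"
        self.third_preset = third_preset    # AOD -> lit gaze threshold (s)
        self.fourth_preset = fourth_preset  # black -> AOD gaze threshold (s)

    def on_camera_gaze(self, duration):
        """Apply one continuous-gaze event on the camera region and
        return the resulting screen state."""
        if self.state == "black" and duration > self.fourth_preset:
            self.state = "aod"
        elif self.state == "aod" and duration > self.third_preset:
            self.state = "lit"
        return self.state
```

A gaze shorter than the relevant threshold leaves the state unchanged, matching the "more than the preset duration" wording in the text.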
According to the application, the manual control can be assisted by the eye movement control to complete the preset interaction action between the user and the application, so that the convenience and interestingness of the interaction between the user and the application can be improved, and the interaction efficiency between the user and the application is improved.
The following describes the application scenario (3) (i.e. completing the preset actions of the preset file on the same mobile phone or between different electronic devices according to the assistance of eye movement control).
In the embodiment of the application, when a first camera included in a mobile phone A (also referred to as a first electronic device) detects a first preset eye movement operation, the mobile phone A may detect a first preset manual operation; in response to the first preset eye movement operation and the first preset manual operation, the mobile phone A may execute a first preset action on a first preset file included on an interface of a first preset application displayed on a first display screen included in the mobile phone A, where the first preset manual operation is directed at the first preset file.
Alternatively, before the first camera detects the first preset eye movement operation, the mobile phone A (also referred to as a first electronic device) may detect a second preset eye movement operation through the first camera included in the mobile phone A; in response to the second preset eye movement operation and the first preset eye movement operation, the mobile phone A may execute a first preset action on a first preset file included on an interface of a first preset application displayed on a first display screen included in the mobile phone A, where the second preset eye movement operation is directed at the first preset file.
The first preset action may include any one of the following actions: copying the first preset file and cutting the first preset file.
It should be noted that the first preset eye movement operation may include, but is not limited to, the following ways and various reasonable combinations of the following ways: the area where the part of the content is located on the interface of the first preset application is continuously gazed at is greater than (or greater than or equal to) the fifth preset time period (for example, 3 seconds), the area where the part of the content is located on the interface of the second preset application is continuously gazed at is greater than the sixth preset time period (for example, 3 seconds), the area where the part of the content is located on the interface of the second preset application is continuously gazed at is greater than the seventh preset time period (for example, 3 seconds), the area where the part of the content is located on the interface of the second preset application is continuously gazed at is greater than a certain preset time period (for example, 3 seconds), the preset manual operation is performed on the first display screen while the part of the content is located on the interface of the first preset application is gazed at, the preset manual operation is performed on the first display screen while the part of the content is located on the interface of the second preset application is gazed at the same time, for example, the preset manual operation is performed at a frequency of 1, 2 times (for example, 2 times of winking and 2 times) or 2 times of winking and optionally performed.
It should be noted that the first preset eye movement operation may further include one or more of the following: and executing preset limb operation while watching the area where the first camera is located, making preset expressions while watching the area where the first camera is located, sending out voice instructions while watching the area where the first camera is located, and the like.
Optionally, looking at the area where the first camera is located may include: when the first camera of the mobile phone comprises a camera, looking at the area where the camera is positioned; when the first camera of the mobile phone comprises a plurality of cameras, any one or any plurality of areas where the cameras are located are watched.
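The preset eye movement operations above combine continuous gaze with blink patterns. The sketch below shows one hedged way a chronological stream of gaze/blink samples might be classified; the sample format, thresholds, and operation labels are all assumptions made for illustration, not the patented detection method.

```python
def detect_eye_operations(samples, dwell_threshold=3.0, blink_window=1.0):
    """Classify a stream of eye-tracking samples into preset operations.

    samples: chronological (timestamp_s, gaze_on_target, blinked) tuples.
    Returns the set of detected operations, e.g. a continuous gaze
    exceeding the dwell threshold, or two blinks within the blink window.
    """
    detected = set()
    dwell_start = None
    blink_times = []
    for t, on_target, blinked in samples:
        if on_target:
            if dwell_start is None:
                dwell_start = t               # gaze entered the target area
            if t - dwell_start > dwell_threshold:
                detected.add("continuous_gaze")
            if blinked:
                blink_times.append(t)
                if (len(blink_times) >= 2
                        and blink_times[-1] - blink_times[-2] <= blink_window):
                    detected.add("double_blink")
        else:
            dwell_start = None                # gaze left: dwell resets
    return detected
```

A real detector would also debounce brief gaze dropouts; that is omitted here for brevity.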
The preset manual operation or the first preset manual operation may include any one of the following operations: box-selecting on the first display screen, sliding on the first display screen, clicking on the first display screen, and pressing on the first display screen. The execution subject of the first preset manual operation may be a user or an automatic clicking device: the user may operate on the first display screen with a finger or a stylus, while the automatic clicking device is usually a stylus or a touch head, whose path may be set by a program to realize sliding along different paths, or pressing (or clicking) at different positions. It can be understood that sliding on the first display screen may mean that a finger or a stylus slides in any direction from any point on the interface of the mobile phone, or that a plurality of fingers or styluses slide in any plurality of directions from any plurality of points on the interface of the mobile phone.
The second preset eye movement operation may include any one of the following: continuously gazing at the area where the interface of the first preset file is located for more than (or more than or equal to; the following embodiments of the present application take "more than" as an example) an eighth preset duration (e.g., 3 seconds); continuously gazing at the area where the icon of the first preset file is located for more than a ninth preset duration.
In another possible design, the preset transmission confirmation information needs to be received before the mobile phone a performs the first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen. That is, after the mobile phone a receives the preset transmission confirmation information, the mobile phone a may execute the first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen. The preset transmission confirmation information may be information input by the user and used for confirming transmission of the first preset file, and the preset transmission confirmation information may include, but is not limited to, the following: voice information, expression information, limb motion information, touch information, and the like.
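The design above defers the preset action until the preset transmission confirmation information arrives. A minimal sketch of that gating follows; the class and method names are hypothetical, and the confirmation kinds are taken from the list in the text (voice, expression, limb motion, touch).

```python
class ConfirmedAction:
    """Defers a preset action until transmission-confirmation information
    (voice, expression, limb motion, or touch input) has been received."""

    CONFIRM_KINDS = {"voice", "expression", "limb_motion", "touch"}

    def __init__(self, action):
        self._action = action
        self._pending = False

    def request(self):
        """Eye-movement + manual operation detected: arm the action."""
        self._pending = True

    def confirm(self, kind):
        """Deliver confirmation info; runs the action only if it is armed
        and the confirmation kind is recognized. Returns the action's
        result, or None if nothing was executed."""
        if self._pending and kind in self.CONFIRM_KINDS:
            self._pending = False
            return self._action()
        return None
```

Confirmation received before any request, or of an unrecognized kind, is ignored, mirroring "after the mobile phone A receives the preset transmission confirmation information, the mobile phone A may execute the first preset action".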
In the present application, the first preset action for the first preset file on the same mobile phone may include the following two modes: mode 1) a first preset action of the mobile phone A on a first preset file between different applications on the mobile phone A; mode 2) a first preset action of the mobile phone a on the first preset file is performed on the same application of the mobile phone a. The first preset file may include, but is not limited to: pictures, mail, documents, audio, etc.
Mode 1) is described below (i.e., between different applications on the mobile phone a, the mobile phone a performs a first preset operation on a first preset file).
In one possible design of the present application, if the first preset eye movement operation is continuously gazing at the area where the interface of the second preset application displayed on the first display screen is located for more than the sixth preset duration, the first electronic device executing a first preset action on a first preset file included on the interface of the first preset application displayed on the first display screen includes: the first electronic device copies or cuts the first preset file to the interface of the second preset application. It should be noted that the present application does not limit the number of applications included in the first preset application, the number of applications included in the second preset application, the form and size of the display area of the interface of the first preset application, the form and size of the display area of the interface of the second preset application, and the like. The first preset application or the second preset application may include a mail application, a chat application, an album application, an image processing application, a memo application, a social application, and the like.
In one possible design of the present application, if the first preset application is an album application, the second preset application is a chat application, the first preset manual operation is pressing a preset picture on an interface of the album application, and the first preset eye movement operation is continuously gazing at the area where the interface of the chat application is located for more than a sixth preset duration, then the first electronic device executing a first preset action on a first preset file included on an interface of the first preset application displayed on the first display screen includes: the first electronic device copies the preset picture to the interface of the chat application.
It will be appreciated that the mobile phone A may respectively display different applications in different areas of the first display screen in response to a split-screen operation by the user. Illustratively, as shown in fig. 12 (a), the left side of the first display screen may display an interface 1201, with a chat application included on the interface 1201; the right side of the first display screen displays an interface 1202, with an album application included on the interface 1202. As shown in fig. 13 (a), an interface 1301 is displayed on the upper left side of the mobile phone screen, with a chat application included on the interface 1301; an interface 1302 is displayed on the lower left side of the first display screen, with a memo application included on the interface 1302; the right side of the mobile phone screen displays an interface 1303, with an album application included on the interface 1303.
In a specific implementation manner, taking a first preset application as an album application and a second preset application as a chat application, the first preset file is a preset picture on an interface of the album application as an example. As shown in (a) of fig. 12, in response to the user pressing the picture 1203 in the album application on the mobile phone a and the user continuously watching the area where the interface of the chat application is located being longer than the sixth preset time period, as shown in (b) of fig. 12, the mobile phone a may copy the picture 1203 onto the interface of the chat application. Optionally, after successful transmission, the text 1204 shown in (b) of fig. 12, i.e. "sent", may be displayed on the area where the interface of the album application is located; the user-sent message 1205 may be displayed on the area where the chat application's interface is located. If the transmission fails, text 1206 shown in fig. 12 (c), i.e. "transmission failed", may be displayed on the area where the interface of the album application is located.
In another specific implementation, as shown in (a) of fig. 13, in response to the user pressing the picture 1304 on the interface of the album application on the mobile phone A and the user continuously gazing at the area where the interface of the memo application is located for more than a sixth preset duration, as shown in (b) of fig. 13, the mobile phone A may copy the picture 1304 onto the interface of the memo application. Optionally, after successful transmission, the text 1305, i.e. "sent", may be displayed on the area where the interface of the album application is located, as shown in (b) of fig. 13; the transferred picture 1306 may be displayed on the area where the interface of the memo application is located, as shown in (b) of fig. 13. If the transmission fails, the text "transmission failed" may be displayed on the area where the interface of the album application is located.
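In both examples, the copy target is resolved from two inputs: the pressed file and the application region on which the gaze dwelt past the sixth preset duration. A hypothetical sketch of that resolution step follows; the function name, region encoding, and return convention are assumptions made for illustration.

```python
def resolve_copy_target(pressed_file, gaze_region, gaze_duration,
                        app_regions, sixth_preset=3.0):
    """Decide which application's interface receives the pressed file.

    pressed_file: identifier of the long-pressed file, or None.
    gaze_region: id of the screen region the user is gazing at.
    app_regions: mapping of app name -> its on-screen region id.
    Returns (file, target_app) when both the press and the dwell
    condition hold, else None (no action is taken).
    """
    if pressed_file is None or gaze_duration <= sixth_preset:
        return None
    for app, region in app_regions.items():
        if region == gaze_region:
            return (pressed_file, app)
    return None   # gaze rested on no known application area
```

Either condition failing alone (no press, or a dwell at or below the threshold) yields no transfer, matching the combined press-and-gaze trigger in the text.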
In another possible design of the present application, the first preset application is an album application, the second preset application is a chat application, the first preset manual operation is pressing a preset picture on an interface of the album application, and the first preset eye movement operation is continuously gazing at the area where a preset chat list is located on the interface of the chat application for more than a sixth preset duration; the first electronic device executing a first preset action on a first preset file included on an interface of the first preset application displayed on the first display screen includes: the first electronic device copies the preset picture to the preset chat list on the interface of the chat application. In yet another specific implementation, taking the first preset application as an album application and the second preset application as a chat application, as shown in (a) of fig. 14, an interface 1401 is displayed on the left side of the first display screen, where the interface 1401 includes the chat application, and the area where the interface of the chat application is located includes an area 1403 where a chat list with "friend 1" is located, an area 1404 where a chat list with "friend 4" is located, and so on. The right side of the first display screen may display an interface 1402, with the album application included on the interface 1402. In response to a pressing operation by the user on the picture 1405 on the interface of the album application and the user continuously gazing at the area 1404 where the preset chat list is located on the interface of the chat application for more than a sixth preset duration, as shown in (b) of fig. 14, a popup window 1406 may be displayed on the area where the interface of the album application is located, and the popup window 1406 may include the text "send to friend 4?", a "confirm" button, a "cancel" button, and the like.
In response to the user clicking the "confirm" button included in the popup window 1406, or in response to the user continuously gazing at the "confirm" button included in the popup window 1406 for more than a certain preset duration, the mobile phone A may send the picture 1405 to "friend 4". As shown in fig. 14 (c), after successful transmission, text 1407, i.e. "sent", may be displayed on the area where the album application interface is located; text 1408, i.e. "[img1]", may be displayed on the area where the user's chat list with "friend 4" is located, where "img1" is the name of the picture 1405. If the transmission fails, the text "transmission failed" may be displayed on the area where the interface of the album application is located.
According to the method provided by the application, the electronic equipment can respond to the eye movement operation and/or the manual operation to realize file transmission among different applications, so that the flexibility and convenience of cross-application transmission are improved, and the operation experience of a user is improved.
Mode 2) is described below (i.e., within the same application on the mobile phone A, the mobile phone A performs a first preset action on the first preset file).
In one possible design of the present application, if the first preset eye movement operation is continuously gazing at the area where part of the content on the interface of the first preset application is located for more than a fifth preset duration, the first electronic device executing a first preset action on a first preset file included on the interface of the first preset application displayed on the first display screen includes: the first electronic device copies or cuts the first preset file to the area where that part of the content is located on the interface of the first preset application. It should be noted that the present application does not limit the number of applications included in the first preset application, the form and size of the display area of the interface of the first preset application, and the like. The first preset application may include a mail application, a chat application, an album application, an image processing application, a memo application, a social application, and the like.
In a specific implementation of the present application, taking the first preset application as a desktop application as an example, as shown in (a) of fig. 15, in response to the user pressing a file 1501 on the interface of the desktop application on the mobile phone A and the user continuously gazing at an area 1502 for more than a fifth preset duration, as shown in (b) of fig. 15, the mobile phone A may cut the file 1501 to an area 1503.
In another specific implementation, taking the first preset application as a desktop application, as shown in (a) of fig. 16, the folding-screen mobile phone may include an A screen and a B screen. In response to a pressing operation by the user on a file 1601 included on the interface of the desktop application on the B screen of the mobile phone A, and the user continuously gazing at an area 1602 on the A screen of the mobile phone A for more than a fifth preset duration, as shown in (b) of fig. 16, the mobile phone may copy the file 1601 to the area 1603.
According to the application, manual control can be assisted by eye movement control to realize the operation of the first preset file on the same application, so that the convenience and flexibility of file transmission are improved, and the trouble caused by inconvenient operation to a user is reduced.
The following describes a process of completing a second preset action of the second preset file between different electronic devices according to the eye movement control assistance.
In the embodiment of the application, when a first camera included in a mobile phone A (also referred to as a first electronic device) detects a first preset eye movement operation, a mobile phone B (also referred to as a second electronic device) may detect a second preset manual operation, where the second preset manual operation is directed at a second preset file included on an interface of a third preset application displayed on a second display screen included in the mobile phone B; in response to the second preset manual operation and the first preset eye movement operation, the second electronic device may execute a second preset action on the second preset file; a communication connection is established between the second electronic device and the first electronic device.
Alternatively, before the first camera included in the mobile phone A (also referred to as a first electronic device) detects the first preset eye movement operation, the mobile phone B (also referred to as a second electronic device) may detect a third preset eye movement operation through a second camera included in the mobile phone B, where the third preset eye movement operation is directed at a second preset file included on an interface of a third preset application displayed on a second display screen included in the mobile phone B; in response to the third preset eye movement operation and the first preset eye movement operation, the second electronic device may execute a second preset action on the second preset file; a communication connection is established between the second electronic device and the first electronic device.
It should be noted that a communication connection needs to be established in advance between the mobile phone A and the mobile phone B. Establishing a communication connection may include, but is not limited to: a direct connection through wireless fidelity (Wi-Fi), logging in to the same account, and the like.
The third preset eye movement operation may include any one of the following: continuously gazing at the area where the interface of the second preset file is located for more than a twelfth preset duration; continuously gazing at the area where the icon of the second preset file is located for more than a thirteenth preset duration.
The second preset action may include any one of the following: copying or cutting the second preset file to an interface of a fourth preset application displayed on the first display screen, and copying or cutting the second preset file to an area where part of the content on the interface of the fourth preset application displayed on the first display screen is located. In one possible design, the interface of the fourth preset application may be an interface of the first electronic device for screen projection; or the interface of the fourth preset application may be an extended interface of the first electronic device.
The first preset eye movement operation includes any one of the following: continuously gazing at the area where the interface of the fourth preset application is located for more than a tenth preset duration; continuously gazing at the area where part of the content on the interface of the fourth preset application is located for more than an eleventh preset duration.
It should be noted that the first preset eye movement operation may include, but is not limited to, the following ways and various reasonable combinations of the following ways: the area where the interface of the fourth preset application is continuously gazed at is greater than (or greater than or equal to), the following embodiments take greater than as an example) the twelfth preset time period (for example, 3 seconds), the area where the part of the content on the interface of the fourth preset application is continuously gazed at is greater than the thirteenth preset time period (for example, 3 seconds), the preset manual operation is performed on the first display screen while the area where the first camera is gazed at, the preset manual operation is performed on the first display screen while the area where the interface of the fourth preset application is gazed at, the preset number of times (for example, 1 time, 2 times) is continuously blinked, for example, the eyeball is rotated in any direction by adopting the frequency of 1 second 2 times or 2 seconds 2 times.
It should be noted that the first preset eye movement operation may further include one or more of the following: and executing preset limb operation while watching the area where the first camera is located, making preset expressions while watching the area where the first camera is located, sending out voice instructions while watching the area where the first camera is located, and the like.
Optionally, looking at the area where the first camera is located may include: when the first camera of the mobile phone comprises a camera, looking at the area where the camera is positioned; when the first camera of the mobile phone comprises a plurality of cameras, any one or any plurality of areas where the cameras are located are watched.
The preset manual operation or the second preset manual operation may include any one of the following operations: box-selecting on the first display screen, sliding on the first display screen, clicking on the first display screen, and pressing on the first display screen. The execution subject of the preset manual operation or the second preset manual operation may be a user or an automatic clicking device: the user may operate on the first display screen with a finger or a stylus, while the automatic clicking device is usually a stylus or a touch head, whose path may be set by a program to realize sliding along different paths, or pressing (or clicking) at different positions. It can be understood that sliding on the first display screen may mean that a finger or a stylus slides in any direction from any point on the interface of the mobile phone, or that a plurality of fingers or styluses slide in any plurality of directions from any plurality of points on the interface of the mobile phone.
Before the second electronic device performs the second preset action on the second preset file, the method further includes: the second electronic device receives transmission confirmation information, where the transmission confirmation information is used to confirm that the second electronic device executes the second preset action on the second preset file. That is, the second electronic device does not immediately perform the second preset action upon receiving the first preset eye movement operation and the second preset manual operation; rather, it may wait to receive the transmission confirmation information, and only after receiving the transmission confirmation information does the second electronic device perform the second preset action on the second preset file. The transmission confirmation information may be, but is not limited to, information input by the user for confirming transmission of the second preset file to the first electronic device, and may include, but is not limited to: voice information, expression information, limb motion information, touch information, and the like.
In one possible design of the present application, the third preset application is a first desktop application, the fourth preset application is a second desktop application, the second preset manual operation is pressing a preset document on an interface of the first desktop application, and the first preset eye movement operation is continuously gazing at the area where the interface of the second desktop application is located for more than a tenth preset duration; after the second electronic device receives the transmission confirmation information, the second electronic device performing a second preset action on the second preset file includes: the second electronic device copies the preset document to the interface of the second desktop application.
In a specific implementation manner, taking the third preset application as the first desktop application and the fourth preset application as the second desktop application, as shown in (a) in fig. 17, the electronic device 1701 and the electronic device 1702 establish a Wi-Fi direct connection relationship in advance. In response to a pressing operation by the user on a file 1703 (xx.doc) on the interface of the first desktop application on the electronic device 1701 and the user continuously gazing at an area 1704 where the interface of the second desktop application on the electronic device 1702 is located for more than a tenth preset duration, as shown in (b) in fig. 17, a popup window 1705 may be displayed on the interface of the first desktop application on the electronic device 1701, where the popup window 1705 includes the text "send to electronic device 1?", an "ok" button, a "cancel" button, and the like. In response to the user clicking the "ok" button included in the popup window 1705, as shown in fig. 17 (c), the file 1703 may be transferred (copied) to the interface of the second desktop application on the electronic device 1702. Optionally, after successful transmission, text 1706, i.e. "sent", may be displayed on the interface of the first desktop application on the electronic device 1701; the transmitted file 1707 may be displayed on the interface of the second desktop application on the electronic device 1702. If the transmission fails, the text "transmission failed" may be displayed on the interface of the first desktop application on the electronic device 1701.
In a specific implementation, taking the third preset application as the first video application and the fourth preset application as the second video application, as shown in (a) of fig. 18, the user is watching the video 1802 through the second video application on the electronic device 1801; as shown in (b) of fig. 18, the user is watching another video through the first video application on the electronic device 1803, and the electronic device 1801 and the electronic device 1803 are logged in with the same account of the user. In response to the user pressing the area 1804 of the preset video displayed on the interface of the first video application on the electronic device 1803 and the user continuously gazing at the area where the interface of the second video application on the electronic device 1801 is located for more than 5 seconds, the preset video may be transmitted (copied) to the interface of the second video application on the electronic device 1801. Optionally, after successful transmission, the text "sent" may be displayed on the interface of the first video application on the electronic device 1803, and as shown in fig. 18 (c), the preset video 1805 may be displayed on the interface of the second video application on the electronic device 1801. If the transmission fails, the text "transmission failed" may be displayed on the interface of the first video application on the electronic device 1803.
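The cross-device examples above require an existing communication connection (a Wi-Fi direct link or the same logged-in account) plus a gaze past the preset duration before the file moves. The sketch below illustrates those preconditions only; the function name, connection encoding, and return strings are assumptions made for the example.

```python
def cross_device_transfer(src_device, dst_device, file, connections,
                          gaze_duration, tenth_preset=3.0):
    """Sketch of the cross-device flow: the file pressed on the source
    device is copied to the destination device only when the two devices
    share a pre-established communication connection and the gaze on the
    destination interface exceeded the preset duration.

    connections: set of frozenset({device_a, device_b}) pairs that have a
    Wi-Fi direct link or share the same account.
    """
    if frozenset((src_device, dst_device)) not in connections:
        return "no connection"            # connection must exist in advance
    if gaze_duration <= tenth_preset:
        return "not triggered"            # gaze did not pass the threshold
    return f"sent {file} to {dst_device}"
```

A production flow would additionally wait for the transmission confirmation information described earlier before completing the copy.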
According to the present application, eye movement control can assist manual control to achieve file transmission between different devices, which improves the flexibility and convenience of cross-device transmission and the user's operation experience.
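The press-and-gaze trigger described in the embodiments above reduces to a simple condition: a cross-device transfer is offered only while the source file is being long-pressed and the gaze has dwelt continuously on the target device's interface for longer than a preset duration. A minimal sketch of that dwell-detection logic is shown below in Python; the region identifiers, the `GazeSample` structure, and the 5-second threshold (standing in for the "tenth preset duration") are illustrative assumptions, not part of the patent's disclosed implementation.

```python
from dataclasses import dataclass

# Hypothetical stand-in for the "tenth preset duration" (5 s, as in fig. 18).
DWELL_THRESHOLD_S = 5.0

@dataclass
class GazeSample:
    t: float        # timestamp of the sample, in seconds
    region: str     # identifier of the screen region the gaze point falls in

def dwell_time(samples, region):
    """Length in seconds of the longest uninterrupted gaze dwell on `region`."""
    best = 0.0
    run_start = None
    for s in samples:
        if s.region == region:
            if run_start is None:
                run_start = s.t        # a new continuous run begins
            best = max(best, s.t - run_start)
        else:
            run_start = None           # gaze left the region; the run is broken
    return best

def should_offer_transfer(press_active, samples, target_region,
                          threshold=DWELL_THRESHOLD_S):
    """Offer the cross-device transfer only while the source file is
    long-pressed AND the gaze has dwelt on the target device's interface
    for at least the preset duration."""
    return press_active and dwell_time(samples, target_region) >= threshold
```

In the embodiment of fig. 17, `should_offer_transfer` returning true would correspond to displaying the confirmation pop-up window 1705; the actual copy would proceed only after the user confirms with the "OK" button.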
Some embodiments of the application provide an electronic device that may include: a touch screen, a memory, and one or more processors. The touch screen, memory, and processor are coupled. The memory is for storing computer program code, the computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform the various functions or steps performed by the electronic device in the method embodiments described above. The structure of the electronic device may refer to the structure of the electronic device 200 shown in fig. 2.
Embodiments of the present application also provide a system on a chip (SoC) including at least one processor 1901 and at least one interface circuit 1902, as shown in fig. 19. The processor 1901 and the interface circuit 1902 may be interconnected by wires. For example, the interface circuit 1902 may be used to receive signals from other devices (e.g., a memory of an electronic apparatus), or to send signals to other devices (e.g., the processor 1901 or a touch screen of an electronic apparatus). In particular, the interface circuit 1902 may read instructions stored in the memory and send the instructions to the processor 1901. The instructions, when executed by the processor 1901, may cause the electronic device to perform the steps of the embodiments described above. Of course, the system on a chip may also include other discrete devices, which is not specifically limited in the embodiments of the present application.
Embodiments of the present application also provide a computer readable storage medium, where the computer readable storage medium includes computer instructions, which when executed on an electronic device, cause the electronic device to perform the functions or steps performed by the electronic device in the method embodiments described above.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division into the functional modules described above is illustrated. In practical applications, the functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (31)

1. A device control method, applied to an electronic device, the electronic device including a display screen and a camera, the method comprising:
the electronic device detects a preset eye movement operation through the camera;
in response to the preset eye movement operation, the electronic device changes a display mode of first preset information on the display screen;
the changing the display mode of the first preset information on the display screen comprises one or more of the following modes: changing the size of the interface where the first preset information is located, changing the shape of the interface where the first preset information is located, moving the interface where the first preset information is located, and rotating the interface where the first preset information is located.
2. The method of claim 1, further comprising:
replacing the first preset information with second preset information in response to the preset eye movement operation, wherein the second preset information comprises part or all of the first preset information.
3. The method according to any one of claims 1-2, wherein,
the preset eye movement operation comprises one or more of the following: continuously gazing at an area where first sub-preset information is located for longer than a first preset duration; continuously gazing at an area where the camera is located for longer than a second preset duration; performing a preset manual operation while continuously gazing at the area where the camera is located; and performing a preset manual operation while continuously gazing at the area where the first sub-preset information is located;
the first sub-preset information is information included in an interface where the first preset information is located.
4. The method of claim 3, wherein
the first sub-preset information is the first preset information; or
the first sub-preset information is part of the content in the first preset information.
5. The method according to any one of claims 1 to 4, wherein,
the first preset information comprises any one of the following information: incoming call notification information, short message notification information, alarm clock reminding information, video call information and voice call information.
6. A device control method, characterized by comprising:
a first electronic device detects a first preset eye movement operation through a first camera, wherein the first electronic device comprises the first camera and a first display screen;
in response to the first preset eye movement operation, the first electronic device performs a target preset interaction action, wherein the first preset eye movement operation is directed at a first preset application on the first electronic device or at preset information related to the first preset application.
7. The method of claim 6, wherein
the first preset eye movement operation comprises one or more of the following operations: continuously gazing at a target area for longer than a first preset duration; continuously gazing at an area where the first camera is located for longer than a second preset duration; performing a preset manual operation on the first display screen while continuously gazing at the area where the first camera is located; and performing a preset manual operation on the first display screen while continuously gazing at the target area.
8. The method of claim 7, wherein
the target area is an area where the interface of the first preset application is located; or
the target area is an area where part of the content on the interface of the first preset application is located; or
the target area is an area where the preset information related to the first preset application is located.
9. The method according to any one of claims 6 to 8, wherein,
the first preset application is a video application, and the target preset interaction action comprises one or more of the following: switching to the next video after the current video played in the video application, pausing the current video played in the video application, and changing the playing speed of the current video played in the video application; or
the first preset application is a music application, and the target preset interaction action comprises one or more of the following: switching to the next piece of music after the current music played in the music application, and pausing the current music played in the music application.
10. The method according to any one of claims 6 to 8, wherein,
the preset information related to the first preset application is downloading progress information, and the target preset interaction action comprises: when the downloading progress information indicates that the downloading of the first preset application is completed, opening the first preset application; or,
the preset information related to the first preset application is incoming call notification information, and the target preset interaction action comprises: answering the incoming call and refusing to answer the incoming call; or,
the preset information related to the first preset application is short message notification information, and the target preset interaction action comprises: opening the short message notification information, and copying verification code information included in the short message notification information.
11. The method according to any one of claims 6-10, further comprising:
when the first electronic device is in a screen-off display state, in response to the user continuously gazing at the area where the first camera is located for longer than a third preset duration, the first electronic device automatically lights up the first display screen.
12. The method of claim 11, further comprising, before the first electronic device automatically lights up the first display screen:
when the first electronic device is in a black screen state, in response to the user continuously gazing at the area where the first camera is located for longer than a fourth preset duration, the first electronic device automatically enters the screen-off display state.
13. The method of claim 6, wherein upon the first camera detecting the first preset eye movement operation, the method further comprises:
the first electronic device detects a first preset manual operation;
in response to the first preset eye movement operation and the first preset manual operation, the first electronic device performs a first preset action on a first preset file included on the interface of the first preset application displayed on the first display screen, wherein the first preset manual operation is directed at the first preset file.
14. The method of claim 6, wherein prior to the first camera detecting the first preset eye movement operation, the method further comprises:
the first electronic device detects a second preset eye movement operation through the first camera;
in response to the second preset eye movement operation and the first preset eye movement operation, the first electronic device performs a first preset action on a first preset file included on the interface of the first preset application displayed on the first display screen;
wherein the second preset eye movement operation is directed to the first preset file.
15. The method according to any one of claims 13 to 14, wherein,
the first preset action comprises any one of the following actions: copying the first preset file and cutting the first preset file.
16. The method according to any one of claims 13-15, wherein,
the first preset eye movement operation includes any one of the following: the area where part of the content on the interface of the first preset application is continuously watched is larger than the fifth preset duration, the area where the interface of the second preset application displayed on the first display screen is continuously watched is larger than the sixth preset duration, and the area where part of the content on the interface of the second preset application is continuously watched is larger than the seventh preset duration.
17. The method according to any one of claims 14-16, wherein,
the second preset eye movement operation comprises any one of the following: continuously gazing at an area where the interface of the first preset file is located for longer than an eighth preset duration; and continuously gazing at an area where an icon of the first preset file is located for longer than a ninth preset duration.
18. The method of any of claims 13-17, wherein the first preset eye movement operation is continuously gazing at the area where the interface of the second preset application displayed on the first display screen is located for longer than the sixth preset duration, and the first electronic device performing a first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen comprises:
the first electronic device copying or cutting the first preset file to the interface of the second preset application.
19. The method of any of claims 13-17, wherein the first preset eye movement operation is continuously gazing at the area where part of the content on the interface of the first preset application is located for longer than the fifth preset duration, and the first electronic device performing a first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen comprises:
the first electronic device copying or cutting the first preset file to the area where the part of the content on the interface of the first preset application is located.
20. The method of any of claims 13 or 15-18, wherein the first preset application is an album application, the second preset application is a chat application, the first preset manual operation is pressing a preset picture on the interface of the album application, the first preset eye movement operation is continuously gazing at the area where the interface of the chat application is located for longer than the sixth preset duration, and the first electronic device performing a first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen comprises:
the first electronic device copying the preset picture to the interface of the chat application.
21. The method of any of claims 13 or 15-18, wherein the first preset application is an album application, the second preset application is a chat application, the first preset manual operation is pressing a preset picture on the interface of the album application, the first preset eye movement operation is continuously gazing at a preset chat list on the interface of the chat application for longer than the sixth preset duration, and the first electronic device performing a first preset action on the first preset file included on the interface of the first preset application displayed on the first display screen comprises:
the first electronic device copying the preset picture to the preset chat list on the interface of the chat application.
22. The method of claim 6, wherein upon the first camera detecting the first preset eye movement operation, the method further comprises:
a second electronic device detects a second preset manual operation, wherein the second preset manual operation is directed at a second preset file included on an interface of a third preset application displayed on a second display screen, and the second electronic device comprises the second display screen and a second camera;
in response to the second preset manual operation and the first preset eye movement operation, the second electronic device performs a second preset action on the second preset file;
wherein a communication connection is established between the second electronic device and the first electronic device.
23. The method of claim 6, wherein prior to the first camera detecting the first preset eye movement operation, the method further comprises:
a second electronic device detects a third preset eye movement operation through a second camera, wherein the second electronic device comprises the second camera and a second display screen, and the third preset eye movement operation is directed at a second preset file included on an interface of a third preset application displayed on the second display screen;
in response to the third preset eye movement operation and the first preset eye movement operation, the second electronic device performs a second preset action on the second preset file;
wherein a communication connection is established between the second electronic device and the first electronic device.
24. The method according to any one of claims 22-23, wherein,
the second preset action comprises any one of the following: copying or cutting the second preset file to an interface of a fourth preset application displayed on the first display screen; and copying or cutting the second preset file to an area where part of the content on the interface of the fourth preset application displayed on the first display screen is located.
25. The method according to any one of claims 22-24, wherein,
the first preset eye movement operation includes any one of the following: the area where the interface of the fourth preset application is continuously watched is larger than the tenth preset time period, and the area where the part of the content on the interface of the fourth preset application is continuously watched is larger than the eleventh preset time period.
26. The method according to any one of claims 23-25, wherein,
the third preset eye movement operation comprises any one of the following: continuously gazing at an area where the interface of the second preset file is located for longer than a twelfth preset duration; and continuously gazing at an area where an icon of the second preset file is located for longer than a thirteenth preset duration.
27. The method of any of claims 22-26, wherein prior to the second electronic device performing a second preset action on the second preset file, the method further comprises:
the second electronic device receives transmission confirmation information, wherein the transmission confirmation information is used to confirm that the second electronic device performs the second preset action on the second preset file.
28. The method of any of claims 22 or 24-27, wherein the third preset application is a first desktop application, the fourth preset application is a second desktop application, the second preset manual operation is pressing a preset document on the interface of the first desktop application, the first preset eye movement operation is continuously gazing at the area where the interface of the second desktop application is located for longer than the tenth preset duration, and after the second electronic device receives the transmission confirmation information, the second electronic device performing a second preset action on the second preset file comprises:
the second electronic device copying the preset document to the interface of the second desktop application.
29. An electronic device, the electronic device comprising: a wireless communication module, a memory, and one or more processors; the wireless communication module, the memory, and the processor are coupled;
wherein the memory is for storing computer program code, the computer program code comprising computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the method of any one of claims 1-28.
30. A computer-readable storage medium comprising computer instructions;
the computer instructions, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-28.
31. A chip system comprising one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a circuit;
the chip system is applied to an electronic device comprising a communication module and a memory; the interface circuit is configured to receive a signal from the memory and send the signal to the processor, the signal comprising computer instructions stored in the memory; and when the processor executes the computer instructions, the electronic device performs the method of any of claims 1-28.
CN202310158268.7A 2023-02-13 2023-02-13 Equipment control method and device Pending CN117111726A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310158268.7A CN117111726A (en) 2023-02-13 2023-02-13 Equipment control method and device

Publications (1)

Publication Number Publication Date
CN117111726A true CN117111726A (en) 2023-11-24

Family

ID=88793556

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310158268.7A Pending CN117111726A (en) 2023-02-13 2023-02-13 Equipment control method and device

Country Status (1)

Country Link
CN (1) CN117111726A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150093016A * 2014-02-06 2015-08-17 삼성전자주식회사 Display apparatus and controlling method thereof
CN106354259A (en) * 2016-08-30 2017-01-25 同济大学 Automobile HUD gesture-interaction-eye-movement-assisting system and device based on Soli and Tobii
CN107608514A (en) * 2017-09-20 2018-01-19 维沃移动通信有限公司 Information processing method and mobile terminal
CN110007766A (en) * 2019-04-15 2019-07-12 中国航天员科研训练中心 A kind of display of sight cursor and control method and system
CN112799516A (en) * 2021-02-05 2021-05-14 深圳技术大学 Screen content adjusting method and system
CN115454233A (en) * 2022-07-25 2022-12-09 北京航空航天大学 Multi-screen interaction method and device

Similar Documents

Publication Publication Date Title
US11983334B2 (en) Display method and related apparatus
EP3979628B1 (en) Display method of video call applied to electronic device and related apparatus
US11921987B2 (en) System navigation bar display method, system navigation bar control method, graphical user interface, and electronic device
EP4057135A1 (en) Display method for electronic device having foldable screen, and electronic device
CN110543287B (en) Screen display method and electronic equipment
WO2021000881A1 (en) Screen splitting method and electronic device
CN112671976B (en) Control method and device of electronic equipment, electronic equipment and storage medium
US20240077987A1 (en) Widget display method and electronic device
WO2021169399A1 (en) Method for caching application interface, and electronic apparatus
EP3982239A1 (en) Input method and electronic device
WO2020088633A1 (en) Payment method, device, and user equipment unit
US20230205417A1 (en) Display Control Method, Electronic Device, and Computer-Readable Storage Medium
CN113852714B (en) Interaction method for electronic equipment and electronic equipment
CN116048243B (en) Display method and electronic equipment
CN114201738A (en) Unlocking method and electronic equipment
US20240303022A1 (en) Method for invoking capability of another device, electronic device, and system
US20230280894A1 (en) Control moving method and electronic device
CN117111726A (en) Equipment control method and device
US12147273B2 (en) Video call display method applied to electronic device and related apparatus
CN116991274B (en) Upper sliding effect exception handling method and electronic equipment
WO2024179082A1 (en) Message processing method, electronic device, and readable storage medium
EP4462252A1 (en) Display method and electronic device
US20240232304A9 (en) Access control method and related apparatus
WO2024037379A1 (en) Notification checking method and system, and related apparatus
CN117311586A (en) Handwriting input method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination