CN104317452B - Method for controlling large-screen intelligent device - Google Patents


Info

Publication number
CN104317452B
Authority
CN
China
Prior art keywords
screen
finger
straight line
user
icon
Prior art date
Legal status
Active
Application number
CN201410584946.7A
Other languages
Chinese (zh)
Other versions
CN104317452A (en)
Inventor
张行
包卫卫
Current Assignee
Shenzhen Ruimingxin Technology Co ltd
Original Assignee
Phicomm Shanghai Co Ltd
Priority date
Filing date
Publication date
Application filed by Phicomm Shanghai Co Ltd
Priority to CN201410584946.7A
Publication of CN104317452A
Application granted
Publication of CN104317452B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on specific properties of the displayed interaction object, using icons
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a method for controlling a large-screen intelligent device. A sensor of the device detects a gesture that the current user performs on the screen and acquires the coordinates of the start and end of the user's finger; from the obtained coordinate data the device calculates a curve or straight-line equation corresponding to the length and/or touch position of the finger; based on that equation it dynamically determines an icon exchange area on the screen and rearranges the icons within that area. The method makes it easier for the user to operate icons on the screen and provides a better user experience.

Description

Method for controlling large-screen intelligent device
Technical Field
The invention relates to the field of intelligent devices, and in particular to a method for controlling a large-screen intelligent device.
Background
More and more people own smart devices such as mobile phones and tablet computers; these devices are used in many aspects of life and bring great convenience. However, as technology develops and user requirements grow, small-screen devices can no longer deliver a satisfactory user experience, so some equipment manufacturers and research companies have developed smart devices with larger screens. This raises a new problem: as the screen of a smart device grows and its specifications improve, the device becomes filled with application information, and keeping it controllable and quick to interact with becomes difficult. When a mobile phone or tablet is operated with one hand, some application icons (ICONs) on the large screen cannot be reached by the finger and therefore cannot be opened. Moreover, adults and children have fingers of different lengths, so the area of the display that can be operated with one hand differs from user to user and is limited.
Disclosure of Invention
To solve the above problems, the invention provides a method for controlling a large-screen intelligent device. The finger coordinates or finger length of any user are obtained through a sensor, and the icon layout on the desktop is then dynamically adjusted by means of mathematical calculation such as a straight-line or curve equation, so that the user can conveniently tap icons that could not otherwise be reached with one hand and obtains a better experience.
To achieve the above object, the technical solution of the invention is a method for controlling a large-screen smart device in which a sensor of the device detects a gesture performed by the current user on the screen to obtain the coordinate data of the start and end of the user's finger;
the device calculates, from the obtained coordinate data, a curve or straight-line equation corresponding to the length and/or touch position of the current user's finger;
and the device dynamically determines an icon exchange area on the screen from the curve or straight-line equation and rearranges the icons within that area.
Preferably, after the sensor detects another gesture performed by the current user on the screen, the intelligent device again acquires the coordinates of the start and end of the user's finger, calculates the corresponding curve or straight-line equation, dynamically determines a new icon exchange area and rearranges the icons within the new icon exchange area.
Preferably, after the sensor detects another gesture performed by a user on the screen, the intelligent device again acquires the coordinates of the start and end of that user's finger, calculates the corresponding curve or straight-line equation, dynamically determines a new icon exchange area and rearranges the icons within the new icon exchange area.
Preferably, the process by which the smart device dynamically determines the icon exchange area on the screen comprises:
determining a first straight line that connects the start and end of the user's finger and extends along the line joining them;
determining a second straight line that passes through the start of the user's finger and is parallel to the width direction of the screen;
determining a third straight line that passes through the end of the user's finger and is parallel to the width direction of the screen;
matching each boundary of the screen with the first, second and third straight lines, thereby dividing the screen into several screen areas;
and transposing the icons in any two screen areas set as the icon exchange area.
Preferably, when the user operates the screen with a bent-finger gesture, the smart device determines the curve equation corresponding to the gesture and calculates the tangent at the highest point of that curve:
if the gesture follows a convex curve, the most convex point on the curve is taken as the highest point, the tangent to the curve through that point is taken as the first straight line, a straight line through the highest point (or through the start of the finger) parallel to the bottom edge of the screen is taken as the second straight line, and a straight line through the end of the finger parallel to the bottom edge of the screen is taken as the third straight line;
if the gesture follows a concave curve, the start of the finger is taken as the highest point, the tangent to the curve through that point is taken as the first straight line, a straight line through the highest point parallel to the bottom edge of the screen is taken as the second straight line, and a straight line through the end of the finger parallel to the bottom edge of the screen is taken as the third straight line.
Preferably, the icon exchange area consists of a first area located below the first straight line and above the second straight line, and a second area located to the left of the first straight line and between the second and third straight lines.
Preferably, the icon exchange area consists of a first area located below the first straight line and above the second straight line, and a third area located to the right of the first straight line and between the second and third straight lines.
Preferably, when the sensor of the intelligent device detects another gesture by the user that changes the slope of the straight line between the start and end of the finger on the screen, the new division of each icon exchange area on the screen is dynamically determined and the icons in the icon exchange areas are rearranged.
Preferably, the sensor of the smart device detects the position of the fingertip of the user's finger to acquire the coordinates of the start of the finger, and detects the position of the distal segment or root of that finger to acquire the coordinates of the end of the finger.
Preferably, the sensor of the smart device detects the position of the fingertip of one of the user's fingers to acquire the coordinates of the start of the finger, and detects the position of the fingertip of another of the user's fingers to acquire the coordinates of the end of the finger.
Compared with the prior art, the method for controlling a large-screen intelligent device has the following advantage: by monitoring the user's gesture on the screen in real time and obtaining the coordinates of the start and end of the finger, the corresponding icon exchange area is determined and the icon layout adjusted, so that icons the user could not previously reach with one hand become operable and a better user experience is obtained.
Drawings
FIG. 1 is a schematic flow chart of the manipulation method according to the present invention;
FIG. 2 is a schematic diagram of the icon layout adjustment algorithm in the manipulation method according to the present invention;
FIG. 3 is a schematic diagram of a screen interface during icon layout adjustment in the manipulation method according to the present invention;
FIG. 4 is a schematic diagram of a screen interface in a specific example of icon layout adjustment in the manipulation method according to the present invention;
FIG. 5 is a schematic diagram of a screen interface when the gesture curve is convex in the manipulation method according to the present invention;
FIG. 6 is a schematic diagram of a screen interface when the gesture curve is concave in the manipulation method according to the present invention.
Detailed Description
The invention provides a method for controlling a large-screen intelligent device which accommodates the different finger lengths of adults and children and dynamically changes the layout of application icons on the touch screen according to the coordinates or length of the user's finger. When the user points toward a distant part of the screen, the application icons located there can be moved to within the user's reach, making one-handed operation convenient.
As shown in FIG. 1, the method of the invention operates according to the following principle:
First, a gesture performed by the user on the screen is detected in the background by a sensor, and the coordinates of the start and end of the current user's finger are acquired.
Second, a curve or straight-line equation corresponding to the length of the current user's finger is calculated from the obtained coordinate data.
Third, the icon positions on the screen are re-laid out according to the algorithm of the invention (described in detail below).
Fourth, if icons that are still farther away need to be operated, the third step can be repeated recursively.
Finally, a distant application icon is moved to a place on the screen that the user can reach with one hand, and the user can conveniently perform the desired operation.
Referring to FIG. 2 and FIG. 3, the icon layout adjustment algorithm provided by the invention is as follows:
S1, determine the straight-line (or curve) equation of the user's gesture, and from it the first straight line 21 representing the start and end positions of the user's finger on the screen 10; calculate the slope of the first straight line 21 so as to know the direction in which the finger points;
S2, determine on the screen 10 a second straight line 22 which passes through the start 11 of the user's finger and is parallel to the width direction of the screen;
S3, determine on the screen 10 a third straight line 23 which passes through the end 12 of the user's finger and is parallel to the width direction of the screen;
S4, determine on the screen 10 a first area 31 located below the first straight line 21 and above the second straight line 22;
S5, determine on the screen 10 a second area 32 located to the right of the first straight line 21 and between the second straight line 22 and the third straight line 23;
S6, dynamically move the icons of the first area 31 into the second area 32 and the icons of the second area 32 into the first area 31, that is, exchange the icons of the first area 31 and the second area 32 at random. The first area 31 and the second area 32 are sized dynamically to match the length of each person's finger and the point at which that person touches the screen.
Alternatively, S5 may be implemented differently (denoted S5'): for example, a third area 33 located to the left of the first straight line 21 and between the second straight line 22 and the third straight line 23 is determined on the screen 10; in the subsequent step S6', the icons of the first area 31 and the third area 33 are then exchanged.
Although not spelled out above, it can be seen from the drawings that the screen is actually divided into regions by matching the upper, lower, left and right boundaries of the screen with the first straight line 21, the second straight line 22 and the third straight line 23 (the first area 31 can also be described as the region enclosed by the right boundary of the screen, the first straight line 21 and the second straight line 22). In principle, any two of the screen areas jointly delimited by the boundaries and the straight lines may be chosen as icon exchange areas; only a few preferred examples are described above. A code sketch of this region division and exchange is given below.
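As an illustration of steps S1 to S6, the following Python sketch classifies icon positions against the three straight lines and exchanges the icons of the two resulting areas. It is only a sketch, not the patented implementation: the coordinate convention (origin at the bottom left with y increasing upward, so that lines parallel to the width direction are horizontal), the assumption that the gesture runs from the finger root near the lower left toward the fingertip at the upper right (as FIG. 3 and FIG. 4 suggest), and all names such as Icon, side_of_line, divide_regions and swap_icons are choices made here for illustration.

```python
import random
from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: float
    y: float

def side_of_line(p, a, b):
    """Sign of the cross product (b - a) x (p - a): negative when p lies on the
    lower-right side of the directed line a -> b (with y pointing up)."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def divide_regions(icons, start, end):
    """start = fingertip (11), end = finger root (12).
    The first straight line joins end and start (and its extension);
    the second and third straight lines are the horizontals through them."""
    y_hi = max(start[1], end[1])            # second straight line (22)
    y_lo = min(start[1], end[1])            # third straight line (23)
    first_area, second_area = [], []
    for icon in icons:
        s = side_of_line((icon.x, icon.y), end, start)
        if icon.y > y_hi and s < 0:
            first_area.append(icon)         # below line 21, above line 22 (area 31)
        elif y_lo <= icon.y <= y_hi and s < 0:
            second_area.append(icon)        # right of line 21, between 22 and 23 (area 32)
    return first_area, second_area

def swap_icons(first_area, second_area):
    """S6: randomly exchange the screen positions of icons in the two areas."""
    random.shuffle(second_area)
    for a, b in zip(first_area, second_area):
        a.x, a.y, b.x, b.y = b.x, b.y, a.x, a.y

# Example: a 4 x 3 icon grid, thumb root near the lower left, fingertip up and to the right.
icons = [Icon(f"ICON{i + 1}", (i % 4) + 0.5, 3.5 - i // 4) for i in range(12)]
far_icons, near_icons = divide_regions(icons, start=(1.5, 1.8), end=(0.3, 0.2))
swap_icons(far_icons, near_icons)   # out-of-reach icons move into the reachable band
```

With these conventions, a steeper gesture line sweeps further across the upper part of the screen, so more icons fall into the first area, which matches the behaviour described with reference to FIG. 4 below.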
The region division and adjustment described in the above embodiments can be taken to correspond to operating the screen with the left hand (or right hand). Those skilled in the art will understand that when the screen is operated with the other hand, the start and end points of the finger on the screen, the direction of the first straight line, the division of the areas and so on change accordingly, for example to positions mirrored left to right with respect to FIG. 3 and FIG. 4. The principle and procedure by which the algorithm divides the icon exchange areas are, however, the same for right-handed and left-handed operation and are not repeated here.
The two dots in FIG. 3 abstractly represent the start 11 and end 12 of the finger; in this case, the line from the end 12 through the start 11 of the user's finger on the screen 10, together with its extension, is taken as the first straight line 21 for the slope calculation.
In addition, when a user operates the phone irregularly, that is, operates the screen with a bent finger, the smart device can approximately reconstruct the gesture curve by calculation, determine the highest point of the curve, and obtain the tangent at that highest point of the gesture through calculus.
As shown in FIG. 5, if the gesture curve is convex, the highest point is the most convex point Q(X, Y) of the curve; the tangent F to the curve through that point corresponds to the first straight line, the straight line S through the highest point parallel to the bottom edge of the screen corresponds to the second straight line, and the straight line T through the end of the finger (point A) parallel to the bottom edge of the screen corresponds to the third straight line; the corresponding icon exchange areas can then be determined by the algorithm described above. Alternatively, in another convex-curve example, a straight line P through the start of the finger (point B) parallel to the bottom edge of the screen is taken as the second straight line, with the other operations unchanged.
As shown in FIG. 6, if the gesture curve is concave, the highest point is taken to be the start of the finger (point B); the tangent to the curve through that point (in this example, a straight line through the highest point perpendicular to the bottom edge of the screen) is taken as the first straight line, the straight line s through the highest point parallel to the bottom edge of the screen is taken as the second straight line, and the straight line t through the end of the finger (point A) parallel to the bottom edge of the screen is taken as the third straight line; the corresponding icon exchange areas can then be determined by the algorithm described above.
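The patent obtains the tangent at the highest point through calculus without giving a formula; the following sketch approximates that step numerically. The representation of the gesture as a sampled (x, y) trace, the finite-difference derivative and the function name tangent_at_highest_point are assumptions made here for illustration only.

```python
import numpy as np

def tangent_at_highest_point(trace, convex=True):
    """trace: (N, 2) array of (x, y) touch samples from the start to the end of
    the gesture. Returns the point and direction of the tangent used as the
    first straight line, plus the y-levels of the second and third straight lines."""
    trace = np.asarray(trace, dtype=float)
    if convex:
        k = int(np.argmax(trace[:, 1]))      # most convex point Q (largest y)
    else:
        k = 0                                # concave case: start of the finger (point B)
    # numerical derivative (tangent direction) at sample k
    k0, k1 = max(k - 1, 0), min(k + 1, len(trace) - 1)
    direction = trace[k1] - trace[k0]
    point = trace[k]
    second_line_y = point[1]                 # horizontal through the highest point
    third_line_y = trace[-1, 1]              # horizontal through the finger end (point A)
    return point, direction, second_line_y, third_line_y

# Example: a roughly convex thumb sweep sampled by the sensor
sweep = [(0.0, 2.0), (1.0, 3.2), (2.0, 3.9), (3.0, 4.0), (4.0, 3.6), (5.0, 2.8)]
print(tangent_at_highest_point(sweep, convex=True))
```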
Referring to FIG. 4, when the user touches the screen with a finger, the sensor on the smart device (based on pressure or thermal sensing) can directly acquire the coordinates A(X1, Y1) and B(X2, Y2) of the two points on the screen corresponding to the end and the start of the finger. Icon exchange areas (a first area D and a second area d) are then determined according to the above algorithm, and the icons in the two areas are exchanged. In this example the icons in the two areas are exchanged at random, without any particular constraint; in other embodiments, conditions may be attached to the icon exchange as the situation requires.
As the slope k = |Y2 - Y1| / |X2 - X1| of the straight line between the start and end points of the finger increases, the areas D and d grow larger, and the number of icons exchanged with each other also grows. For example, ICON1 might at first be swapped only with icons among ICON5-ICON12; as k increases, ICON2-ICON4 would then also be swapped with icons in the lower area.
To operate icons that are farther away still, the user can adjust the gesture, and the smart device recalculates a new icon exchange area with the algorithm of the invention. For example, in the case shown in FIG. 4, keeping the end position essentially fixed and moving the start position toward the left of the screen increases the slope and enlarges the icon exchange area; conversely, moving the start point toward the right of the figure decreases the slope and shrinks the icon exchange area.
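A small numeric illustration of this behaviour (the coordinates below are invented for the example and are not taken from the patent): with the finger end A held near the lower left of the screen, pulling the start point B back toward the left shortens |X2 - X1| and therefore increases k.

```python
def slope(end, start):
    """k = |Y2 - Y1| / |X2 - X1| for end A = (X1, Y1) and start B = (X2, Y2)."""
    (x1, y1), (x2, y2) = end, start
    return abs(y2 - y1) / abs(x2 - x1)

end_a = (1.0, 1.0)                     # finger end A, near the lower-left corner
print(slope(end_a, (4.0, 4.0)))        # start B             -> k = 1.0
print(slope(end_a, (3.0, 4.0)))        # B moved to the left -> k = 1.5, larger areas D and d
```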
In some embodiments, the start of the finger is the tip of one of the user's fingers (e.g. the thumb or index finger), and the end of the finger corresponds to the distal segment or root of that finger (i.e. a position near the palm). For example, in the first step the user lays the operating finger on the screen so that the sensor acquires the coordinates of these two points as the start and end points.
Alternatively, in other embodiments, the start of the finger is the tip of one of the user's fingers (e.g. the thumb) and the end of the finger is the tip of another finger (e.g. the index finger). For example, in the first step the user performs a zoom-in or zoom-out gesture on the screen with two fingers, and the sensor acquires the touch positions of the two fingertips as the start and end points.
If the same user adjusts the gesture of touching the screen, for example straightens a finger that was not fully extended, or enlarges the distance between the two fingertips, which is equivalent to increasing the distance between the start and end points on the screen, the smart device adjusts the corresponding icon exchange area according to the above algorithm and rearranges the icon positions.
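One possible way of turning the sensor's contact points into the start and end coordinates for the two cases described above is sketched below; the contact-point representation and the function name finger_endpoints are assumptions of this sketch, not an interface defined by the patent.

```python
import math
from itertools import combinations

def finger_endpoints(contacts):
    """contacts: (x, y) touch points reported by the sensor.
    A two-finger pinch reports exactly two fingertip points, used directly as
    start and end; a single finger laid on the screen reports a contact patch,
    whose two farthest-apart points stand in for the fingertip (start) and the
    distal segment or root near the palm (end)."""
    if len(contacts) == 2:
        return tuple(contacts)
    return max(combinations(contacts, 2),
               key=lambda pair: math.dist(*pair))

# Two-finger zoom gesture: the fingertips give the start and end directly.
print(finger_endpoints([(1.0, 1.5), (3.0, 3.5)]))
# Single laid-down thumb: a small patch of contact points.
print(finger_endpoints([(0.4, 0.3), (0.9, 0.9), (1.5, 1.6), (1.1, 1.2)]))
```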
In different examples, each re-layout of the icons may be applied cumulatively on top of the previous layout adjustment; alternatively, it may be preset that the icon layout is only adjusted temporarily, and reverts to the layout in force before the adjustment once the user has performed an icon operation (or has not operated the screen within a time limit).
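One possible realization of the temporary-adjustment policy just described is sketched below; the class, its dictionary-based layout and the five-second time limit are illustrative assumptions, not details given by the patent.

```python
import copy
import time

class TemporaryLayout:
    """Remembers the icon layout before an adjustment and restores it later.

    Construct this right before applying a re-layout, so that `saved`
    captures the pre-adjustment positions."""
    def __init__(self, layout, timeout_s=5.0):
        self.layout = layout                    # dict: icon name -> (x, y), shared with the UI
        self.saved = copy.deepcopy(layout)      # positions before the adjustment
        self.deadline = time.monotonic() + timeout_s

    def on_icon_opened(self):
        self.restore()                          # the user has performed the icon operation

    def poll(self):
        if time.monotonic() > self.deadline:    # not operated within the time limit
            self.restore()

    def restore(self):
        self.layout.clear()
        self.layout.update(self.saved)          # return to the layout before the adjustment
```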
In summary, the control method provided by the invention monitors the user's gesture on the screen in real time and obtains the coordinates of the start and end points of the finger, so as to determine the corresponding icon exchange area and adjust the icon layout. The method is particularly suitable for large-screen intelligent devices on which a user cannot reach all icons with one hand, but the invention does not exclude applying the method described in the above embodiments to other devices to improve the user experience.
While the invention has been described in detail with reference to preferred embodiments, the above description should not be taken as limiting the invention. Various modifications and alterations will become apparent to those skilled in the art upon reading the foregoing description. Accordingly, the scope of the invention should be determined by the following claims.

Claims (8)

1. A method for controlling a large-screen intelligent device, characterized in that
a sensor of the intelligent device detects a gesture performed by the current user on the screen to acquire coordinate data of the start and end of the current user's finger, the coordinate data of the start and end of the current user's finger specifically being the touch coordinates at which the user's finger starts and stops;
the intelligent device calculates, from the obtained coordinate data, a straight-line equation corresponding to the length and touch position of the current user's finger;
the intelligent device dynamically determines an icon exchange area on the screen from the straight-line equation and rearranges the icons within the icon exchange area;
the process by which the intelligent device dynamically determines the icon exchange area on the screen comprising:
determining a first straight line that connects the start and end of the user's finger and extends along the line joining them;
determining a second straight line that passes through the start of the user's finger and is parallel to the width direction of the screen;
determining a third straight line that passes through the end of the user's finger and is parallel to the width direction of the screen;
matching each boundary of the screen with the first, second and third straight lines, thereby dividing the screen into several screen areas; and transposing the icons in any two screen areas set as the icon exchange area.
2. The method of claim 1,
after the sensor detects another gesture performed by the current user on the screen, the intelligent device again acquires coordinate data of the start and end of the user's finger, calculates the corresponding straight-line equation, dynamically determines a new icon exchange area and rearranges the icons within the new icon exchange area.
3. The method of claim 1,
after the sensor detects another gesture performed by the user on the screen, the intelligent device again acquires coordinate data of the start and end of that user's finger, calculates the corresponding straight-line equation, dynamically determines a new icon exchange area and rearranges the icons within the new icon exchange area.
4. The method of claim 1,
the icon exchange area consists of a first area located below the first straight line and above the second straight line, and a second area located to the left of the first straight line and between the second and third straight lines.
5. The method of claim 1,
the icon exchange area consists of a first area located below the first straight line and above the second straight line, and a third area located to the right of the first straight line and between the second and third straight lines.
6. The method of claim 1,
when the sensor of the intelligent device detects another gesture by the user that changes the slope of the straight line between the start and end of the finger on the screen, the new division of each icon exchange area on the screen is dynamically determined and the icons in the icon exchange areas are rearranged.
7. The method of claim 1,
the sensor of the intelligent device detects the position of the fingertip of one of the user's fingers to acquire the coordinates of the start of the finger, and detects the position of the distal segment or root of that finger to acquire the coordinates of the end of the finger.
8. The method of claim 1,
the sensor of the intelligent device detects the position of the fingertip of one of the user's fingers to acquire the coordinates of the start of the finger, and detects the position of the fingertip of another of the user's fingers to acquire the coordinates of the end of the finger.
CN201410584946.7A 2014-10-28 2014-10-28 Method for controlling large-screen intelligent device Active CN104317452B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410584946.7A CN104317452B (en) 2014-10-28 2014-10-28 Method for controlling large-screen intelligent device


Publications (2)

Publication Number Publication Date
CN104317452A (en) 2015-01-28
CN104317452B (en) 2020-02-21

Family

ID=52372690

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410584946.7A Active CN104317452B (en) 2014-10-28 2014-10-28 Method for controlling large-screen intelligent device

Country Status (1)

Country Link
CN (1) CN104317452B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104765556B (en) * 2015-03-05 2017-11-14 广东欧珀移动通信有限公司 A kind of method and device of mobile intelligent terminal interface block
CN106201236B (en) * 2015-04-30 2020-10-27 西安乐食智能餐具有限公司 Display method and device of interactive interface
CN105373334B (en) * 2015-11-25 2019-02-12 小米科技有限责任公司 Interactive screen control method and device
CN105718190A (en) * 2015-12-21 2016-06-29 中科创达软件科技(深圳)有限公司 Method and device for operating large screen mobile terminal by one hand
CN106569674A (en) * 2016-11-11 2017-04-19 努比亚技术有限公司 Mobile terminal and screen operation control method thereof
CN106775391B (en) * 2016-11-29 2020-04-21 努比亚技术有限公司 Interface switching device and method
CN107122151A (en) * 2017-04-22 2017-09-01 高新兴科技集团股份有限公司 A kind of Dynamic Distribution method and system of Urban Operation center large-size screen monitors
CN108132706A (en) * 2017-11-27 2018-06-08 北京用友政务软件有限公司 A kind of method of controlling operation thereof of terminal device, system and terminal device
CN110929550B (en) * 2018-09-20 2023-11-14 北京小米移动软件有限公司 Fingerprint identification method and device, electronic equipment and storage medium
CN110780799B (en) * 2019-10-21 2021-04-30 湖南新云网科技有限公司 Information display control method and device and storage medium
CN112306341A (en) * 2020-10-14 2021-02-02 广州朗国电子科技有限公司 Display area moving method and device, storage medium and electronic whiteboard
CN115033145A (en) * 2022-06-22 2022-09-09 上海集度汽车有限公司 Vehicle-mounted screen display method and device, vehicle and storage medium


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013016127A (en) * 2011-07-06 2013-01-24 Kii corp Portable device, touch position adjustment method, object selection method, selected position determination method, and program
CN103842945A (en) * 2011-10-11 2014-06-04 国际商业机器公司 Object designation method, device and computer program
CN103019568A (en) * 2012-12-21 2013-04-03 东莞宇龙通信科技有限公司 Terminal and icon display method
CN103019604A (en) * 2012-12-24 2013-04-03 东莞宇龙通信科技有限公司 Terminal and terminal operation and control method
CN102999269A (en) * 2012-12-26 2013-03-27 东莞宇龙通信科技有限公司 Terminal and terminal control method
CN103324392A (en) * 2013-06-20 2013-09-25 广东欧珀移动通信有限公司 Method for adjusting graphical controls according to handheld position and touch mobile terminal
CN103399692A (en) * 2013-06-27 2013-11-20 东莞宇龙通信科技有限公司 Mobile terminal one-hand operation method and mobile terminal thereof

Also Published As

Publication number Publication date
CN104317452A (en) 2015-01-28

Similar Documents

Publication Publication Date Title
CN104317452B (en) Method for controlling large-screen intelligent device
TWI419023B (en) Use the touch device to control the positioning of the cursor on the screen
TWI585672B (en) Electronic display device and icon control method
KR101824388B1 (en) Apparatus and method for providing dynamic user interface in consideration of physical characteristics of user
TWI471756B (en) Virtual touch method
EP2521021B1 (en) Method and device for generating dynamically a touch keyboard
TWI584164B (en) Emulating pressure sensitivity on multi-touch devices
US20150185953A1 (en) Optimization operation method and apparatus for terminal interface
CN105117056B (en) A kind of method and apparatus of operation touch-screen
TWI432996B (en) A method for adjusting the display appearance of a keyboard interface being displayed on a touch display unit
WO2014158553A2 (en) Columnar fitted virtual keyboard
WO2014169597A1 (en) Text erasure method and device
TWI615747B (en) System and method for displaying virtual keyboard
CN106201314B (en) A kind of display methods and display device for realizing handwriting input on touch screen
Ahn et al. Evaluation of edge-based interaction on a square smartwatch
CN105302459B (en) Single-hand control method and device for terminal
US20160154489A1 (en) Touch sensitive edge input device for computing devices
KR101179584B1 (en) Virtual mouse display method on touchscreen and computer readable recording medium storing program performing the method
JP2014175012A (en) Mouse pointer control method
WO2017016333A1 (en) Screen adjustment method and device
CN101794194A (en) Method and device for simulation of input of right mouse button on touch screen
CN109558007B (en) Gesture control device and method thereof
US20170185282A1 (en) Gesture recognition method for a touchpad
CN104007913A (en) Electronic device and human-computer interaction method
KR101237127B1 (en) Cursor moving method in the touch screen keypad including sliding key

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201211

Address after: 313028 Industrial Park, balidian Town, Huzhou City, Zhejiang Province

Patentee after: HUZHOU FENGYUAN AGRICULTURAL EQUIPMENT MANUFACTURE Co.,Ltd.

Address before: No.3666 Sixian Road, Songjiang District, Shanghai, 201620

Patentee before: Phicomm (Shanghai) Co.,Ltd.

TR01 Transfer of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A method of controlling large screen intelligent device

Effective date of registration: 20210630

Granted publication date: 20200221

Pledgee: Zhejiang Tailong Commercial Bank Co.,Ltd. Huzhou Branch

Pledgor: HUZHOU FENGYUAN AGRICULTURAL EQUIPMENT MANUFACTURE Co.,Ltd.

Registration number: Y2021330000755

PE01 Entry into force of the registration of the contract for pledge of patent right
PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20230811

Granted publication date: 20200221

Pledgee: Zhejiang Tailong Commercial Bank Co.,Ltd. Huzhou Branch

Pledgor: HUZHOU FENGYUAN AGRICULTURAL EQUIPMENT MANUFACTURE Co.,Ltd.

Registration number: Y2021330000755

PC01 Cancellation of the registration of the contract for pledge of patent right
TR01 Transfer of patent right

Effective date of registration: 20240412

Address after: 518000, 9-16A, Building 1, Hualian City Shanlin Garden, No. 339 Dongbin Road, Nanshan District, Shenzhen, Guangdong Province

Patentee after: Huang Ke

Country or region after: China

Address before: 313028 Industrial Park, balidian Town, Huzhou City, Zhejiang Province

Patentee before: HUZHOU FENGYUAN AGRICULTURAL EQUIPMENT MANUFACTURE Co.,Ltd.

Country or region before: China

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20240529

Address after: 518000 Area A, Floor 6, No. 4 Factory Building, Hengchangrong Hi tech Industrial Park, Hongtian Shangnan East Road, Huangpu Community, Xinqiao Street, Bao'an District, Shenzhen, Guangdong

Patentee after: Shenzhen ruimingxin Technology Co.,Ltd.

Country or region after: China

Address before: 518000, 9-16A, Building 1, Hualian City Shanlin Garden, No. 339 Dongbin Road, Nanshan District, Shenzhen, Guangdong Province

Patentee before: Huang Ke

Country or region before: China

TR01 Transfer of patent right