CN112138385B - Virtual shooting prop aiming method and device, electronic equipment and storage medium


Info

Publication number
CN112138385B
CN112138385B (application CN202011170818.XA)
Authority
CN
China
Prior art keywords
sight
virtual
pattern
target object
resistance
Prior art date
Legal status
Active
Application number
CN202011170818.XA
Other languages
Chinese (zh)
Other versions
CN112138385A (en)
Inventor
潘达
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202011170818.XA
Publication of CN112138385A
Application granted
Publication of CN112138385B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/837 Shooting of targets
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions specially adapted for executing a specific type of game
    • A63F 2300/8076 Shooting

Landscapes

  • Engineering & Computer Science
  • Multimedia
  • Physics & Mathematics
  • Optics & Photonics
  • User Interface Of Digital Computer
  • Processing Or Creating Images

Abstract

The invention provides a method and a device for aiming a virtual shooting prop, an electronic device, and a storage medium; the method includes: presenting a virtual shooting prop in a virtual scene and a sight pattern corresponding to the virtual shooting prop; controlling the sight pattern to move towards a target object in response to an aiming operation for the target object triggered based on the virtual shooting prop; and when the sight pattern moves to a target area centered on the target object, generating a resistance against the sight pattern in the moving direction, so as to control the virtual shooting prop to aim at the target object based on the resistance. By means of the method and the device, the user can be assisted in completing the aiming operation in the virtual scene, the human-computer interaction efficiency is improved, and the occupation of hardware processing resources is reduced.

Description

Aiming method and device of virtual shooting prop, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of virtualization and man-machine interaction, in particular to a method and a device for aiming a virtual shooting prop, electronic equipment and a storage medium.
Background
With the development of computer technology, electronic devices can realize more abundant and vivid virtual scenes. The virtual scene refers to a digital scene outlined by a computer through a digital communication technology, and a user can obtain a fully virtualized feeling (for example, virtual reality) or a partially virtualized feeling (for example, augmented reality) in the aspects of vision, hearing and the like in the virtual scene, and simultaneously can interact with various objects in the virtual scene or control interaction among various objects in the virtual scene to obtain feedback.
With the development of virtual scenes, the process of object interaction in a virtual scene has become increasingly complex and diversified. In the related art, when a user controls a virtual shooting prop to aim at an interactive object, accurate aiming of the interactive object is difficult to achieve; human-computer interaction operations such as sliding and moving the screen often need to be performed many times, yet the aiming effect is still not ideal, which greatly affects the user's experience in the virtual scene.
Disclosure of Invention
The embodiment of the invention provides a method and a device for aiming a virtual shooting prop, electronic equipment and a storage medium, which can assist a user in finishing aiming operation in a virtual scene, improve the human-computer interaction efficiency and reduce the occupation of hardware processing resources.
The technical scheme of the embodiment of the invention is realized as follows:
the embodiment of the invention provides a method for aiming a virtual shooting prop, which comprises the following steps:
presenting a virtual shooting prop in a virtual scene and a sight pattern corresponding to the virtual shooting prop;
controlling the sight pattern to move towards a target object in response to an aiming operation for the target object triggered based on the virtual shooting prop;
when the sight pattern moves to a target area centered on the target object, generating a resistance against the sight pattern in the moving direction, so as to control the virtual shooting prop to aim at the target object based on the resistance.
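As an aid to reading these three steps, the following minimal sketch shows one way they could fit together; the class, field, and method names, the square target area, and the resistance formula are illustrative assumptions rather than the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class Vec2:
    x: float
    y: float

class AimAssist:
    """Illustrative sketch of the three steps; all names and formulas are assumptions."""

    def __init__(self, crosshair: Vec2, target: Vec2, half_extent: float):
        self.crosshair = crosshair      # step 1: the sight pattern shown for the virtual shooting prop
        self.target = target            # the target object's position
        self.half_extent = half_extent  # half-size of the square target area centered on the target

    def in_target_area(self) -> bool:
        # Target area centered on the target object (square here for simplicity).
        return (abs(self.crosshair.x - self.target.x) <= self.half_extent
                and abs(self.crosshair.y - self.target.y) <= self.half_extent)

    def apply_aim_input(self, delta: Vec2, resistance: float) -> None:
        # Step 2: the aiming operation moves the sight pattern toward the target.
        # Step 3: inside the target area, the resistance scales the movement down.
        scale = 1.0 / (1.0 + resistance) if self.in_target_area() else 1.0
        self.crosshair = Vec2(self.crosshair.x + delta.x * scale,
                              self.crosshair.y + delta.y * scale)
```

For example, calling apply_aim_input with a nonzero resistance while the sight pattern is inside the target area moves it less than the same input would outside the area.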
The embodiment of the invention also provides a device for aiming the virtual shooting prop, which comprises:
the presentation module is used for presenting the virtual shooting prop in the virtual scene and the sight pattern corresponding to the virtual shooting prop;
the control module is used for controlling the sight pattern to move towards a target object in response to an aiming operation for the target object triggered based on the virtual shooting prop;
and the generating module is used for generating a resistance against the sight pattern in the moving direction when the sight pattern moves to a target area centered on the target object, so as to control the virtual shooting prop to aim at the target object based on the resistance.
In the above scheme, the presentation module is further configured to present an operation control of the virtual shooting prop in an interface of the virtual scene;
and when the operation control is in an activated state, responding to the triggering operation aiming at the operation control, and presenting the virtual shooting prop and the sight pattern corresponding to the virtual shooting prop.
In the above scheme, the presentation module is further configured to present a selection interface including at least two candidate virtual shooting props;
and receiving a selection operation aiming at the candidate virtual shooting prop triggered based on the selection interface, and taking the selected candidate virtual shooting prop as the virtual shooting prop.
In the above scheme, the control module is further configured to present an aiming control function item corresponding to the virtual shooting prop;
and control the sight pattern to move towards the target object in response to an aiming operation for the target object triggered based on the aiming control function item.
In the above scheme, the control module is further configured to receive a sliding operation on a picture of the virtual scene, where the sliding operation is used to trigger an aiming operation of the virtual shooting prop on the target object;
and responding to the sliding operation, moving the picture of the virtual scene corresponding to the sight pattern so as to control the sight pattern to move towards the target object.
In the above scheme, the apparatus further comprises:
the detection module is used for determining an area detection frame corresponding to the target object, where the area detection frame corresponds to the target area;
acquiring the corresponding position information of the sight pattern in the virtual scene;
and, when it is determined based on the position information that the sight pattern is located within the area detection frame, determining that the sight pattern has moved to the target area.
In the above scheme, the generating module is further configured to obtain a distance between the sight pattern and the target object;
and generate a resistance against the sight pattern in the moving direction based on a negative correlation between the distance and the resistance.
In the above solution, the generating module is further configured to acquire a target moving direction of the sight pattern relative to the target object and an operation speed corresponding to the aiming operation;
generating a resistance against the sight pattern in the moving direction based on the target moving direction and the operating speed.
In the foregoing aspect, the generating module is further configured to generate a resistance against the sight pattern in the moving direction based on a positive correlation between the operating speed and the resistance when the target moving direction indicates that the sight pattern is moving toward the target object;
and, when the target moving direction indicates that the sight pattern is moving away from the target object, generate a resistance against the sight pattern in the moving direction based on a negative correlation between the operating speed and the resistance.
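A minimal sketch of the two correlations just described, assuming a simple linear dependence on the operation speed; the constants `k` and `base` and the function name are hypothetical, not taken from the patent.

```python
def resistance_from_direction_and_speed(moving_toward_target: bool,
                                        operation_speed: float,
                                        k: float = 0.05,
                                        base: float = 0.2) -> float:
    """Illustrative only: positive correlation with operation speed when the sight
    pattern approaches the target, negative correlation when it moves away."""
    if moving_toward_target:
        return base + k * operation_speed            # faster approach -> more resistance
    return max(0.0, base - k * operation_speed)      # faster retreat -> less resistance
```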
In the above solution, the generating module is further configured to acquire a target moving direction of the sight pattern relative to the target object, an operation speed corresponding to the aiming operation, and a distance between the sight pattern and the target object;
generating a resistance against the sight pattern in the moving direction in combination with the target moving direction, the operating speed, and the distance.
In the above solution, the generating module is further configured to generate a basic resistance of the sight pattern in the moving direction based on the target moving direction and the operating speed;
determining a resistance coefficient corresponding to the sight pattern based on the distance and the maximum distance between the target object and the boundary of the target area;
and generating a resistance against the sight pattern in the moving direction based on the base resistance and the resistance coefficient.
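One possible reading of this combination, sketched under assumptions: a base resistance derived from the target moving direction and operation speed is scaled by a coefficient derived from the distance and the maximum distance to the target-area boundary. The linear forms and constants below are illustrative, not the patent's formulas.

```python
def combined_resistance(moving_toward_target: bool,
                        operation_speed: float,
                        distance: float,
                        max_distance: float,
                        k: float = 0.05,
                        base_value: float = 0.2) -> float:
    """Illustrative: resistance = base resistance * resistance coefficient."""
    # Base resistance from the target moving direction and the operation speed.
    if moving_toward_target:
        base = base_value + k * operation_speed
    else:
        base = max(0.0, base_value - k * operation_speed)
    # Resistance coefficient from the distance and the maximum distance to the
    # target-area boundary: the closer the sight pattern, the larger the coefficient.
    if max_distance <= 0:
        coefficient = 1.0
    else:
        coefficient = 1.0 - min(distance / max_distance, 1.0)
    return base * coefficient
```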
In the above solution, the presenting module is further configured to present an auxiliary aiming function item;
in response to an on instruction for the auxiliary aiming function item, adjusting the mode of aiming operation to be an auxiliary aiming mode;
the generation module is further used for generating resistance in the moving direction of the sight pattern when the sight pattern moves to a target area with the target object as the center and the mode of the aiming operation is an auxiliary aiming mode so as to control the virtual shooting prop to aim the target object based on the resistance.
An embodiment of the present invention further provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for implementing the aiming method of the virtual shooting prop provided by the embodiment of the invention when the executable instructions stored in the memory are executed.
The embodiment of the invention also provides a computer-readable storage medium, which stores executable instructions, and when the executable instructions are executed by a processor, the method for aiming the virtual shooting prop provided by the embodiment of the invention is realized.
The embodiment of the invention has the following beneficial effects:
the method comprises the steps of presenting a virtual shooting prop in a virtual scene and a sight pattern corresponding to the virtual shooting prop, responding to aiming operation when aiming operation aiming at a target object triggered based on the virtual shooting prop is received, controlling the sight pattern to move towards the target object, and generating resistance of the sight pattern in the moving direction when the sight pattern moves to a target area taking the target object as a center so as to control the virtual shooting prop to aim at the target object based on the resistance.
That is to say, in the aiming process for the target object, if the sight pattern moves into the area near the target object, a resistance against the sight pattern in the moving direction is generated to reduce the moving speed of the sight pattern, which makes it easier to control the virtual shooting prop to accurately aim at the target object, reduces the number of interactions required to achieve the interaction purpose, improves the human-computer interaction efficiency, and reduces the occupation of hardware processing resources.
Drawings
Fig. 1 is a schematic diagram of an architecture of a targeting system 100 for virtual shooting props provided by an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an electronic device 400 according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a man-machine interaction engine installed in a sighting device of a virtual shooting prop according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of a method for aiming a virtual shooting prop according to an embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating selection of a virtual shooting prop provided by an embodiment of the invention;
FIG. 6 is a schematic representation of a virtual shooting prop and a sight pattern provided in accordance with an embodiment of the present invention;
FIG. 7 is a schematic diagram of the aiming operation provided by an embodiment of the present invention;
FIG. 8 is a schematic diagram of the aiming operation provided by an embodiment of the present invention;
FIG. 9 is a schematic diagram of a sight pattern in a target area provided by an embodiment of the invention;
FIG. 10 is a schematic representation of an auxiliary targeting function provided by an embodiment of the present invention;
FIG. 11 is a schematic diagram of the relationship between distance and resistance provided by an embodiment of the present invention;
FIG. 12 is a schematic diagram of the relationship between moving direction, operating speed and resistance provided by an embodiment of the present invention;
FIG. 13 is a schematic flow chart of a method for targeting a virtual shooting prop according to an embodiment of the present invention;
FIG. 14 is a schematic illustration of the target area and resistance range settings provided by an embodiment of the present invention;
FIG. 15 is a schematic diagram of the generation of resistance provided by an embodiment of the present invention;
Fig. 16 is a schematic structural diagram of a sighting device of a virtual shooting prop provided by an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present invention, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first", "second", and "third" are only used to distinguish similar objects and do not denote a particular order; it can be understood that, where permitted, the specific order or sequence may be interchanged, so that the embodiments of the invention described herein can be implemented in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein is for the purpose of describing embodiments of the invention only and is not intended to be limiting of the invention.
Before the embodiments of the present invention are further described in detail, the terms and expressions used in the embodiments of the present invention are explained; the following explanations apply to these terms and expressions.
1) "In response to" indicates the condition or state on which a performed operation depends; when the dependent condition or state is satisfied, the one or more performed operations may be carried out in real time or with a set delay. Unless otherwise specified, there is no restriction on the order in which the operations are performed.
2) The client, an application program running in the terminal for providing various services, such as a video playing client, a game client, etc.
3) The virtual scene is a virtual scene displayed (or provided) when an application program runs on the terminal. The virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present invention. For example, a virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as deserts, cities, etc., and a user may control a virtual object to move in the virtual scene.
4) A virtual object, an avatar of various people and things that can interact in the virtual scene, or a movable object in the virtual scene. The movable object can be a virtual character, a virtual animal, an animation character, etc., such as: characters, animals, plants, oil drums, walls, rocks, etc. displayed in the virtual scene. The virtual object may be an avatar in the virtual scene that is virtual to represent the user. The virtual scene may include a plurality of virtual objects, each virtual object having its own shape and volume in the virtual scene and occupying a portion of the space in the virtual scene.
Alternatively, the virtual object may be a user character controlled through operations on the client, an Artificial Intelligence (AI) character set in the virtual-scene battle through training, or a Non-Player Character (NPC) set in the virtual-scene interaction. Alternatively, the virtual object may be a virtual character that engages in adversarial interaction in the virtual scene. Optionally, the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control the virtual object to fall freely, glide, or open a parachute to descend in the sky of the virtual scene; to run, jump, crawl, or stoop forward on land; or to swim, float, or dive in the sea, and so on. Of course, the user may also control the virtual object to move in the virtual scene by riding a virtual vehicle, for example, a virtual car, a virtual aircraft, or a virtual yacht; the above scenes are merely examples, and the present invention is not limited thereto. The user can also control the virtual object to perform adversarial interaction with other virtual objects through a virtual prop, for example, a throwing-type virtual prop such as a grenade, a cluster mine, or a sticky grenade, or a shooting-type virtual prop (i.e., a virtual shooting prop) such as a machine gun, a pistol, or a rifle.
5) Scene data, representing various features that objects in the virtual scene are exposed to during the interaction, may include, for example, the location of the objects in the virtual scene. Of course, different types of features may be included depending on the type of virtual scene; for example, in a virtual scene of a game, scene data may include a time required to wait for various functions provided in the virtual scene (depending on the number of times the same function can be used within a certain time), and attribute values indicating various states of a game character, for example, a life value (also referred to as a red amount) and a magic value (also referred to as a blue amount), and the like.
Based on the above explanations of the terms involved in the embodiments of the present invention, the targeting system 100 for the virtual shooting prop provided by the embodiments of the present invention is described below. Referring to fig. 1, fig. 1 is a schematic architecture diagram of the targeting system 100 for the virtual shooting prop provided by an embodiment of the present invention. To support an exemplary application, terminals (including a terminal 400-1 and a terminal 400-2) are connected to a server 200 through a network 300; the network 300 may be a wide area network, a local area network, or a combination of the two, and uses wireless or wired links to implement data transmission.
The terminal (including the terminal 400-1 and the terminal 400-2) is used for sending an acquisition request of scene data of the virtual scene to the server 200 based on the view interface receiving the triggering operation of entering the virtual scene;
the server 200 is configured to receive an acquisition request of scene data, and return the scene data of a virtual scene to the terminal in response to the acquisition request;
terminals (including the terminal 400-1 and the terminal 400-2) for receiving scene data of a virtual scene, rendering a picture of the virtual scene based on the scene data, and presenting the picture of the virtual scene on a graphical interface (the graphical interface 410-1 and the graphical interface 410-2 are exemplarily shown); the virtual scene can also present an object interaction environment, an interaction object and the like in the picture of the virtual scene, and the content presented by the picture of the virtual scene is obtained by rendering based on the returned scene data of the virtual field.
Specifically, the terminal presents, in the picture of the virtual scene, the virtual shooting prop in the virtual scene and the sight pattern corresponding to the virtual shooting prop; controls the sight pattern to move towards the target object in response to the aiming operation for the target object triggered based on the virtual shooting prop; and when the sight pattern moves to a target area centered on the target object, generates a resistance against the sight pattern in the moving direction, so as to control the virtual shooting prop to aim at the target object based on the resistance.
In practical application, the server 200 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, a middleware service, a domain name service, a security service, a CDN, a big data and artificial intelligence platform, and the like. The terminals (including the terminal 400-1 and the terminal 400-2) may be, but are not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal and the server may be directly or indirectly connected through wired or wireless communication, and the present invention is not limited thereto.
In actual applications, the terminals (including the terminal 400-1 and the terminal 400-2) are installed with and run an application supporting virtual scenes. The application may be any one of a First-Person Shooting game (FPS), a third-person shooting game, a Multiplayer Online Battle Arena (MOBA) game, a Two-dimensional (2D) game application, a Three-dimensional (3D) game application, a virtual reality application, a three-dimensional map program, a military simulation program, or a multi-player gunfight survival game. The application may also be a stand-alone application, such as a stand-alone 3D game program.
The virtual scene involved in the embodiment of the invention can be used for simulating a two-dimensional virtual space or a three-dimensional virtual space and the like. Taking the example that the virtual scene simulates a three-dimensional virtual space, which may be an open space, the virtual scene may be used to simulate a real environment in reality, for example, the virtual scene may include sky, land, sea, and the like, and the land may include environmental elements such as a desert, a city, and the like. Of course, the virtual scene may also include virtual objects, such as buildings, vehicles, and props for arming themselves or weapons required for fighting with other virtual objects. The virtual scene can also be used for simulating real environments in different weathers, such as sunny days, rainy days, foggy days or nights. The virtual object may be an avatar in the virtual scene for representing the user, and the avatar may be in any form, such as a simulated character, a simulated animal, and the like, which is not limited by the invention. In actual implementation, a user may use a terminal (such as terminal 400-1) to control a virtual object to perform activities in the virtual scene, including but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing.
Taking an electronic game scene as an exemplary scene, a user may operate on the terminal in advance, and after detecting the user's operation, the terminal may download a game configuration file of the electronic game, where the game configuration file may include an application program, interface display data, virtual scene data, or the like of the electronic game, so that the user can call the game configuration file when logging in to the electronic game on the terminal, and render and display the electronic game interface. The user may perform a touch operation on the terminal; after detecting the touch operation, the terminal may determine game data corresponding to the touch operation, and render and display the game data, where the game data may include virtual scene data, behavior data of virtual objects in the virtual scene, and the like.
In practical application, a terminal (including the terminal 400-1 and the terminal 400-2) receives a trigger operation for entering a virtual scene based on a view interface, and sends an acquisition request of scene data of the virtual scene to the server 200; the server 200 receives an acquisition request of scene data, responds to the acquisition request, and returns the scene data of the virtual scene to the terminal; the terminal receives scene data of the virtual scene, renders pictures of the virtual scene based on the scene data, and presents the pictures of the virtual scene;
Further, the terminal presents, in the picture of the virtual scene, the virtual shooting prop in the virtual scene and the sight pattern corresponding to the virtual shooting prop; in response to the aiming operation for the target object triggered based on the virtual shooting prop, the terminal controls the sight pattern to move towards the target object (such as an avatar corresponding to another game user or a non-player character in the electronic game scene); and when the sight pattern moves to a target area centered on the target object, a resistance against the sight pattern in the moving direction is generated, and the virtual shooting prop is controlled to aim at the target object based on the resistance.
Taking a military virtual simulation application as an exemplary scene, virtual scene technology is adopted to enable trainees to visually and audibly experience a battlefield environment in a realistic way, to become familiar with the environmental characteristics of the area where combat is to take place, and to interact with objects in the virtual environment through the necessary equipment. A virtual battlefield environment can be implemented through a corresponding three-dimensional battlefield environment graphic and image library, including a combat background, battlefield scenes, various weapons and equipment, fighters, and the like; through background generation and image synthesis, a dangerous and nearly real three-dimensional battlefield environment can be created.
In actual implementation, the terminal (including the terminal 400-1 and the terminal 400-2) sends an acquisition request of scene data of a virtual scene to the server 200 based on a trigger operation of entering the virtual scene received by the view interface; the server 200 receives an acquisition request of scene data, responds to the acquisition request, and returns the scene data of the virtual scene to the terminal; the terminal receives scene data of the virtual scene, renders pictures of the virtual scene based on the scene data, and presents the pictures of the virtual scene;
further, the terminal presents, in the picture of the virtual scene, the virtual shooting prop in the virtual scene and the sight pattern corresponding to the virtual shooting prop; controls the sight pattern to move towards a target object (such as an enemy simulated fighter in the military virtual simulation scene) in response to the aiming operation for the target object triggered based on the virtual shooting prop; and when the sight pattern moves to a target area centered on the target object, generates a resistance against the sight pattern in the moving direction, so as to control the virtual shooting prop to aim at the target object based on the resistance.
The hardware structure of an electronic device implementing the method for aiming a virtual shooting prop according to the embodiment of the present invention is described in detail below. The electronic device includes, but is not limited to, a server or a terminal; for example, the electronic device may be a terminal (including the terminal 400-1 and the terminal 400-2) in fig. 1. Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device 400 shown in fig. 2 includes: at least one processor 410, a memory 450, at least one network interface 420, and a user interface 430. The various components in the electronic device 400 are coupled together by a bus system 440. It can be understood that the bus system 440 is used to enable connection and communication among these components. In addition to a data bus, the bus system 440 includes a power bus, a control bus, and a status signal bus; however, for clarity of illustration, the various buses are labeled as the bus system 440 in fig. 2.
The Processor 410 may be an integrated circuit chip having Signal processing capabilities, such as a general purpose Processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like, wherein the general purpose Processor may be a microprocessor or any conventional Processor, or the like.
The user interface 430 includes one or more output devices 431, including one or more speakers and/or one or more visual displays, that enable the presentation of media content. The user interface 430 also includes one or more input devices 432, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 450 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 450 optionally includes one or more storage devices physically located remote from processor 410.
The memory 450 may include volatile memory, nonvolatile memory, or both. The nonvolatile memory may be a Read Only Memory (ROM), and the volatile memory may be a Random Access Memory (RAM). The memory 450 described in the embodiments of the invention is intended to comprise any suitable type of memory.
In some embodiments, memory 450 is capable of storing data, examples of which include programs, modules, and data structures, or a subset or superset thereof, to support various operations, as exemplified below.
An operating system 451, including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
a network communication module 452 for reaching other computing devices via one or more (wired or wireless) network interfaces 420, exemplary network interfaces 420 including: Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), etc.;
a presentation module 453 for enabling presentation of information (e.g., user interfaces for operating peripherals and displaying content and information) via one or more output devices 431 (e.g., display screens, speakers, etc.) associated with user interface 430;
an input processing module 454 for detecting one or more user inputs or interactions from one of the one or more input devices 432 and translating the detected inputs or interactions.
In some embodiments, the aiming device of the virtual shooting prop provided by the embodiments of the present invention may be implemented in software, and fig. 2 shows the aiming device 455 of the virtual shooting prop stored in the memory 450, which may be software in the form of programs and plug-ins, and includes the following software modules: a presentation module 4551, a control module 4552 and a generation module 4553, which are logical and thus may be arbitrarily combined or further divided according to the functions implemented, and the functions of the respective modules will be described hereinafter.
In some embodiments, a human-machine interaction engine for implementing the aiming method of the virtual shooting prop is installed in the aiming device 455 of the virtual shooting prop; the human-machine interaction engine includes functional modules, components, or plug-ins for implementing the aiming method of the virtual shooting prop. Fig. 3 is a schematic diagram of the human-machine interaction engine installed in the aiming device of the virtual shooting prop according to an embodiment of the present invention. Referring to fig. 3, the virtual scene is taken as a game scene as an example, and accordingly, the human-machine interaction engine is a game engine.
A game engine is a set of machine-recognizable code (instructions) designed for a machine that runs a certain kind of game; like an engine, it controls the running of the game. A game program can be divided into two parts: the game engine and the game resources. The game resources include images, sounds, animation, and the like; that is, game = engine (program code) + resources (images, sounds, animation, etc.), and the game engine calls the resources in sequence according to the requirements of the game design.
The method for aiming the virtual shooting prop provided by the embodiment of the invention can be realized by each module in the aiming device of the virtual shooting prop shown in fig. 2 by calling the relevant module, component or plug-in of the game engine shown in fig. 3, and the module, component or plug-in included in the game engine shown in fig. 3 is described in an exemplary manner.
As shown in FIG. 3, the scene organization is used to manage the entire game world so that game applications can more efficiently handle scene updates and events; the rendering module is used for rendering two-dimensional and three-dimensional graphics, processing light and shadow effects, rendering materials and the like for models, scenes and the like; the bottom layer algorithm module is used for processing logic in the game, is responsible for the reaction of the role to the event, the realization of a complex intelligent algorithm and the like; the editor component is an auxiliary development tool provided for game development, and comprises auxiliary management tools such as a scene editor, a model editor, an animation editor, a logic editor and a special effect editor; the User Interface (UI) component is responsible for interaction between a User and a system and is used for displaying a picture of a virtual scene obtained after the rendering component realizes model rendering and scene rendering; the skeleton animation component is used for managing key frame animation and skeleton animation which are similar to skeletons and drive objects to move, and enriches roles to ensure that the roles are more vivid; the model plug-in and the model manage the model in the game; the terrain management module manages the terrain, paths and the like in the game world, so that the game is more vivid; the special effect component is responsible for simulating various natural phenomena in real time in the game world, so that the game is more gorgeous and the like.
For example, the presentation module 4551 may implement interaction between the user and the game by calling the user interface part in the game engine shown in fig. 3, create a two-dimensional or three-dimensional model by calling the model part in the game engine, and, after the model is created, assign material maps (textures) to the model according to its different surfaces through the skeleton animation part, which is equivalent to covering the skeleton with skin, and finally calculate and display all effects of the model, animation, light and shadow, special effects, and the like on the human-computer interaction interface in real time through the rendering part. Specifically, the presentation module 4551 may present the virtual shooting prop and the sight pattern corresponding to the virtual shooting prop in the virtual scene after calling the rendering part in the game engine shown in fig. 3 to render the virtual scene data.
The control module 4552 may, by calling the camera part and the scene organization part in the game engine shown in fig. 3, control the sight pattern to move towards the target object when an aiming operation triggered based on the virtual shooting prop is received, and at the same time call the rendering part in the game engine shown in fig. 3 to perform real-time image calculation based on the movement track of the sight pattern and display the result on the human-computer interaction interface.
The generating module 4553 may detect the position of the sight pattern by calling the camera part and the scene organization part in the game engine shown in fig. 3, and call the underlying algorithm part and the editor part to determine, according to the detection result, whether the sight pattern has moved to the target area centered on the target object; when the sight pattern moves to the target area, the underlying algorithm part and the editor part are called again to generate a resistance against the sight pattern in the moving direction, so as to control the virtual shooting prop to aim at the target object based on the resistance.
Based on the above description of the targeting system for a virtual shooting prop and the electronic device in the embodiments of the present invention, the aiming method for a virtual shooting prop in the embodiments of the present invention is described below. Referring to fig. 4, fig. 4 is a schematic flow chart of a method for aiming a virtual shooting prop according to an embodiment of the present invention. In some embodiments, the method for aiming the virtual shooting prop may be implemented by a server or a terminal alone, or implemented by the server and the terminal cooperatively. Taking implementation by the terminal as an example, the method for aiming a virtual shooting prop provided in the embodiments of the present invention includes:
step 101: and the terminal presents the virtual shooting props in the virtual scene and the sight bead patterns corresponding to the virtual shooting props.
Here, the terminal is installed with an application client supporting a virtual scene, and when a user opens the application client on the terminal and the terminal runs the application client, the terminal presents a picture of the virtual scene (such as a shooting game scene), which may be a two-dimensional virtual scene or a three-dimensional virtual scene. The picture of the virtual scene may be obtained by observing the virtual scene from the viewing angle of a first person virtual object or from the viewing angle of a third person virtual object, where the virtual object is a virtual image in the virtual scene corresponding to the current user account. In the virtual scene, a user may control a virtual object to perform an action through a picture (such as an object interaction interface) of the virtual scene, specifically, the virtual object may hold a virtual prop, which may be any prop used when the virtual object interacts with other virtual objects, for example, a virtual shooting prop, a virtual bow, a virtual slingshot, a virtual nunchakus, a virtual whip, and the like, and the user may control the virtual object to interact with other virtual objects based on the picture of the virtual scene displayed by the terminal.
In the embodiment of the invention, the terminal presents, through the picture of the virtual scene, the virtual shooting prop corresponding to the virtual object and the sight pattern corresponding to the virtual shooting prop. The aiming direction of the sight pattern is the shooting direction of the virtual camera of the virtual scene (the virtual camera is equivalent to the eyes of the user: it shoots the virtual scene to obtain the scene picture, which is wholly or partially presented in the picture of the virtual scene), and the aiming direction is used for indicating the sight line direction of the user, so that the user can conveniently control the virtual shooting prop to aim at the target object.
In some embodiments, the terminal further presents a selection interface comprising at least two candidate virtual shooting props prior to presenting the virtual shooting props and the corresponding sight patterns; and receiving a selection operation aiming at the candidate virtual shooting prop triggered based on the selection interface, and taking the selected candidate virtual shooting prop as the virtual shooting prop.
In practical application, before the terminal presents the picture of the virtual scene, or in the process of presenting the picture of the virtual scene, the terminal can present a selection interface for selecting a virtual shooting prop, in which at least two candidate virtual shooting props are presented; in practical application, the presentation of the candidate virtual shooting props can be realized through icons corresponding to the virtual shooting props. The selection interface can be a picture occupying the whole view interface of the terminal, or a picture occupying part of the terminal view interface; for example, the selection interface can float over the picture of the virtual scene. When the user triggers the selection operation for the at least two candidate virtual shooting props based on the selection interface, the terminal receives and responds to the selection operation, and determines the selected candidate virtual shooting prop as the virtual shooting prop.
Here, when the user triggers a selection operation for a virtual firing prop based on the selection interface, the selected virtual firing prop may be displayed in the selection interface in a target display style such that the display style of the selected virtual firing prop is different from the display styles of the non-selected candidate virtual firing props, e.g., the selected virtual firing prop is highlighted in the selection interface and the other non-selected candidate virtual firing props are not highlighted in the selection interface.
Exemplarily, referring to fig. 5, fig. 5 is a schematic diagram illustrating selection of a virtual shooting prop according to an embodiment of the present invention. Here, the terminal presents a selection interface A0 floating over the picture of the virtual scene, together with 4 candidate virtual shooting props B1-B4, and when a selection operation for the candidate virtual shooting prop B2 is received, the candidate virtual shooting prop B2 is determined as the virtual shooting prop.
In some embodiments, the terminal may present the virtual shooting props and corresponding sight patterns by: presenting an operation control of the virtual shooting prop in an interface of a virtual scene; and when the operation control is in an activated state, responding to the trigger operation aiming at the operation control, and presenting the virtual shooting prop and the sight pattern corresponding to the virtual shooting prop.
Here, in general, the selection and use of a virtual shooting prop needs to be performed when the corresponding operating control is in an activated state. Specifically, the terminal can display the operation control of the virtual shooting prop through the target style, so that the display style of the operation control of the virtual shooting prop in the activated state is different from the display style of the operation control of the virtual shooting prop in the inactivated state. Referring to fig. 6, fig. 6 is a schematic representation of a virtual shooting prop and a sight pattern provided in an embodiment of the present invention. Here, the display modes of the operation controls B1, B3, and B4 of the virtual shooting props in the inactive state in the virtual prop list are gray-scale display, and the display mode of the operation controls B2 in the active state in the virtual shooting prop list is highlight display.
At this time, the user may select the virtual shooting prop in an activated state based on the screen of the virtual scene. Specifically, the terminal presents an operation control of the virtual shooting prop in a picture of a virtual scene, and when the operation control of the virtual shooting prop is in an activated state, the virtual shooting prop and a sight pattern corresponding to the virtual shooting prop are presented in response to the triggering operation of a user on the operation control. With continued reference to fig. 6, the terminal presents the operation control B2 of the virtual shooting prop in a highlighted manner (i.e., the operation control B2 of the virtual shooting prop is in an activated state), receives a trigger operation of the operation control B2 for the virtual shooting prop, and presents the virtual shooting prop and a sight pattern corresponding to the virtual shooting prop.
Step 102: and controlling the sight bead pattern to move towards the target object in response to the aiming operation triggered by the virtual shooting prop for the target object.
Here, after the terminal presents the virtual shooting prop and the sight pattern corresponding to the virtual shooting prop, the virtual object is in an idle state, and the user can operate the virtual shooting prop, by controlling the virtual object, to interact with other virtual objects; for example, when the user controls the virtual object to attack a target object, the target object needs to be aimed at. Specifically, when an aiming operation for the target object triggered based on the virtual shooting prop is received, the terminal responds to the aiming operation and controls the sight pattern to move towards the target object to realize the aiming function, and the moving process of the sight pattern can be presented.
In some embodiments, the terminal may control the sight pattern to move towards the target object by: presenting an aiming control function item corresponding to the virtual shooting prop; and controlling the sight pattern to move toward the target object in response to an aiming operation for the target object triggered based on the aiming control function item.
In practical application, the terminal may further present an aiming control function item corresponding to the virtual shooting prop, where the aiming control function item may be a function button or a function icon. The user can trigger the aiming operation aiming at the target object through the aiming control function item. Illustratively, referring to fig. 7, fig. 7 is a schematic diagram of the targeting operation provided by an embodiment of the present invention. Here, the aiming control function item is a function button presented by an icon, and a user can trigger an aiming operation for a target object by controlling (e.g., long-pressing and sliding) the function button.
In some embodiments, the terminal may control the sight pattern to move towards the target object by: receiving a sliding operation on the picture of the virtual scene; and, in response to the sliding operation, moving the picture of the virtual scene corresponding to the sight pattern, so as to control the sight pattern to move towards the target object, where the sliding operation is used to trigger the aiming operation of the virtual shooting prop for the target object.
In practical application, the user can realize the aiming operation for the target object by sliding the picture of the virtual scene presented in the view interface. When the terminal receives the user's sliding operation on the picture of the virtual scene, the aiming operation of the virtual shooting prop for the target object is triggered. The terminal responds to the sliding operation and moves the picture of the virtual scene corresponding to the sight pattern, so as to move the sight pattern towards the target object. Illustratively, referring to fig. 8, fig. 8 is a schematic diagram of the aiming operation provided by an embodiment of the present invention. Here, when the user's sliding operation on the picture of the virtual scene is received, the picture corresponding to the sight pattern is moved, for example, from the area (1) to the area (2), so as to control the sight pattern to move towards the target object.
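A minimal sketch of how a slide gesture might be mapped to the movement of the picture (and hence of the sight pattern); the sensitivity value and the function name are assumptions, not the patent's implementation.

```python
def slide_to_view_delta(slide_dx: float, slide_dy: float,
                        sensitivity: float = 0.1) -> tuple:
    """Illustrative: convert a screen-space slide into yaw/pitch deltas so the
    picture of the virtual scene (and the sight pattern's aim point) follows the gesture."""
    return slide_dx * sensitivity, slide_dy * sensitivity
```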
Step 103: when the sight pattern moves to a target area centered on the target object, a resistance against the sight pattern in the moving direction is generated, so as to control the virtual shooting prop to aim at the target object based on the resistance.
Here, in the embodiment of the present invention, to help the user achieve accurate aiming of the target object, the position of the sight pattern relative to the target object is detected. Specifically, a target area centered on the target object may be set in advance for the target object, and the target area may be a rectangular, square, circular, or similar area centered on the target object.
In some embodiments, the terminal may detect whether the sight pattern has moved to the target area centered on the target object by: determining an area detection frame corresponding to the target object, where the area detection frame corresponds to the target area; acquiring the corresponding position information of the sight pattern in the virtual scene; and, when it is determined based on the position information that the sight pattern is located within the area detection frame, determining that the sight pattern has moved to the target area.
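The containment test described here could look like the following sketch, assuming an axis-aligned rectangular detection frame; all names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class DetectionFrame:
    center_x: float   # target object's position (frame center)
    center_y: float
    half_width: float
    half_height: float

def sight_in_target_area(frame: DetectionFrame, sight_x: float, sight_y: float) -> bool:
    """Illustrative: the sight pattern's position is compared against the area
    detection frame centered on the target object; being inside the frame means
    the sight pattern has moved to the target area."""
    return (abs(sight_x - frame.center_x) <= frame.half_width
            and abs(sight_y - frame.center_y) <= frame.half_height)
```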
In practical application, the terminal can acquire the corresponding position information of the sight pattern in the virtual scene, and determine, based on the position information, whether the sight pattern is located within the area detection frame corresponding to the target object.
When it is detected that the sight pattern has moved to the target area based on the aiming operation, the user's aiming position is considered to be close to the target object, and at this moment a resistance against the sight pattern in the moving direction is generated to assist the user in controlling the virtual shooting prop to aim at the target object. Referring to fig. 9, fig. 9 is a schematic diagram of the sight pattern in the target area according to an embodiment of the present invention. Here, the sight pattern is moved into the target area based on the aiming operation, where the target area is a rectangular area centered on the target object.
In the embodiment of the present invention, the resistance is expressed as a decrease in sensitivity when the user slides the screen to aim at the target object; that is, for the same sliding distance, the deflection of the viewing angle in the virtual scene is smaller when the sight pattern is located within the target area than when the sight pattern is located outside the target area.
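Read as reduced sliding sensitivity, the resistance might be applied as in this sketch; the scaling formula is an assumption.

```python
def effective_view_deflection(slide_distance: float, base_sensitivity: float,
                              in_target_area: bool, resistance: float) -> float:
    """Illustrative: the same slide distance produces a smaller viewing-angle
    deflection while the sight pattern is inside the target area (resistance >= 0)."""
    sensitivity = base_sensitivity / (1.0 + resistance) if in_target_area else base_sensitivity
    return slide_distance * sensitivity
```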
In some embodiments, the terminal may present an auxiliary aiming function item; in response to an opening instruction for the auxiliary aiming function item, adjusting the mode of aiming operation to be an auxiliary aiming mode;
accordingly, when the terminal detects that the sight pattern moves to the target area with the target object as the center and the mode of the aiming operation is the auxiliary aiming mode, the terminal generates resistance aiming at the sight pattern in the moving direction so as to control the virtual shooting prop to aim the target object based on the resistance.
In practical applications, a corresponding auxiliary aiming function item can be provided for the virtual scene, through which the user enables or disables the auxiliary aiming mode. In the auxiliary aiming mode, when the sight pattern moves into the target area centered on the target object, a resistance against the sight pattern in its moving direction is generated, so as to control the virtual shooting prop to aim at the target object based on the resistance.
When the terminal receives an enabling instruction for the auxiliary aiming function item triggered by the user, the mode of the aiming operation is switched to the auxiliary aiming mode. At this point, if the terminal detects that the sight pattern has moved into the target area, it generates a resistance against the sight pattern in its moving direction, so that the virtual shooting prop is controlled to aim at the target object based on the resistance. Referring to fig. 10, fig. 10 is a schematic diagram of the auxiliary aiming function item provided by an embodiment of the present invention. Here, the terminal presents an "auxiliary aiming" function item in the picture of the virtual scene, through which the user can turn the auxiliary aiming mode on or off. In other embodiments, the auxiliary aiming function item may instead be presented in a settings interface of the virtual scene, which is not limited in the embodiments of the present invention.
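A rough sketch of gating the resistance on the auxiliary aiming mode is shown below; the class, attribute and method names are assumptions.

```python
# Sketch (assumed names): resistance only applies when auxiliary aiming is enabled.
class AimAssistState:
    def __init__(self) -> None:
        self.auxiliary_aiming_enabled = False  # toggled via the "auxiliary aiming" function item

    def on_toggle(self, enabled: bool) -> None:
        self.auxiliary_aiming_enabled = enabled

    def effective_resistance(self, in_target_area: bool, raw_resistance: float) -> float:
        # No resistance unless the mode is on and the sight pattern is inside the target area.
        if self.auxiliary_aiming_enabled and in_target_area:
            return raw_resistance
        return 0.0
```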
In some embodiments, the terminal may generate a resistance to the sight-star pattern in the direction of movement by: acquiring the distance between the sight bead pattern and a target object; based on the negative correlation between the distance and the resistance, a resistance in the moving direction for the sight-star pattern is generated.
In practical applications, the resistance may be constant, i.e. once the sight pattern enters the target area, a resistance of a fixed magnitude is applied against the sight pattern in its moving direction. Alternatively, the resistance may vary, for example with the distance between the sight pattern and the target object, so that the user does not experience an abrupt change while aiming but instead smoothly feels the presence of a stiction-like drag through the gradual change of the resistance.
When the resistance is not a constant value, the terminal may generate a resistance whose magnitude depends on the distance between the sight pattern and the target object. Specifically, the terminal acquires the distance between the sight pattern and the target object, and generates the resistance against the sight pattern in its moving direction based on a negative correlation between the distance and the resistance. That is, the larger the distance between the sight pattern and the target object, the smaller the generated resistance; the smaller the distance, the greater the resistance. Referring to fig. 11, fig. 11 is a schematic diagram of the relationship between distance and resistance provided by an embodiment of the present invention. Here, as the distance between the sight pattern and the target object decreases, the generated resistance increases.
In practical implementation, the target area may further be divided into a plurality of area units, with a corresponding resistance value set for each area unit such that the difference between the resistance values of adjacent area units is below a difference threshold; the resistance applied against the sight pattern in its moving direction is then the resistance value of the area unit in which the sight pattern is currently located.
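The negative correlation between distance and resistance might be sketched as the linear mapping below; the value range and the linear form are assumptions (the discrete area-unit variant would replace the formula with a per-unit table lookup).

```python
# Sketch (assumed names and ranges): closer to the target object means larger resistance.
def resistance_from_distance(distance: float, max_distance: float,
                             r_min: float = 0.0, r_max: float = 0.7) -> float:
    """Map a distance in [0, max_distance] to a resistance in [r_min, r_max]."""
    max_distance = max(max_distance, 1e-6)            # guard against a degenerate area
    d = min(max(distance, 0.0), max_distance)
    closeness = 1.0 - d / max_distance                # 1 at the target object, 0 at the boundary
    return r_min + (r_max - r_min) * closeness
```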
In some embodiments, the terminal may generate a resistance to the sight-star pattern in the direction of movement by: acquiring a target moving direction of the sight pattern relative to a target object and an operation speed corresponding to aiming operation; based on the target moving direction and the operating speed, a resistance to the sight-star pattern in the moving direction is generated.
In some embodiments, the terminal may generate a resistance to the sight-star pattern in the direction of movement based on the target direction of movement and the operating speed by: when the target moving direction is that the sight-aiming pattern moves towards the direction close to the target object, generating resistance aiming at the sight-aiming pattern in the moving direction based on the positive correlation relation between the operation speed and the resistance; when the target moving direction is a direction in which the sight pattern moves away from the target object, a resistance against the sight pattern in the moving direction is generated based on a negative correlation relationship between the operation speed and the resistance.
In practical applications, when a user performs a screen-sliding operation to aim, the embodiment of the invention changes the magnitude of the resistance according to a gesture judgment, so that the user neither feels a strong friction before the target is actually aimed at nor is held back by the friction when intending to move the angle of view away from the target. When the user's sliding direction (i.e. the target moving direction of the sight pattern relative to the target object) is towards the target object, the operation speed and the resistance are positively correlated: the faster the user's gesture slides (i.e. the higher the operation speed corresponding to the aiming operation), the greater the resistance, so that the user does not quickly slide past the target object. When the user's sliding direction is away from the target object, the operation speed and the resistance are negatively correlated: the faster the gesture slides, the smaller the resistance, so that the user is not hindered from moving the sight pattern of the virtual shooting prop.
Referring to fig. 12, fig. 12 is a schematic diagram of the relationship between the moving direction, the operation speed and the resistance provided by an embodiment of the invention. Here, when the sight pattern moves towards the target object, the generated resistance increases as the operation speed increases; when the sight pattern moves away from the target object, the generated resistance decreases as the operation speed increases.
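A minimal sketch of this gesture-dependent resistance follows; the linear gains, clamping bounds and identifiers are assumptions not specified by the disclosure.

```python
# Sketch (assumed gains): resistance rises with speed when approaching the target,
# and falls with speed when moving away from it.
def resistance_from_gesture(moving_towards_target: bool, slide_speed: float,
                            r_base: float = 0.4, gain: float = 0.02,
                            r_max: float = 0.9) -> float:
    if moving_towards_target:
        # Positive correlation: a fast slide towards the target is resisted more,
        # so the sight does not overshoot the target object.
        r = r_base + gain * slide_speed
    else:
        # Negative correlation: a fast slide away from the target is barely resisted.
        r = r_base - gain * slide_speed
    return min(max(r, 0.0), r_max)
```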
In some embodiments, the terminal may generate a resistance to the sight-star pattern in the direction of movement by: acquiring a target moving direction of the sight pattern relative to the target object, an operation speed corresponding to aiming operation and a distance between the sight pattern and the target object; in conjunction with the target movement direction, the operating speed, and the distance, a resistance to the sight-star pattern in the movement direction is generated.
In some embodiments, the terminal may generate a resistance to the sight-star pattern in the direction of movement by combining the target direction of movement, the operating speed, and the distance as follows: generating a basic resistance of the sight bead pattern in the moving direction based on the target moving direction and the operation speed; determining a resistance coefficient corresponding to the sight bead pattern based on the distance and the maximum distance between the target object and the boundary of the target area; based on the base resistance and the resistance coefficient, a resistance in the moving direction for the sight-star pattern is generated.
In practical applications, the terminal may also monitor the sight pattern within the target area in real time, including its moving direction, the operation speed with which the user moves it, and its distance from the target object, so as to generate and output the resistance based on the detection results.
Specifically, the terminal may first generate a base resistance of the sight pattern in the moving direction according to the target moving direction and the operation speed; then determine a resistance coefficient corresponding to the sight pattern according to the distance between the sight pattern and the target object and the maximum distance between the target object and the boundary of the target area, specifically according to the proportional relationship between the distance and the maximum distance; and finally multiply the base resistance by the resistance coefficient to obtain the magnitude of the resistance applied against the sight pattern in its moving direction.
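Assuming the resistance coefficient is simply derived from the ratio of the distance to the maximum distance (the disclosure only states that it follows their proportional relationship), the combination step might be sketched as:

```python
# Sketch (assumed coefficient model): final resistance = base resistance * coefficient.
def combined_resistance(base_resistance: float, distance: float, max_distance: float) -> float:
    """base_resistance comes from the gesture direction/speed step sketched earlier."""
    max_distance = max(max_distance, 1e-6)
    coefficient = 1.0 - min(max(distance / max_distance, 0.0), 1.0)  # larger when closer
    return base_resistance * coefficient
```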
By applying the embodiment of the invention, the virtual shooting prop in the virtual scene and its corresponding sight pattern are presented; when an aiming operation for a target object triggered based on the virtual shooting prop is received, the sight pattern is controlled to move towards the target object in response to the aiming operation; and when the sight pattern moves into a target area centered on the target object, a resistance against the sight pattern in its moving direction is generated, so that the virtual shooting prop is controlled to aim at the target object based on the resistance. That is, during aiming at the target object, if the sight pattern moves into the area near the target object, the resistance generated in its moving direction reduces its moving speed, which makes it easier to control the virtual shooting prop to aim at the target object accurately, reduces the number of interactions required to achieve the interaction goal, improves human-computer interaction efficiency, and reduces the occupation of hardware processing resources.
An exemplary application of the embodiments of the present invention in a practical application scenario will be described below.
In most current shooting games, auxiliary aiming is a very important aiming function and an important factor affecting aiming feel. In the related art, when a user controls a virtual shooting prop to aim at an interactive object, accurate aiming is difficult to achieve; human-computer interactions such as sliding the screen often have to be performed many times, the aiming result is still not ideal, and the user experience in the virtual scene is greatly affected.
Based on this, an embodiment of the present invention provides a method for aiming a virtual shooting prop, and referring to fig. 13, fig. 13 is a schematic flow chart of the method for aiming a virtual shooting prop provided in the embodiment of the present invention, including:
step 201: the terminal receives a triggering operation of entering the virtual scene based on the view interface, and sends an acquisition request of scene data of the virtual scene to the server.
Step 202: the server receives the acquisition request of the scene data, responds to the acquisition request, and returns the scene data of the virtual scene to the terminal.
Step 203: the terminal receives scene data of the virtual scene, renders pictures of the virtual scene based on the scene data, presents the pictures of the virtual scene, and presents operation controls of the virtual shooting prop.
Step 204: and when the operation control is in an activated state, responding to the trigger operation aiming at the operation control, and presenting the virtual shooting prop in the virtual scene and the sight pattern corresponding to the virtual shooting prop.
Here, the terminal runs a client, such as a game client. By running the game client, the terminal enters the picture of a virtual scene of the game (such as a shooting game scene) and presents an operation control of the virtual shooting prop in the picture. The picture of the virtual scene is obtained by observing the virtual scene from the perspective of a virtual object, the virtual object being the avatar, in the virtual scene, of the user who has logged into the game client.
When the operation control of the virtual shooting prop is in an activated state, the user can trigger the operation control through operations such as clicking. And the terminal receives the trigger operation of the user for the operation control, responds to the trigger operation, and presents the virtual shooting prop in the virtual scene and the sight pattern corresponding to the virtual shooting prop.
Step 205: and controlling the sight bead pattern to move towards the target object in response to the aiming operation triggered by the virtual shooting prop for the target object.
In practical application, a terminal receives sliding operation of a picture aiming at a virtual scene; and responding to the sliding operation, moving the picture of the virtual scene corresponding to the sight pattern so as to control the sight pattern to move towards the target object. The sliding operation is used for triggering the aiming operation of the virtual shooting prop for the target object.
Here, the user may aim at the target object by sliding the picture of the virtual scene presented in the view interface. The terminal receives the user's sliding operation on the picture of the virtual scene and moves the picture corresponding to the sight pattern, so that the sight pattern moves towards the target object and the aiming operation on the target object is achieved.
Step 206: detecting whether the sight pattern moves to a target area with the target object as the center, if so, executing step 207; if not, return to step 205.
Step 207: a resistance to the sight-star pattern in the direction of movement is generated.
Here, in the embodiment of the present invention, to help the user aim accurately at the target object, the position of the sight pattern relative to the target object is detected. When the sight pattern is detected to have moved into the target area (i.e., the friction area) as a result of the aiming operation, the user's aiming position is considered to be near the target object; see fig. 9, a schematic diagram of the sight pattern within the target area provided by an embodiment of the present invention.
At this time, a resistance (i.e., a friction force) is generated in the moving direction of the sight pattern. To the user, this resistance appears as a decrease in sensitivity when sliding the screen to aim at the target object: with the sight pattern inside the target area, the same sliding distance produces a smaller deflection of the angle of view in the virtual scene than it would outside the target area.
Referring to fig. 14, fig. 14 is a schematic diagram of the target area and the resistance range provided by an embodiment of the invention. When generating the resistance against the sight pattern in its moving direction, a target area (i.e., the range within which resistance is generated) is first set. The target area may be a rectangular, square, circular or similar region centered on the target object, and the configured value (for example, the value 60 shown in fig. 14) may be the radius of a circular target area, the side length of a square target area, or the area of the target area.
Within the target area, the resistance need not be a constant value. Therefore, maximum and minimum values are set so that the resistance changes gradually from the boundary of the target area to its center (i.e., from the outside inward); as shown in fig. 14, this is achieved by setting the maximum value (1) and the minimum value (0.3) of the multiplier applied to the moving speed of the sight pattern once the resistance takes effect.
In practical application, the distance between the sight pattern and the target object may be acquired, and the resistance against the sight pattern in its moving direction is generated based on a negative correlation between the distance and the resistance: the closer the sight pattern is to the target object, the higher the resistance; the farther away, the lower the resistance, as shown in fig. 11, the schematic diagram of the relationship between distance and resistance provided by an embodiment of the present invention. Because the resistance changes gradually, the user does not experience an abrupt change while aiming but instead smoothly feels the presence of a stiction-like drag.
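Using the example values from fig. 14 (multiplier 1 at the boundary of the target area, 0.3 at its center) and assuming linear interpolation, the speed multiplier applied to the sight pattern might be sketched as:

```python
# Sketch (assumed linear interpolation): speed multiplier gradient across the target area.
def speed_multiplier(distance: float, max_distance: float,
                     m_min: float = 0.3, m_max: float = 1.0) -> float:
    max_distance = max(max_distance, 1e-6)
    t = min(max(distance / max_distance, 0.0), 1.0)  # 0 at the target object, 1 at the boundary
    return m_min + (m_max - m_min) * t

# Example: halfway between the target object and the boundary -> 0.3 + 0.7 * 0.5 = 0.65
```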
In practical application, the target moving direction of the sight pattern relative to the target object and the operation speed corresponding to the aiming operation can be obtained; based on the target moving direction and the operating speed, a resistance to the sight-star pattern in the moving direction is generated. Specifically, when the target moving direction is the direction in which the sight-mark pattern moves closer to the target object, the resistance to the sight-mark pattern in the moving direction is generated based on the positive correlation between the operating speed and the resistance; when the target moving direction is a direction in which the sight pattern moves away from the target object, a resistance against the sight pattern in the moving direction is generated based on a negative correlation relationship between the operation speed and the resistance.
In practical implementation, when the user performs a screen-sliding operation, the embodiment of the invention changes the resistance according to a gesture judgment, so that the user neither feels a strong friction before the target is actually aimed at nor is held back by the friction when intending to move the angle of view away from the target. When the user's sliding direction (i.e. the target moving direction of the sight pattern relative to the target object) is towards the target object, the resistance gradually increases, and the faster the user's gesture slides (i.e. the higher the operation speed corresponding to the aiming operation), the greater the resistance, so that the user does not quickly slide past the target object. When the user's sliding direction is away from the target object, the resistance gradually decreases, and the faster the gesture slides, the smaller the resistance, so that the user is not hindered from moving the sight pattern of the virtual shooting prop. This is illustrated in fig. 12, the schematic diagram of the relationship between the moving direction, the operation speed and the resistance provided by an embodiment of the present invention.
Step 208: and controlling the virtual shooting prop to aim at the target object based on the resistance.
Referring next to fig. 15, fig. 15 is a schematic diagram of the generation of resistance provided by an embodiment of the present invention. Here, whether the sight pattern is within the target area, the moving direction of the sight pattern, the operation speed at which the user controls the movement of the sight pattern, and the distance between the sight pattern and the target object may be detected in real time to perform the generation and output of the resistance based on the detection result.
Step 301: and the user starts to slide the screen, and the aiming operation aiming at the target object is triggered through the screen sliding operation.
Step 302: whether the sight pattern moves to the target area is detected, if so, step 303 is executed, and if not, step 310 is executed.
Here, the target area is centered on the target object.
Step 303: acquiring the target moving direction of the sight pattern relative to the target object, and determining whether the sight pattern moves towards the direction close to the target object, if so, executing step 304, otherwise, executing step 307.
Step 304: acquiring the operation speed corresponding to the aiming operation, and determining the base resistance using the approaching parameters (i.e., a positive correlation between the operation speed and the resistance).
Step 305: acquiring the distance between the sight pattern and the target object, and combining the distance with the base resistance to generate the resistance against the sight pattern in its moving direction.
Here, the resistance coefficient corresponding to the sight pattern may be determined according to the distance between the sight pattern and the target object and the maximum distance between the target object and the boundary of the target area, and specifically, the corresponding resistance coefficient may be determined according to a proportional relationship between the distance and the maximum distance; and multiplying the basic resistance and the resistance coefficient to obtain the resistance magnitude of the sight bead pattern, so as to generate the resistance in the moving direction of the sight bead pattern based on the resistance magnitude.
Step 306: and (4) judging whether the screen is continuously slid (namely whether the sight bead pattern is continuously moved), if so, returning to the step 303, and if not, ending.
Step 307: acquiring the operation speed corresponding to the aiming operation, and determining the base resistance using the moving-away parameters (i.e., a negative correlation between the operation speed and the resistance).
Step 308: acquiring the distance between the sight pattern and the target object, and combining the distance with the base resistance to generate the resistance against the sight pattern in its moving direction.
Step 309: and (4) judging whether the screen is continuously slid (namely whether the sight bead pattern is continuously moved), if so, returning to the step 302, and if not, ending.
Step 310: no resistance is generated against the sight-star pattern in the direction of movement.
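Putting steps 301 to 310 together, one per-frame evaluation might be sketched as below; every constant, every identifier and the mapping from slide direction to sight movement are assumptions rather than the disclosed implementation.

```python
# Sketch (assumed names and constants) of the flow in steps 301-310, evaluated each frame.
import math

def frame_resistance(sliding: bool, in_target_area: bool,
                     sight_pos: tuple[float, float], target_pos: tuple[float, float],
                     slide_vector: tuple[float, float], slide_speed: float,
                     max_distance: float) -> float:
    if not sliding or not in_target_area:                 # steps 302 / 306 / 309 -> 310
        return 0.0
    dx, dy = target_pos[0] - sight_pos[0], target_pos[1] - sight_pos[1]
    distance = math.hypot(dx, dy)
    # Step 303: assume the sight moves closer to the target when the slide vector has a
    # positive component along the sight-to-target direction.
    towards = (slide_vector[0] * dx + slide_vector[1] * dy) > 0.0
    # Steps 304 / 307: base resistance from the approaching or moving-away parameters.
    base = min(max(0.4 + (0.02 if towards else -0.02) * slide_speed, 0.0), 0.9)
    # Steps 305 / 308: resistance coefficient from the distance ratio.
    coefficient = 1.0 - min(distance / max(max_distance, 1e-6), 1.0)
    return base * coefficient
```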
By applying the embodiment of the invention, a resistance-based auxiliary aiming scheme is introduced. When the user aims by sliding the screen left and right, a resistance is added in the moving direction of the sight pattern as the aiming position (i.e., the position of the sight pattern) gradually approaches the target object. This helps the user decelerate and produces a sticky feel, prolongs the time the aim dwells on the target object, and prevents the sight from quickly sliding past it, so that the user can conveniently control the virtual shooting prop to aim accurately at the target object. Moreover, this viscous resistance does not take control away from the user and is barely perceptible during aiming, which improves the user experience.
Continuing with the description of the aiming device 455 of the virtual shooting prop provided in the embodiments of the present invention, in some embodiments, the aiming device of the virtual shooting prop may be implemented by a software module. Referring to fig. 16, fig. 16 is a schematic structural diagram of the aiming device 455 of the virtual shooting prop provided by the embodiment of the present invention, and the aiming device 455 of the virtual shooting prop provided by the embodiment of the present invention includes:
A presentation module 4551, configured to present a virtual shooting prop in a virtual scene and a sight pattern corresponding to the virtual shooting prop;
a control module 4552 configured to control the sight star pattern to move toward the target object in response to a targeting operation for the target object triggered based on the virtual shooting prop;
a generating module 4553, configured to generate a resistance in a moving direction for the sight pattern when the sight pattern moves to a target area centered on the target object, so as to control the virtual shooting prop to aim the target object based on the resistance.
In some embodiments, the presenting module 4551 is further configured to present, in the interface of the virtual scene, an operation control of the virtual shooting prop;
and when the operation control is in an activated state, responding to the triggering operation aiming at the operation control, and presenting the virtual shooting prop and the sight pattern corresponding to the virtual shooting prop.
In some embodiments, the presenting module 4551 is further configured to present a selection interface including at least two candidate virtual shooting props;
and receiving a selection operation aiming at the candidate virtual shooting prop triggered based on the selection interface, and taking the selected candidate virtual shooting prop as the virtual shooting prop.
In some embodiments, the control module 4552 is further configured to present an aiming control function item corresponding to the virtual shooting prop;
controlling the sight star pattern to move towards the target object in response to a targeting operation for the target object triggered based on the targeting control function item.
In some embodiments, the control module 4552 is further configured to receive a sliding operation for a screen of the virtual scene, where the sliding operation is used to trigger a targeting operation of the virtual shooting prop for the target object;
and responding to the sliding operation, moving the picture of the virtual scene corresponding to the sight bead pattern so as to control the sight bead pattern to move towards the target object.
In some embodiments, the apparatus further comprises:
the detection module is used for determining an area detection frame corresponding to the target object, and the area detection frame corresponds to the target area;
acquiring corresponding position information of the sight bead pattern in the virtual scene;
when it is determined that the sight bead pattern is located within the area detection frame based on the position information, it is determined that the sight bead pattern moves to the target area.
In some embodiments, the generating module 4553 is further configured to acquire a distance between the sight bead pattern and the target object;
Generating a resistance force for the sight bead pattern in the direction of movement based on a negative correlation between the distance and the resistance force.
In some embodiments, the generating module 4553 is further configured to acquire a target moving direction of the sight pattern relative to the target object and an operation speed corresponding to the aiming operation;
generating a resistance force for the sight-star pattern in the direction of movement based on the target direction of movement and the operating speed.
In some embodiments, the generating module 4553 is further configured to generate a resistance to the sight pattern in the moving direction based on a positive correlation between the operating speed and the resistance when the target moving direction is the movement of the sight pattern in the direction approaching the target object;
when the target moving direction is that the sight-star pattern moves in a direction away from the target object, a resistance force for the sight-star pattern in the moving direction is generated based on a negative correlation relationship between the operating speed and the resistance force.
In some embodiments, the generating module 4553 is further configured to acquire a target moving direction of the sight pattern relative to the target object, an operation speed corresponding to the aiming operation, and a distance between the sight pattern and the target object;
Generating a resistance to the sight-star pattern in the direction of movement in combination with the target direction of movement, the operating speed, and the distance.
In some embodiments, the generating module 4553 is further configured to generate a base resistance of the sight bead pattern in the moving direction based on the target moving direction and the operating speed;
determining a resistance coefficient corresponding to the sight bead pattern based on the distance and the maximum distance between the target object and the boundary of the target area;
generating a resistance force for the sight bead pattern in a moving direction based on the base resistance force and the resistance coefficient.
In some embodiments, the presentation module is further configured to present an auxiliary targeting function item;
in response to an on instruction for the auxiliary aiming function item, adjusting the mode of aiming operation to be an auxiliary aiming mode;
the generating module 4553 is further configured to generate a resistance in a moving direction for the sight pattern when the sight pattern moves to a target area centered on the target object and the mode of the aiming operation is an auxiliary aiming mode, so as to control the virtual shooting prop to aim the target object based on the resistance.
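For orientation only, the three-module structure of the aiming device 455 could be skeletonized as follows; all class and method names are assumptions.

```python
# Sketch (assumed names): module layout mirroring the presentation/control/generation split.
class PresentationModule:                      # module 4551
    def present_prop_and_sight(self, scene) -> None: ...

class ControlModule:                           # module 4552
    def on_aiming_operation(self, slide_event) -> None: ...

class GenerationModule:                        # module 4553
    def generate_resistance(self, sight_pos, target_pos, slide_speed) -> float: ...

class VirtualShootingPropAimingDevice:
    def __init__(self) -> None:
        self.presentation = PresentationModule()
        self.control = ControlModule()
        self.generation = GenerationModule()
```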
By applying the embodiment of the invention, the virtual shooting prop in the virtual scene and its corresponding sight pattern are presented; when an aiming operation for a target object triggered based on the virtual shooting prop is received, the sight pattern is controlled to move towards the target object in response to the aiming operation; and when the sight pattern moves into a target area centered on the target object, a resistance against the sight pattern in its moving direction is generated, so that the virtual shooting prop is controlled to aim at the target object based on the resistance. That is, during aiming at the target object, if the sight pattern moves into the area near the target object, the resistance generated in its moving direction reduces its moving speed, which makes it easier to control the virtual shooting prop to aim at the target object accurately, reduces the number of interactions required to achieve the interaction goal, improves human-computer interaction efficiency, and reduces the occupation of hardware processing resources.
An embodiment of the present invention further provides an electronic device, where the electronic device includes:
a memory for storing executable instructions;
and the processor is used for implementing the aiming method of the virtual shooting prop provided by the embodiment of the invention when the executable instructions stored in the memory are executed.
Embodiments of the present invention also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to enable the computer device to execute the aiming method of the virtual shooting prop provided by the embodiment of the invention.
The embodiment of the invention also provides a computer-readable storage medium, which stores executable instructions, and when the executable instructions are executed by a processor, the method for aiming the virtual shooting prop provided by the embodiment of the invention is realized.
In some embodiments, the computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disc, or CD-ROM, or may be a device including one or any combination of the above memories. The computer may be any of a variety of computing devices, including intelligent terminals and servers.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
The above description is only an example of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present invention are included in the protection scope of the present invention.

Claims (13)

1. A method of aiming a virtual shooting prop, the method comprising:
presenting a virtual shooting prop in a virtual scene and a sight bead pattern corresponding to the virtual shooting prop;
controlling the sight bead pattern to move towards the target object in response to an aiming operation for the target object triggered based on the virtual shooting prop;
when the sight bead pattern moves to a target area which takes the target object as a center, acquiring a target moving direction of the sight bead pattern relative to the target object, an operation speed corresponding to the aiming operation and a distance between the sight bead pattern and the target object;
generating a base resistance of the sight bead pattern in a target moving direction based on the target moving direction and the operating speed;
determining a resistance coefficient corresponding to the sight bead pattern based on the distance and the maximum distance of the target object from the boundary of the target area;
generating a resistance force for the sight bead pattern in the target moving direction based on the base resistance force and the resistance coefficient;
and controlling the virtual shooting prop to aim at the target object based on the resistance.
2. The method of claim 1, wherein the presenting of the virtual firing prop in the virtual scene and the corresponding sight pattern of the virtual firing prop comprises:
Presenting an operation control of the virtual shooting prop in an interface of the virtual scene;
and when the operation control is in an activated state, responding to the triggering operation aiming at the operation control, and presenting the virtual shooting prop and the sight pattern corresponding to the virtual shooting prop.
3. The method of claim 1, wherein prior to presenting the virtual firing prop in the virtual scene and the corresponding sight pattern of the virtual firing prop, the method further comprises:
presenting a selection interface comprising at least two candidate virtual shooting props;
and receiving a selection operation aiming at the candidate virtual shooting prop triggered based on the selection interface, and taking the selected candidate virtual shooting prop as the virtual shooting prop.
4. The method of claim 1, wherein the controlling the movement of the sight pattern toward the target object in response to the targeting operation for the target object triggered based on the virtual shooting prop comprises:
presenting an aiming control function item corresponding to the virtual shooting prop;
controlling the sight star pattern to move towards the target object in response to a targeting operation for the target object triggered based on the targeting control function item.
5. The method of claim 1, wherein the controlling the movement of the sight pattern toward the target object in response to the targeting operation for the target object triggered based on the virtual shooting prop comprises:
receiving a sliding operation of a picture of the virtual scene, wherein the sliding operation is used for triggering the aiming operation of the virtual shooting prop for the target object;
and responding to the sliding operation, moving the picture of the virtual scene corresponding to the sight bead pattern so as to control the sight bead pattern to move towards the target object.
6. The method of claim 1, wherein the method further comprises:
determining an area detection frame corresponding to the target object, wherein the area detection frame corresponds to the target area;
acquiring corresponding position information of the sight bead pattern in the virtual scene;
when it is determined that the sight bead pattern is located within the area detection frame based on the position information, it is determined that the sight bead pattern moves to the target area.
7. The method of claim 1, wherein generating a resistance for the sight-star pattern in the direction of target movement based on the base resistance and the resistance coefficient comprises:
Generating a resistance force for the sight bead pattern in the target movement direction based on the base resistance force, the resistance coefficient, and a negative correlation between the distance and the base resistance force.
8. The method of claim 1, wherein generating a resistance for the sight-star pattern in the direction of target movement based on the base resistance and the resistance coefficient comprises:
generating a resistance force for the sight bead pattern in the target movement direction based on the base resistance force, the resistance coefficient, the target movement direction, and the operating speed.
9. The method of claim 8, wherein generating the resistance for the sight-bead pattern in the target movement direction based on the base resistance, the resistance coefficient, the target movement direction, and the operating speed comprises:
when the target moving direction is the direction in which the sight bead pattern moves closer to the target object, generating a resistance to the sight bead pattern in the target moving direction based on the base resistance, the resistance coefficient, and a positive correlation between the operating speed and the base resistance;
when the target moving direction is the direction in which the sight bead pattern moves away from the target object, generating a resistance against the sight bead pattern in the target moving direction based on the base resistance, the resistance coefficient, and a negative correlation between the operating speed and the base resistance.
10. The method of claim 1, wherein the method further comprises:
presenting an auxiliary aiming function item;
in response to an on instruction for the auxiliary aiming function item, adjusting the mode of aiming operation to be an auxiliary aiming mode;
the generating a resistance to the sight pattern in a moving direction when the sight pattern moves to a target area centered on the target object, comprising:
when the sight pattern moves to a target area centered on the target object and the mode of the aiming operation is an auxiliary aiming mode, generating resistance to the sight pattern in the target movement direction to control the virtual shooting prop to aim the target object based on the resistance.
11. An aiming device for a virtual shooting prop, the device comprising:
The presentation module is used for presenting the virtual shooting props in the virtual scene and the sight patterns corresponding to the virtual shooting props;
the control module is used for responding to aiming operation aiming at a target object triggered based on the virtual shooting prop and controlling the sight bead pattern to move towards the target object;
a generation module, configured to acquire a target movement direction of the sight pattern relative to the target object, an operation speed corresponding to the aiming operation, and a distance between the sight pattern and the target object when the sight pattern moves to a target area centered on the target object; generating a base resistance of the sight bead pattern in a target moving direction based on the target moving direction and the operating speed; determining a resistance coefficient corresponding to the sight bead pattern based on the distance and the maximum distance of the target object from the boundary of the target area; generating a resistance force for the sight bead pattern in the target movement direction based on the base resistance force and the resistance coefficient; and controlling the virtual shooting prop to aim at the target object based on the resistance.
12. An electronic device, characterized in that the electronic device comprises:
a memory for storing executable instructions;
a processor for implementing the method of aiming a virtual shooting prop as claimed in any one of claims 1 to 10 when executing executable instructions stored in the memory.
13. A computer-readable storage medium, having stored thereon executable instructions for, when executed, implementing a method of aiming a virtual shooting prop as claimed in any one of claims 1 to 10.
CN202011170818.XA 2020-10-28 2020-10-28 Virtual shooting prop aiming method and device, electronic equipment and storage medium Active CN112138385B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011170818.XA CN112138385B (en) 2020-10-28 2020-10-28 Virtual shooting prop aiming method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011170818.XA CN112138385B (en) 2020-10-28 2020-10-28 Virtual shooting prop aiming method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112138385A CN112138385A (en) 2020-12-29
CN112138385B true CN112138385B (en) 2022-07-29

Family

ID=73953427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011170818.XA Active CN112138385B (en) 2020-10-28 2020-10-28 Virtual shooting prop aiming method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112138385B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113398574B (en) * 2021-07-13 2024-04-30 网易(杭州)网络有限公司 Auxiliary aiming adjustment method, auxiliary aiming adjustment device, storage medium and computer equipment
CN113577766B (en) * 2021-08-05 2024-04-02 百度在线网络技术(北京)有限公司 Object processing method and device
CN114344880A (en) * 2022-01-10 2022-04-15 腾讯科技(深圳)有限公司 Method and device for controlling foresight in virtual scene, electronic equipment and storage medium
CN116115991A (en) * 2023-02-08 2023-05-16 网易(杭州)网络有限公司 Aiming method, aiming device, computer equipment and storage medium
CN116370958A (en) * 2023-04-10 2023-07-04 上海网之易璀璨网络科技有限公司 Shooting control method and device in game, electronic equipment and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN107469353A (en) * 2017-08-02 2017-12-15 网易(杭州)网络有限公司 Method of adjustment, device and the terminal device of game camera lens
CN108744513A (en) * 2018-04-24 2018-11-06 网易(杭州)网络有限公司 Method of sight, device, electronic equipment in shooting game and storage medium
CN109701280A (en) * 2019-01-24 2019-05-03 网易(杭州)网络有限公司 The control method and device that foresight is shown in a kind of shooting game
CN111659118A (en) * 2020-07-10 2020-09-15 腾讯科技(深圳)有限公司 Prop control method and device, storage medium and electronic equipment

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP2002095863A (en) * 2000-07-03 2002-04-02 Sony Computer Entertainment Inc Program exciting system, program exciting apparatus, recording medium and program, and method for switching viewpoint and method for switching sight

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN107469353A (en) * 2017-08-02 2017-12-15 网易(杭州)网络有限公司 Method of adjustment, device and the terminal device of game camera lens
CN108744513A (en) * 2018-04-24 2018-11-06 网易(杭州)网络有限公司 Method of sight, device, electronic equipment in shooting game and storage medium
CN109701280A (en) * 2019-01-24 2019-05-03 网易(杭州)网络有限公司 The control method and device that foresight is shown in a kind of shooting game
CN111659118A (en) * 2020-07-10 2020-09-15 腾讯科技(深圳)有限公司 Prop control method and device, storage medium and electronic equipment

Non-Patent Citations (2)

Title
BUNGIE's feel is superb! Laying the console FPS groundwork and re-popularizing aim assist; Internet; <https://bbs.a9vg.com/forum.php?mod=viewthread&tid=4248464>; 2014-09-16; forum post, floor 1 *
A necessary primer: aim assist; Wutongying (梧桐影); <https://tieba.baidu.com/p/6323938483>; 2019-11-04; forum post, floor 1 *

Also Published As

Publication number Publication date
CN112138385A (en) 2020-12-29

Similar Documents

Publication Publication Date Title
CN112090069B (en) Information prompting method and device in virtual scene, electronic equipment and storage medium
CN112138385B (en) Virtual shooting prop aiming method and device, electronic equipment and storage medium
CN112076473B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN113797536B (en) Control method, device, equipment and storage medium for objects in virtual scene
KR102706744B1 (en) Method and apparatus, device, storage medium and program product for controlling virtual objects
CN112402960B (en) State switching method, device, equipment and storage medium in virtual scene
CN112121414B (en) Tracking method and device in virtual scene, electronic equipment and storage medium
CN112295230B (en) Method, device, equipment and storage medium for activating virtual props in virtual scene
JP7447296B2 (en) Interactive processing method, device, electronic device and computer program for virtual tools
CN112057860B (en) Method, device, equipment and storage medium for activating operation control in virtual scene
CN113633964B (en) Virtual skill control method, device, equipment and computer readable storage medium
CN112402959A (en) Virtual object control method, device, equipment and computer readable storage medium
CN112121434A (en) Interaction method and device of special effect prop, electronic equipment and storage medium
CN113101667A (en) Virtual object control method, device, equipment and computer readable storage medium
CN113559510B (en) Virtual skill control method, device, equipment and computer readable storage medium
CN112156472B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN112870702B (en) Recommendation method, device and equipment for road resources in virtual scene and storage medium
CN112121432B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN112870694B (en) Picture display method and device of virtual scene, electronic equipment and storage medium
CN113144617B (en) Control method, device and equipment of virtual object and computer readable storage medium
CN112121433B (en) Virtual prop processing method, device, equipment and computer readable storage medium
CN113769379A (en) Virtual object locking method, device, equipment, storage medium and program product
CN114130006A (en) Control method, device, equipment, storage medium and program product of virtual prop
CN114146414A (en) Virtual skill control method, device, equipment, storage medium and program product
CN114146413B (en) Virtual object control method, device, equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant