CN114390199A - Shooting method and electronic equipment - Google Patents
- Publication number
- CN114390199A (application CN202210022150.7A)
- Authority
- CN
- China
- Prior art keywords
- resolution
- image
- electronic device
- camera
- video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/632—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
Abstract
A shooting method and an electronic device relate to the field of terminal technologies. The method is applied to an electronic device that includes a camera and a display screen, and specifically includes: displaying, in response to a first operation, a preview interface of a first video, where the resolution of the first video is a first resolution; setting, in response to a second operation, the resolution of the video to a second resolution, where the aspect ratio of the second resolution is greater than the aspect ratio of the first resolution; and displaying, in response to a third operation, a preview interface of a second video, where the resolution of the second video is the second resolution. This technical solution helps improve the efficiency of obtaining a video or picture and reduces labor costs.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a shooting method and an electronic device.
Background
At present, the aspect ratio of the resolution of videos shot by electronic devices such as mobile phones and tablet computers is usually 4:3 or 16:9. In the prior art, if a wide-frame video with an aspect ratio of 21:9 or more is desired, a video with an aspect ratio of 4:3 or 16:9 is first recorded on an electronic device such as a mobile phone, and the recorded video then has to be manually processed into a wide-frame video with an aspect ratio of 21:9 or more by software.
Therefore, this method of obtaining a wide-frame video has a high labor cost and low efficiency.
Disclosure of Invention
This application provides a shooting method and an electronic device, which help improve the efficiency of obtaining videos or pictures and reduce labor costs.
In a first aspect, a shooting method provided in an embodiment of the present application is applied to an electronic device, where the electronic device includes a camera and a display screen, and the method includes:
displaying, in response to a first operation, a preview interface of a first video, where the resolution of the first video is a first resolution;
setting, in response to a second operation, the resolution of the video to a second resolution, where the aspect ratio of the second resolution is greater than the aspect ratio of the first resolution; and
displaying, in response to a third operation, a preview interface of a second video, where the resolution of the second video is the second resolution.
Because the electronic device in this embodiment of the application can display the preview interface of a video according to the resolution of the video, the electronic device can directly shoot images that meet the resolution requirement, for example, images whose resolution has an aspect ratio greater than or equal to 2:1. Compared with the prior art, in which a shot image is manually post-processed to obtain an image meeting the resolution requirement, this helps improve the efficiency of obtaining a video or picture and reduces labor costs.
In one possible design, the second video is obtained by cropping or stretching the image captured by the camera according to the second resolution. This helps simplify the way a video meeting the second resolution is obtained.
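By way of illustration only, the following Python sketch shows one way the cropping idea in this design could be realized: it computes a centred crop rectangle that converts a frame of one resolution to the aspect ratio of a second resolution (the stretch alternative would simply resize without cropping). The function name and the choice of a centred crop are assumptions for illustration and are not part of the claimed method.

```python
from fractions import Fraction

def center_crop_box(src_w, src_h, dst_w, dst_h):
    """Return (left, top, right, bottom) of a centred crop of the source
    frame whose aspect ratio matches the target resolution."""
    src_ar = Fraction(src_w, src_h)
    dst_ar = Fraction(dst_w, dst_h)
    if dst_ar > src_ar:
        # Target is wider: keep the full width, trim the height.
        new_h = int(src_w / dst_ar)
        top = (src_h - new_h) // 2
        return (0, top, src_w, top + new_h)
    # Target is narrower or equal: keep the full height, trim the width.
    new_w = int(src_h * dst_ar)
    left = (src_w - new_w) // 2
    return (left, 0, left + new_w, src_h)

# Example: a 4:3 frame (640 x 480) cropped for the 21:9-class 2560 x 1080 target.
print(center_crop_box(640, 480, 2560, 1080))  # -> (0, 105, 640, 375)
```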
In a possible design, when the camera includes a first camera and a second camera, and the range within which the second camera captures an image is larger than the range within which the first camera captures an image, the second video is obtained by cropping or stretching an image captured by the second camera based on the second resolution. With this technical solution, the acquired image with a large-aspect-ratio resolution can present more content to the user.
In one possible design, the aspect ratio of the second resolution is greater than a preset threshold. This helps simplify the implementation of camera switching.
In one possible design, when the aspect ratio of the first resolution is less than or equal to the preset threshold, the first video is obtained by processing the image captured by the first camera based on the first resolution. This helps simplify the image processing of the electronic device.
In one possible design, the electronic device switches from the first camera to the second camera after the resolution of the video is set to the second resolution.
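A minimal sketch of the camera-selection rule described in the designs above, assuming a 2:1 aspect-ratio threshold and illustrative camera names (neither value is specified by this application):

```python
# Assumed threshold and camera identifiers, for illustration only.
PRESET_ASPECT_THRESHOLD = 2.0  # e.g. an aspect ratio of 2:1

def select_camera(width, height, first_camera="first", second_camera="second"):
    """Use the second (wider-view) camera when the aspect ratio of the
    selected resolution exceeds the preset threshold."""
    return second_camera if width / height > PRESET_ASPECT_THRESHOLD else first_camera

print(select_camera(1920, 1080))  # 16:9  -> 'first'
print(select_camera(2560, 1080))  # ~2.37 -> 'second'
```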
In one possible design, the display screen displays a prompt message for prompting the user that the camera is switched from the second camera to the first camera. This makes it convenient for the user to learn the working state of the electronic device.
In some embodiments, the aspect ratio of the first resolution is 4:3, 16:9, or 2:1.
In some embodiments, the aspect ratio of the second resolution is 21:9, 2.37:1, 2.39:1, 2.55:1, 2.59:1, or 2.66:1.
In some embodiments, the electronic device automatically starts a skin-beautifying function if it detects that the image acquired by the camera includes a face image. This helps improve the user experience.
In a second aspect, an embodiment of the present application provides an electronic device, including: a display screen, a camera, one or more processors, a memory, a plurality of applications, and one or more computer programs, where the one or more computer programs are stored in the memory and, when executed by the electronic device, cause the electronic device to implement the method provided in the first aspect of the embodiments of the present application and any possible design of the first aspect.
In a third aspect, a chip provided in an embodiment of the present application is coupled to a memory in an electronic device, so that when running, the chip invokes a computer program stored in the memory to implement the method provided in the first aspect of the embodiments of the present application and any possible design of the first aspect.
In a fourth aspect, a computer storage medium according to an embodiment of the present application stores a computer program that, when run on an electronic device, causes the electronic device to perform the method provided in the first aspect and any possible design of the first aspect.
In a fifth aspect, a computer program product according to an embodiment of the present application, when run on an electronic device, causes the electronic device to perform the method provided in the first aspect and any possible design of the first aspect.
In addition, for the technical effects brought by any possible design in the second aspect to the fifth aspect, refer to the technical effects brought by the corresponding designs in the first aspect; details are not described herein again.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of compressing an image in the width direction according to an embodiment of the present application;
Fig. 4 is a schematic diagram of cameras according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a user interface according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a video setting interface according to an embodiment of the present application;
Fig. 7a is a schematic diagram of a video preview interface when an image with a resolution of 640 × 480 is displayed in landscape mode according to an embodiment of the present application;
Fig. 7b is a schematic diagram of a video preview interface when an image with a resolution of 2560 × 1080 is displayed in landscape mode according to an embodiment of the present application;
Fig. 7c is a schematic diagram of a preview interface when a video mode selection menu is displayed according to an embodiment of the present application;
Fig. 7d is a schematic diagram of a video recording interface according to an embodiment of the present application;
Fig. 8 is a schematic diagram of a rear camera according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a user interface for prompting that a camera is started according to an embodiment of the present application;
Fig. 10 is a schematic diagram of a photo setting interface according to an embodiment of the present application;
Fig. 11 is a schematic diagram of a photo preview interface when an image with a resolution of 2560 × 1080 is displayed in landscape mode according to an embodiment of the present application;
Fig. 12 is a flowchart of a shooting method according to an embodiment of the present application;
Fig. 13 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
It should be understood that "at least one" in the embodiments of the present application means one or more, and "a plurality of" means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist. For example, "A and/or B" may represent the following three cases: only A exists, both A and B exist, and only B exists, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" or a similar expression refers to any combination of the listed items, including any combination of a single item or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
It should be understood that the embodiments of the application can be applied to an electronic device. For example, the electronic device may be a portable electronic device, such as a mobile phone, a tablet computer, a wearable device with a wireless communication function (e.g., a smart watch), or an in-vehicle device. Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices running various operating systems. The portable electronic device may also be another type of portable electronic device, such as a laptop computer with a touch-sensitive surface (e.g., a touch panel). It should also be understood that in other embodiments of the present application, the electronic device may instead be a desktop computer with a touch-sensitive surface (e.g., a touch panel).
Generally, the electronic device of the embodiment of the present application can support a plurality of applications. Such as one or more of the following applications: camera, drawing, presentation, word processing, gaming, telephony, video player, music player, email, instant messaging, gallery, browser, calendar, clock, payment, and health management applications.
For example, fig. 1 shows a hardware structure diagram of an electronic device to which an embodiment of the present application is applicable. As shown in fig. 1, the electronic device includes a processor 110, an internal memory 121, an external memory interface 122, an antenna 1, a mobile communication module 131, an antenna 2, a wireless communication module 132, an audio module 140, a speaker 140A, a receiver 140B, a microphone 140C, an earphone interface 140D, a display screen 151, a Subscriber Identity Module (SIM) card interface 152, a camera 153, a key 154, a sensor module 160, a Universal Serial Bus (USB) interface 170, a charging management module 180, a power management module 181, and a battery 182. In other embodiments, the electronic device may also include a motor, an indicator, and the like.
In some embodiments, a memory may also be provided in the processor 110 for storing instructions and data. By way of example, the memory in the processor 110 may be a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which helps avoid repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
The internal memory 121 may be used to store computer-executable program code, where the executable program code includes instructions. The processor 110 executes various functional applications of the electronic device and performs data processing by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The external memory interface 122 may be used to connect an external memory card (e.g., a Micro SD card) to extend the storage capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 122 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 131 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device. The mobile communication module 131 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 131 can receive the electromagnetic wave signal from the antenna 1, and perform filtering, amplification, and other processing on the received electromagnetic wave signal, and transmit the electromagnetic wave signal to the modem processor for demodulation. The mobile communication module 131 can also amplify the signal modulated by the modem processor, and convert the signal into an electromagnetic wave signal through the antenna 1 to radiate the electromagnetic wave signal. In some embodiments, at least part of the functional modules of the mobile communication module 131 may be provided in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 131 may be disposed in the same device as at least some of the modules of the processor 110. For example, the mobile communication module 131 may transmit and receive voice to and from other electronic devices.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 140A, the receiver 140B, etc.) or displays an image or video through the display screen 151. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 131 or other functional modules, independent of the processor 110.
The wireless communication module 132 may provide a solution for wireless communication applied to an electronic device, including Wireless Local Area Networks (WLANs), such as Wi-Fi networks, Bluetooth (BT), Global Navigation Satellite Systems (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 132 may be one or more devices integrating at least one communication processing module. The wireless communication module 132 receives the electromagnetic wave signal via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signal, and transmits the processed signal to the processor 110. The wireless communication module 132 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into an electromagnetic wave signal through the antenna 2 to radiate the signal.
In some embodiments, antenna 1 of the electronic device is coupled to the mobile communication module 131 and antenna 2 is coupled to the wireless communication module 132 so that the electronic device can communicate with the network and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device may implement audio functions through the audio module 140, the speaker 140A, the receiver 140B, the microphone 140C, the headphone interface 140D, the application processor, and the like. Such as music playing, recording, etc.
The audio module 140 may be used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 140 may also be used to encode and decode audio signals. In some embodiments, the audio module 140 may be disposed in the processor 110, or some functional modules of the audio module 140 may be disposed in the processor 110.
The speaker 140A, also called a "horn", is used to convert audio electrical signals into sound signals. The electronic device can listen to music or a hands-free call through the speaker 140A.
The receiver 140B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device answers a call or voice information, the receiver 140B can be close to the ear to answer the voice.
The microphone 140C, also known as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user may speak with the mouth close to the microphone 140C, which captures the user's voice and converts it into an electrical signal. The electronic device may be provided with at least one microphone 140C. In other embodiments, the electronic device may be provided with two microphones 140C, which, in addition to collecting sound signals, can implement a noise reduction function. In other embodiments, the electronic device may further include three, four, or more microphones 140C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and the like.
The headphone interface 140D is used to connect wired headphones. The headphone interface 140D may be the USB interface 170, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface, a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface, or the like.
The electronic device may implement display functions through the GPU, the display screen 151, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 151 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 151 may be used to display images, videos, and the like. The display screen 151 may include a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 151, where N is a positive integer greater than 1.
The electronic device may implement a photographing function through the camera 153, the ISP, the DSP, the video codec, the display screen 151, and the application processor, etc.
The camera 153 may be used to capture still images or video. Illustratively, as shown in fig. 2, the camera 153 includes a lens and an image sensor. The lens is used to converge light so as to collect an optical image, and the collected optical image is projected onto the image sensor for imaging. The lens may be a standard lens, an anamorphic lens, or a lens with other characteristics, which is not limited here. A standard lens images the collected image onto the image sensor in equal proportion, that is, the width-to-height proportion of the image collected by the standard lens is the same as the width-to-height proportion of the image after it is imaged on the image sensor. Unlike a standard lens, an anamorphic lens compresses the collected optical image in the width direction before it is imaged on the image sensor. For example, as shown in fig. 3, the range collected by the anamorphic lens in the width direction and in the height direction is the range shown in the image 301, and the image 302 is the image obtained after the anamorphic lens images the image 301 onto the image sensor. As can be seen from fig. 3, the image 302 is compressed in the width direction but not compressed in the height direction compared with the image 301.
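The width-only compression of fig. 3 can be illustrated with a toy de-squeeze calculation. The 1.33× squeeze factor below is an assumption chosen for illustration; actual anamorphic lenses use various factors.

```python
# Toy model of the anamorphic squeeze in fig. 3: the scene (image 301) is
# compressed only in the width direction when imaged on the sensor (image 302)
# and can later be de-squeezed by the same factor.  The factor is assumed.
SQUEEZE = 1.33

def sensor_size(scene_w, scene_h, squeeze=SQUEEZE):
    """Width is divided by the squeeze factor; height is unchanged."""
    return round(scene_w / squeeze), scene_h

def desqueezed_size(sensor_w, sensor_h, squeeze=SQUEEZE):
    """Restore the original horizontal proportion of the scene."""
    return round(sensor_w * squeeze), sensor_h

w, h = sensor_size(853, 480)      # a wide scene squeezed onto the sensor
print(w, h)                       # -> 641 480
print(desqueezed_size(w, h))      # -> 853 480 (approximately the scene again)
```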
It should be noted that, in the embodiments of the present application, the electronic device may include one or N cameras 153, where N is a positive integer greater than or equal to 2. Illustratively, as shown in fig. 4, the electronic device includes five cameras, where the camera 153A and the camera 153B are located on the front side of the electronic device and may be referred to as front cameras, and the camera 153C, the camera 153D, and the camera 153E are located on the back side of the electronic device and may be referred to as rear cameras. Taking fig. 4 as an example, the lenses of the camera 153A, the camera 153B, the camera 153C, the camera 153D, and the camera 153E may all be standard lenses, or some of them may be standard lenses while the others are anamorphic lenses. For example, the lenses of the cameras 153A, 153C, and 153E are standard lenses, and the lenses of the cameras 153B and 153D are anamorphic lenses. When the electronic device shoots a wide-frame image, using the anamorphic lens to collect the image helps simplify subsequent image processing and allows the image shot by the electronic device to present more content.
It should be noted that the anamorphic lens may also be referred to as an anamorphic wide-screen lens. The image sensor may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The image sensor converts the optical signal into an electrical signal and then delivers the electrical signal to the ISP.
Upon receiving the electrical signal, the ISP can convert it into a digital image signal whose aspect ratio meets the resolution selected by the user. The ISP may send the digital image signal to an image processor for post-processing, such as video color enhancement and video denoising. The image processor may be a DSP or another device for performing image processing. Alternatively, the ISP can perform the post-processing itself after obtaining the digital image signal, for example, by running algorithm optimization on the noise, brightness, and color of the image. In some embodiments, the ISP may also optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the image sensor of the camera 153. After the post-processing is complete, the processed digital image signal is output to a video codec for compression and encoding. The video codec may convert the digital image signal into an image signal in a standard format such as RGB or YUV and then output it; for example, the video codec may present the output image signal on the display screen 151, or save it to the internal memory 121 or an external memory connected to the external memory interface 122.
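The order of the stages described above (ISP conversion, post-processing, codec, then display or storage) can be sketched as a simple chain of placeholder functions; every name below is an invented stand-in for illustration and does not correspond to any real driver or hardware interface.

```python
# Placeholder stages mirroring the described order; all names are invented.
def isp_convert(electrical_signal, width=2560, height=1080):
    return {"pixels": electrical_signal, "resolution": (width, height)}

def post_process(image):
    image["denoised"] = True
    image["color_enhanced"] = True
    return image

def encode(image, container="MP4"):
    return {"format": container, "frame": image}

def pipeline(electrical_signal):
    return encode(post_process(isp_convert(electrical_signal)))

print(pipeline(b"\x00" * 8)["format"])  # -> MP4
```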
The keys 154 may include a power-on key, a volume key, and the like. The keys 154 may be mechanical keys. Or may be touch keys. The electronic device may receive a key input, and generate a key signal input related to user settings and function control of the electronic device.
The sensor module 160 may include one or more sensors. For example, the touch sensor 160A, the fingerprint sensor 160B, the gyro sensor 160C, the pressure sensor 160D, the acceleration sensor 160E, and the like. In some embodiments, the sensor module 160 may also include environmental sensors, distance sensors, proximity light sensors, bone conduction sensors, and the like.
The touch sensor 160A may also be referred to as a "touch panel". The touch sensor 160A may be disposed on the display screen 151, and the touch sensor 160A and the display screen 151 form a touch screen, which is also called a "touch screen". The touch sensor 160A is used to detect a touch operation applied thereto or therearound. Touch sensor 160A may pass the detected touch operation to an application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 151. In other embodiments, the touch sensor 160A may be disposed on a surface of the electronic device at a different position from the position of the display screen 151.
The fingerprint sensor 160B may be used to collect a fingerprint. The electronic device can use the collected fingerprint features to implement fingerprint unlocking, application lock access, fingerprint-based photographing, fingerprint-based call answering, and the like.
The gyro sensor 160C may be used to determine the motion posture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 160C. The gyro sensor 160C may be used for image stabilization during shooting. For example, when the shutter is pressed, the gyro sensor 160C detects the shake angle of the electronic device, calculates, according to the shake angle, the distance that the lens module needs to compensate, and allows the lens to counteract the shake of the electronic device through a reverse movement, thereby achieving anti-shake. The gyro sensor 160C may also be used for navigation and motion-sensing game scenarios.
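The anti-shake compensation mentioned above can be illustrated with a toy calculation: for a distant subject, rotating the device by an angle shifts the image on the sensor by roughly f·tan(angle), so the lens module is driven by the opposite amount. This is a simplified illustrative model, not the device's actual stabilization algorithm.

```python
import math

# Simplified illustration of shake compensation: image shift ~ f * tan(theta)
# for a distant subject; the lens module is moved by the opposite amount.
def ois_compensation_mm(focal_length_mm, shake_deg):
    shift = focal_length_mm * math.tan(math.radians(shake_deg))
    return -shift  # drive the lens module in the reverse direction

print(round(ois_compensation_mm(4.0, 0.5), 4))  # ~ -0.0349 mm for a 0.5 degree shake
```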
The pressure sensor 160D is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 160D may be disposed on the display screen 151. There are many types of pressure sensors 160D, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 160D, the capacitance between the electrodes changes, and the electronic device determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 151, the electronic device detects the intensity of the touch operation by using the pressure sensor 160D. The electronic device may also calculate the touch position from the detection signal of the pressure sensor 160D. In some embodiments, touch operations that act on the same touch position but have different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction for viewing an SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction for creating a new SMS message is executed.
The acceleration sensor 160E can detect the magnitude of acceleration of the electronic device in various directions (typically along three axes). When the electronic device is stationary, it can detect the magnitude and direction of gravity. It can also be used to recognize the posture of the electronic device and is applied to landscape/portrait switching, pedometers, and other applications.
In other embodiments, processor 110 may also include one or more interfaces. For example, the interface may be a SIM card interface 152. Also for example, the interface may be a USB interface 170. For example, the interface may also be an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, or the like. It is understood that the processor 110 according to the embodiment of the present application may interface different modules of the electronic device, so that the electronic device can implement different functions. Such as taking a picture, processing, etc. In the embodiments of the present application, the connection method of the interface in the electronic device is not limited.
The SIM card interface 152 may be used to connect a SIM card, among other things. The SIM card can be attached to and detached from the electronic device by being inserted into the SIM card interface 152 or being pulled out from the SIM card interface 152. The electronic equipment can support 1 or N SIM card interfaces, and N is a positive integer greater than 1. The SIM card interface 152 may support a Nano SIM card, a Micro SIM card, a SIM card, or the like. Multiple cards can be inserted into the same SIM card interface 152 at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 152 may also be compatible with different types of SIM cards. The SIM card interface 152 may also be compatible with an external memory card. The electronic equipment realizes functions of conversation, data communication and the like through the interaction of the SIM card and the network. In some embodiments, the electronic device employs esims, namely: an embedded SIM card. The eSIM card can be embedded in the electronic device and cannot be separated from the electronic device.
The USB interface 170 is an interface conforming to the USB standard specification. For example, the USB interface 170 may include a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like. The USB interface 170 may be used to connect a charger to charge the electronic device, and may also be used to transmit data between the electronic device and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The USB interface 170 may also be used to connect other electronic devices, such as Augmented Reality (AR) devices, and the like.
The charging management module 180 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 180 may receive a charging input from a wired charger via the USB interface 170. In some wireless charging embodiments, the charging management module 180 may receive a wireless charging input through a wireless charging coil of the electronic device. While charging the battery 182, the charging management module 180 may also supply power to the electronic device through the power management module 181.
The power management module 181 is used to connect the battery 182, the charging management module 180 and the processor 110. The power management module 181 receives an input of the battery 182 and/or the charging management module 180, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 151, the camera 153, the mobile communication module 131, the wireless communication module 132, and the like. The power management module 181 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), and the like. In some other embodiments, the power management module 181 may also be disposed in the processor 110. In other embodiments, the power management module 181 and the charging management module 180 may be disposed in the same device.
It should be understood that the hardware configuration shown in fig. 1 is only one example. The electronic devices of the embodiments of the application may have more or fewer components than shown in the figures, may combine two or more components, or may have different configurations of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The following describes the shooting method according to the embodiments of the present application in detail with reference to the structures shown in fig. 1 and fig. 2.
In the embodiments of the present application, the electronic device can crop and/or compress the image collected by the camera to directly output a wide-frame video or image whose resolution has a large aspect ratio, which reduces labor costs and improves the efficiency of obtaining a wide-frame video or image. For example, the resolution of the image 302 shown in fig. 3 is 640 × 480, where 640 is the pixel count of the image 302 in the width direction and 480 is the pixel count of the image 302 in the height direction, so the aspect ratio of the resolution of the image 302 is 640/480 = 4:3. It can be understood that the aspect ratio may also be referred to as the width-to-height ratio, picture ratio, or the like.
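The aspect-ratio arithmetic used throughout (for example, 640/480 = 4:3) can be reproduced with a small helper; this is a plain illustration of the calculation, not part of the claimed method.

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its aspect ratio, e.g. 640 x 480 -> '4:3'."""
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print(aspect_ratio(640, 480))    # -> 4:3
print(aspect_ratio(2560, 1080))  # -> 64:27, roughly 21:9 (about 2.37:1)
```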
The embodiments of the present application can be applied to any application with a shooting function, such as a camera application or an instant messaging application with a video function. The following describes the method of the embodiments of the present application in detail by taking the camera application as an example.
Illustratively, the display screen 151 of the electronic device displays a home interface, and the home interface includes a camera icon. By way of example, the home interface may be the user interface 500 shown in FIG. 5, where the user interface 500 includes a camera icon 501. In addition, the user interface 500 may include icons of other applications, such as a settings icon, a memo icon, and a gallery icon. In some embodiments, the user interface 500 may also include a status bar 502, a concealable navigation bar 504, and a Dock bar 503. The status bar 502 may include the name of an operator (e.g., China Mobile), a mobile network indicator (e.g., 4G), a Bluetooth icon, the time, and the remaining battery level. Further, it can be understood that in other embodiments, a Wi-Fi icon, an add-on icon, and the like may also be included in the status bar 502. The navigation bar 504 may include a back button, a home screen button, and a recent tasks button (menu button). Icons of frequently used applications, such as a phone icon, a messages icon, a mail icon, and a weather icon, may be included in the Dock bar 503. It should be noted that the icons in the Dock bar may be set according to the needs of the user.
The electronic device may, in response to the first operation, start the application corresponding to the camera icon 501 and display a preview interface on the display screen 151, where the image captured by the camera 153 is displayed on the preview interface. It can be understood that the electronic device may also turn on the camera 153 in response to the first operation; alternatively, the camera 153 may be always on. It should be understood that the application corresponding to the camera icon 501 may be referred to as the camera application, or may be referred to by another name, which is not limited here. In the following embodiments, the application corresponding to the camera icon 501 is referred to as the camera. The first operation may be an operation on the camera icon 501, a voice instruction of the user (for example, "turn on the camera"), or a shortcut gesture operation (for example, a three-finger slide down). The operation on the camera icon 501 may be a touch operation, a press operation, or the like on the camera icon 501. In addition, in the embodiments of the present application, when the electronic device is in a screen-off state or a screen-locked state, the camera may also be started in response to a voice instruction or a shortcut gesture operation of the user, and the preview interface may be displayed on the display screen 151.
In specific implementations, in some embodiments, the preview interface may be a photo preview interface, illustratively the user interface 510 shown in FIG. 5. Taking the rear camera as an example, the user interface 510 displays the image captured by the rear camera of the electronic device. In other embodiments, the user interface 510 may also display an image captured by a front camera of the electronic device. In other embodiments, the preview interface may be a video preview interface. An exemplary video preview interface may be the user interface 520 shown in fig. 5. Taking the rear camera as an example, the user interface 520 displays a video image captured by the rear camera of the electronic device. In other embodiments, the user interface 520 may also display a video image captured by a front camera of the electronic device. It should be noted that the electronic device in the embodiments of the present application may switch between the front camera and the rear camera in response to an operation of the camera switching button 514 shown in fig. 5.
The following description takes as an example a case in which the electronic device, in response to the first operation, displays the user interface 510 on the display screen 151, and the image displayed on the user interface 510 is captured by a rear camera of the electronic device. In some embodiments, as shown in FIG. 5, the user interface 510 may also include a shooting mode selection button area 511, which may include mode selection buttons such as photo, video, professional, portrait, large aperture, night mode, or more. The user can view the shooting modes of the electronic device by sliding left and right in the shooting mode selection button area 511. The electronic device may display the user interface 520 on the display screen 151 in response to an operation of selecting the video mode on the user interface 510. The user interface 520 displays the video image captured by the rear camera. Specifically, the electronic device displays the video image captured by the rear camera on the user interface 520 based on the resolution set by the user. It should be understood that the user may set the resolution on the video setting interface. In some embodiments, the user interface 520 includes a setting button 521, and the electronic device can display the video setting interface on the display screen 151 in response to an operation of the setting button 521 by the user. In other embodiments, the electronic device may also display the video setting interface in response to the user sliding left or right on the user interface 520. It should be noted that the electronic device may also display the video setting interface on the display screen 151 in response to other operations (e.g., pulling down or sliding up). By way of example, the video setting interface may be the user interface 610 shown in FIG. 6. The user interface 610 includes a resolution setting button 611. From the resolution setting button 611, it can be seen that the resolution currently selected by the user is 640 × 480, and at a resolution of 640 × 480 the aspect ratio is 4:3. The user interface 520 therefore displays a video image whose resolution has an aspect ratio of 4:3. Typically, the electronic device displays an image with a 4:3 aspect ratio in portrait mode for easy viewing by the user, as in the user interface 520 shown in FIG. 5. The electronic device may also display the user interface 520 in landscape mode, for example, as shown in fig. 7a.
The electronic device may display a user interface 620 on the display screen 151 in response to an operation of the resolution setting button 611. The user interface 620 includes options for multiple video resolutions supported by the electronic device, such as 4K UHD 3840 × 2160, 1080P FHD+ (18:9) 2160 × 1080, 1080P FHD (60fps) 1920 × 1080, 1080P FHD+ (21:9) 2560 × 1080, 720P HD+ (21:9) 1680 × 720, 720P 1280 × 720, VGA (4:3) 640 × 480, and QVGA (4:3) 320 × 240. As can be seen from the user interface 620 shown in fig. 6, the resolution of a video with an aspect ratio of 21:9 may be 1080P FHD+ (21:9) 2560 × 1080 or 720P HD+ (21:9) 1680 × 720. It should be noted that the resolution of a video with an aspect ratio of 21:9 may also be UHD+ 4K 5040 × 2160; the user interface 620 shown in fig. 6 of the present application is only a schematic diagram and does not constitute a limitation on the embodiments of the present application. In addition to resolutions with aspect ratios such as 4:3, 16:9, and 21:9, the electronic device in the embodiments of the present application may support resolutions with aspect ratios such as 2.37:1 (64:27), 2.39:1 (12:5), 2.55:1 (23:9), 2.59:1 (13:5), and 2.66:1 (8:3, 24:9); the aspect ratio of the resolution supported by the electronic device is not limited in the embodiments of the present application.
The user can select a resolution according to his or her own needs. For example, the electronic device may set the resolution used when capturing video to 2560 × 1080 in response to the user selecting the 2560 × 1080 resolution option. In some embodiments, the user interface 620 also includes a return button 621, which makes it convenient for the user to exit the user interface 620. Accordingly, the electronic device may exit the user interface 620 and return to the user interface 610 in response to an operation of the return button 621. In this case, the resolution displayed by the resolution setting button 611 included in the user interface 610 is updated to 2560 × 1080. In other embodiments, the electronic device may also exit the user interface 620 and return to the user interface 610 in response to a shortcut gesture operation (e.g., a left swipe on the user interface 620). The user interface 610 may also include a return button 612, which makes it convenient for the user to exit the user interface 610. For example, if the user interface 610 was entered through the setting button 521 included in the user interface 520, the electronic device may exit the user interface 610 and display the video preview interface in response to an operation of the return button. When the display screen 151 previously displayed the user interface 520, the resolution was 640 × 480 with an aspect ratio of 4:3, whereas the currently set resolution of 2560 × 1080 has an aspect ratio of 21:9. Therefore, in some embodiments, the electronic device may display the video image in landscape mode in response to the resolution being set to 2560 × 1080, which helps better present a video image with a resolution of 2560 × 1080. For example, after the electronic device exits the user interface 610, the displayed video preview interface may be the user interface 700 shown in FIG. 7b. The video image captured by the rear camera is displayed in the user interface 700, and the resolution of the video image is 2560 × 1080.
In other embodiments, the user interface 700 may also include at least one of a zoom option area 704, a video color selection button 701, a high dynamic range button 702, a flash button 703, and a video mode selection button 705. For example, the electronic device may set the zoom factor used in shooting in response to a zoom factor being selected in the zoom option area 704. As shown in fig. 7b, the zoom factor currently selected by the user is ×1. In some embodiments, the zoom factor options included in the zoom option area 704 may include ×5, ×3, ×1, ×0.75, and so on. It should be noted that increasing the zoom factor zooms in on distant scenery but narrows the range of the captured image, whereas reducing the zoom factor expands the range of the captured image so that the captured image can present a larger picture. Therefore, in some embodiments, taking the rear cameras as an example, if the electronic device includes two or more rear cameras and the focal lengths of the different rear cameras are different, the electronic device may switch between the rear cameras in response to the different zoom factors selected by the user, which facilitates image capture. Illustratively, the electronic device includes three rear cameras whose focal lengths are 3.6 mm, 4 mm, and 6 mm, respectively. When the electronic device responds to a user-selected zoom factor of ×0.75, the rear camera with a focal length of 3.6 mm is turned on. When the rear camera with a focal length of 3.6 mm is on and the electronic device responds to the user selecting a zoom factor of ×1, it switches to the rear camera with a focal length of 4 mm, so that the electronic device captures images based on the rear camera with a focal length of 4 mm. As another example, the electronic device switches to the 6 mm rear camera when the user selects a zoom factor of ×3 or ×5.
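A hedged sketch of the zoom-to-camera mapping in the example above (rear cameras with focal lengths of 3.6 mm, 4 mm, and 6 mm); the cut-over points between cameras are assumptions for illustration, since the application only gives the example zoom factors ×0.75, ×1, ×3, and ×5.

```python
# Illustrative mapping from the user-selected zoom factor to a rear camera;
# the cut-over points are assumed, only the example focal lengths are given.
def rear_camera_for_zoom(zoom):
    if zoom < 1.0:    # e.g. x0.75 -> widest lens
        return 3.6
    if zoom < 3.0:    # e.g. x1   -> standard lens
        return 4.0
    return 6.0        # e.g. x3, x5 -> longest lens

for z in (0.75, 1, 3, 5):
    print(f"x{z} -> {rear_camera_for_zoom(z)} mm rear camera")
```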
The electronic device may display video color options, such as standard, bright, and soft, on the display screen 151 in response to the user's operation of the video color selection button 701, and may set the color of the captured video in response to the user's selection of a video color. The electronic device may turn the high dynamic range setting on or off in response to the user's operation of the high dynamic range button 702, and may turn the flash on or off in response to the user's operation of the flash button 703.
In addition, the electronic device may also set the mode for capturing video in response to an operation of the video mode selection button 705. The video modes may include portrait blurring, AI color, neutral, and other modes. For example, the electronic device may display a video mode selection menu on the user interface 700 in response to an operation of the video mode selection button 705, where the video mode selection menu may include different video modes. For example, as shown in fig. 7c, the video modes included in the video mode selection menu 707 are: none, AI color, portrait blurring, nostalgia, suspense, and freshness. The user can select a video mode from the video mode selection menu 707 according to his or her own needs. It should be noted that, in some embodiments, the video mode selection menu 707 may further include video modes other than those displayed on the user interface 700 shown in fig. 7c; that is, when not all of the video modes in the video mode selection menu 707 can be displayed on the display screen 151, the user may slide up and down in the video mode selection menu 707 to view the video modes and select one as needed. In this way, the electronic device can, in response to different modes selected by the user, apply different processing to the video image captured by the camera. For example, when the user selects the portrait blurring video mode, the electronic device may perform portrait blurring on the video image captured by the camera in response to the selection. In some embodiments, the electronic device may hide the video mode selection menu 707 in response to the user selecting the portrait blurring video mode, in response to an operation of the video mode selection button 705, or in response to another operation. In addition, in some embodiments, when the electronic device includes multiple cameras, different cameras may also be turned on in response to different modes being selected, which facilitates image processing by the electronic device. For example, if the electronic device includes two rear cameras, when the user selects the AI color video mode, the two rear cameras can be turned on simultaneously, and the images captured by the two rear cameras are then processed and displayed on the video preview interface.
For example, when the electronic device includes one rear camera whose lens is a standard lens, if the ratio of the width direction to the height direction of the image captured by the lens of the rear camera is 4:3 and the ratio of the width direction to the height direction of the image formed on the image sensor is also 4:3, the electronic device may capture a video image with a resolution of 2560 × 1080 based on the following methods.
Example one: the image sensor sends the 4:3 image to the ISP. After receiving the 4:3 image, the ISP crops the image collected by the rear camera to a ratio of 21:9 in the width and height directions according to the resolution 2560 × 1080 selected by the user, and then scales it into an image with a resolution of 2560 × 1080. The ISP then sends the 2560 × 1080 image to the image processor for processing such as denoising and color adjustment. After the image processor finishes processing the image, the processed image is sent to a video codec, which encodes the image into a corresponding file format (e.g., MP4, 3GP, etc.) and displays it on the display screen 151. The electronic device may start capturing 2560 × 1080 video images with the rear camera in response to the user's operation of the shooting button 513 or another shortcut operation (e.g., a voice instruction, pressing a volume key, etc.), and store the video in the internal memory 121 or in an external memory connected to the external memory interface 122. In some embodiments, the electronic device may display a video recording interface on the display screen 151 in response to user operation of the shooting button 513; for example, the images collected by the rear camera are displayed on the video recording interface. The video recording interface may be the user interface 710 shown in fig. 7d, which may include the recording time. In some embodiments, the user interface 710 includes at least one of the shooting button 513, a pause button 711, and a photographing button 712. The electronic device 100 may end recording the video and save the recorded video in response to operation of the shooting button 513 during recording. In some embodiments, the electronic device 100 displays a video preview interface on the display screen 151 after recording ends. The electronic device may pause recording in response to operation of the pause button 711 during recording, and may resume recording in response to operation of the pause button 711 after recording has been paused. The electronic device may take a picture of the video currently being recorded and save the picture in response to operation of the photographing button 712. For example, as shown in fig. 7d, the current recording time is 00:07; if the electronic device responds to operation of the photographing button 712 at time 00:07, it photographs the video image captured at time 00:07 and stores the photographed image.
In some embodiments, the electronic device may further perform a focusing operation during video recording in response to a user operation on the image. Illustratively, as shown in fig. 7d, in response to the user touching position a on the display screen 151, the electronic device displays a focusing frame 713. It should be noted that position a may be any position on the user interface 710 other than a function button. In some embodiments, the electronic device may also display a brightness adjustment bar 714 in response to the user touching position a on the display screen 151, and may adjust the brightness of the display screen 151 in response to the brightness adjustment button 715 being slid up or down. In addition, the electronic device may implement zooming in response to the user's zoom gesture on the image.
In the embodiment of the present application, the ISP may obtain the resolution selected by the user in the following ways. In some embodiments, after detecting the user's operation of selecting the 2560 × 1080 resolution, the touch sensor 160A of the electronic device reports information about the operation on the display screen 151 (e.g., coordinates) to the application processor, and the application processor determines, based on this information, that the resolution of the captured image is to be updated to 2560 × 1080. The application processor then notifies the ISP of the updated resolution 2560 × 1080, so that the ISP can process the images acquired by the rear camera based on the resolution selected by the user. In other embodiments, the touch sensor 160A may directly report the information about the user's operation on the display screen 151 to the ISP, and the ISP determines the resolution of the captured image from that information. The embodiment of the present application does not limit the manner in which the ISP obtains the resolution selected by the user.
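A rough sketch of the first signalling path described above (touch sensor to application processor to ISP) is given below. All class names, method names, and the screen region used to represent the 2560 × 1080 option are hypothetical illustrations, not real device APIs.

```python
# Hedged sketch: how the ISP might learn the user-selected resolution via the
# application processor. Names and the option region are assumptions only.

class Isp:
    def __init__(self):
        self.resolution = (1920, 1080)          # assumed default
    def set_output_resolution(self, width, height):
        self.resolution = (width, height)

class ApplicationProcessor:
    # assumed mapping from an on-screen option region to a resolution option
    OPTION_REGIONS = {((0, 300), (200, 360)): (2560, 1080)}

    def __init__(self, isp):
        self.isp = isp

    def on_touch_event(self, x, y):
        # resolve the reported coordinates to a resolution option, then notify the ISP
        for ((x0, y0), (x1, y1)), res in self.OPTION_REGIONS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                self.isp.set_output_resolution(*res)

ap = ApplicationProcessor(Isp())
ap.on_touch_event(100, 330)      # touch inside the assumed 2560 x 1080 option region
print(ap.isp.resolution)         # (2560, 1080)
```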
Example two: the image sensor may cut the 4:3 image into a 21:9 image according to the resolution 2560 × 1080 selected by the user and transmit the image to the ISP. After receiving the 21:9 image, the ISP scales the 21:9 image to an image of 2560 × 1080 resolution selected by the user, and then the 2560 × 1080 image is subsequently processed by the image processor and codec. It should be noted that the image sensor may obtain the resolution selected by the user from the ISP, and in addition, the manner in which the image sensor obtains the resolution selected by the user may also refer to the manner in which the ISP obtains the resolution selected by the user in the first embodiment.
Example three: the image sensor may crop the 4:3 image into a W: H image according to the user selected resolution 2560 x 1080, where W: h is 4: a ratio between 3 and 21:9, such as W: h may be 16: 9. The image sensor compares W: the image of H is sent to the ISP. After receiving the W: H image, the ISP crops the W: H image into a 21:9 image according to the resolution 2560 × 1080 selected by the user, scales the 21:9 image into a 2560 × 1080 resolution image, and then performs subsequent processing on the 2560 × 1080 image by the image processor and the codec.
It should be noted that, in Example three, the manner in which the image sensor obtains the user-selected resolution may refer to the corresponding manner in Example two. The manner in which the ISP obtains the user-selected resolution in Examples two and three may refer to the manner in which the ISP obtains it in Example one. The manner in which the image processor and the codec further process the 2560 × 1080 image in Examples two and three may refer to the corresponding manner in Example one.
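Across Examples one to three the same crop-and-scale arithmetic applies, regardless of whether the crop is performed by the ISP, by the image sensor, or split between them. The sketch below illustrates that arithmetic; the 4000 × 3000 sensor size is a hypothetical value used only to make the numbers concrete.

```python
# Hedged sketch of the crop-and-scale step: center-crop the 4:3 sensor image to
# the aspect ratio of the user-selected resolution (2560 x 1080), then scale.
# The 4000 x 3000 sensor size is an assumption for illustration.

def crop_then_scale(sensor_w: int, sensor_h: int, target_w: int, target_h: int):
    """Return ((crop_w, crop_h), (out_w, out_h)) for a center crop followed by scaling."""
    if sensor_w * target_h >= sensor_h * target_w:
        # sensor image is at least as wide as the target: keep the height, crop the width
        crop_w, crop_h = round(sensor_h * target_w / target_h), sensor_h
    else:
        # sensor image is "taller" than the target: keep the width, crop the height
        crop_w, crop_h = sensor_w, round(sensor_w * target_h / target_w)
    return (crop_w, crop_h), (target_w, target_h)

print(crop_then_scale(4000, 3000, 2560, 1080))   # ((4000, 1688), (2560, 1080))
```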
It should be noted that the above description takes as an example the case where the ratio of the width direction to the height direction of the image captured by the lens of the rear camera is 4:3 and the ratio of the width direction to the height direction of the image formed on the image sensor is 4:3. When the ratio of the width direction to the height direction of the image formed on the image sensor is smaller or larger than the aspect ratio of the user-selected resolution, the shooting method may refer to Examples one to three. When the ratio of the width direction to the height direction of the image formed on the image sensor by the light collected by the lens of the rear camera is equal to the aspect ratio of the user-selected resolution, the image collected by the rear camera does not need to be cropped, and the ISP directly scales the image collected by the camera into an image with the user-selected resolution. For example, if the user-selected resolution is 2560 × 1080 and the ratio of the width direction to the height direction of the image formed on the image sensor by the light collected by the lens of the rear camera is 21:9, the ISP can directly scale the 21:9 image received from the image sensor into a 2560 × 1080 image.
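The decision described in this paragraph, namely crop first unless the sensor image already has the aspect ratio of the selected resolution, can be sketched as below; the sensor sizes used in the example calls are hypothetical.

```python
# Hedged sketch: the ISP crops only when the aspect ratio of the image formed
# on the image sensor differs from the aspect ratio of the selected resolution.
# The sensor sizes in the example calls are assumptions.

def process_plan(sensor_w: int, sensor_h: int, target_w: int, target_h: int) -> str:
    if sensor_w * target_h == sensor_h * target_w:       # aspect ratios are equal
        return f"scale {sensor_w}x{sensor_h} directly to {target_w}x{target_h}"
    return f"crop {sensor_w}x{sensor_h} to the {target_w}:{target_h} aspect ratio, then scale"

print(process_plan(5120, 2160, 2560, 1080))   # same aspect ratio: scale only
print(process_plan(4000, 3000, 2560, 1080))   # 4:3 sensor image: crop first, then scale
```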
In addition, in some embodiments, the electronic device may automatically enable a skin beautifying function in response to detecting that the rear camera has captured an image of a user's face. For example, when the image acquired by the rear camera of the electronic device is the image 301 in fig. 3, the skin beautifying function is automatically turned on to beautify the face (e.g., skin smoothing, acne removal, etc.). In addition, taking the user interface 700 shown in fig. 7b as an example, the user interface 700 further includes a skin beautifying function button 706, and the electronic device may enable the skin beautifying function in response to operation of the skin beautifying function button 706. For example, the electronic device may display skin beautifying levels on the display screen 151 in response to operation of the skin beautifying function button 706, and the user can set the skin beautifying level as needed.
For example, when the electronic device includes one rear camera whose lens is an anamorphic lens, if the ratio of the width direction to the height direction of the image captured by the lens of the rear camera is 21:9 and the ratio of the width direction to the height direction of the image formed on the image sensor is 4:3, the electronic device may capture a video image with a resolution of 2560 × 1080 based on the following methods.
Example four: the image sensor may send a 4:3 image to the ISP. After the ISP receives the 4:3 image, the 4:3 image collected by the rear camera is stretched in the width direction according to the resolution 2560X 1080 selected by the user, so that an image with the ratio of the width direction to the height direction being 21:9 is obtained, and then the image is zoomed into the image with the resolution 2560X 1080. The 2560 × 1080 images are then subsequently processed by an image processor and codec.
Example five: the image sensor may stretch the 4:3 image 4:3 in the width direction according to the resolution 2560 × 1080 selected by the user, obtain an image with a ratio of 21:9 in the width direction and the height direction, and transmit the image to the ISP. After receiving the 21:9 image, the ISP scales the 21:9 image to an image of 2560 × 1080 resolution selected by the user, and then the 2560 × 1080 image is subsequently processed by the image processor and codec.
It should be noted that, in Example five, the manner in which the image sensor obtains the user-selected resolution from the ISP may refer to the corresponding manner in Example two. In Examples four and five, the manner in which the ISP obtains the user-selected resolution may refer to the manner in which the ISP obtains it in Example one, and the manner in which the image processor and the codec further process the 2560 × 1080 image may refer to the corresponding manner in Example one.
The above description takes as an example the case where the ratio of the width direction to the height direction of the image captured by the anamorphic lens is 21:9 and the ratio of the width direction to the height direction of the image formed on the image sensor is 4:3. In general, when the ratio of the width direction to the height direction of the image captured by the anamorphic lens is the same as the aspect ratio of the user-selected resolution, an image with the aspect ratio of the user-selected resolution can be obtained simply by stretching the image formed on the image sensor. When the ratio W1:H1 of the width direction to the height direction of the image captured by the anamorphic lens differs from the aspect ratio W2:H2 of the user-selected resolution, the image formed on the image sensor may first be stretched into a W1:H1 image, and the W1:H1 image may then be cropped to obtain a W2:H2 image. Both the stretching and cropping operations may be performed by the image sensor, both may be performed by the ISP, or the stretching may be performed by the image sensor while the cropping is performed by the ISP.
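The general anamorphic case described above (stretch to W1:H1, then crop to W2:H2 when the two ratios differ) can be sketched as follows; the sensor size and the example ratios are hypothetical.

```python
# Hedged sketch: de-squeeze the 4:3 sensor image in the width direction to the
# anamorphic lens ratio W1:H1, then crop the width down to the user-selected
# aspect ratio W2:H2 when the two ratios differ. Sizes below are assumptions.

def stretch_then_crop(img_w, img_h, lens_w, lens_h, out_w, out_h):
    # 1. stretch (de-squeeze) the sensor image width to restore the lens ratio W1:H1
    stretched_w = round(img_h * lens_w / lens_h)
    # 2. crop the width to the selected aspect ratio W2:H2 if it is narrower than W1:H1
    crop_w = min(stretched_w, round(img_h * out_w / out_h))
    return (stretched_w, img_h), (crop_w, img_h)

# a hypothetical 4000 x 3000 sensor image, 21:9 anamorphic lens, 16:9 selected ratio
print(stretch_then_crop(4000, 3000, 21, 9, 16, 9))
# -> stretched to 7000 x 3000, then cropped to 5333 x 3000 before scaling
```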
In some embodiments, when the electronic device switches from the rear camera to the front camera for shooting in response to operation of the camera switch button 514, if there is only one front camera and its lens is a standard lens, the manner of capturing images may refer to the shooting manner of the rear camera.
In addition, when the electronic device includes multiple cameras, if the lenses of the cameras are all the same, for example all standard lenses, the shooting method may refer to the related descriptions in Examples one to three; when the lenses of the cameras included in the electronic device are all anamorphic lenses, the shooting method may refer to the related descriptions in Examples four and five.
In some embodiments, the electronic device may include cameras with different lenses. For example, as shown in fig. 8, the electronic device includes a rear camera 801 and a rear camera 802, where the lens of the rear camera 801 is a standard lens and the lens of the rear camera 802 is an anamorphic lens. In this case, when the electronic device shoots with a rear camera, the rear camera can be started according to the resolution selected by the user, and the shooting method after the rear camera is started may refer to Examples one to five.
The lens of the rear camera 801 is a standard lens; the ratio of the width direction to the height direction of the image it acquires is 4:3, and the ratio of the width direction to the height direction of the image formed on the image sensor is 4:3. The lens of the rear camera 802 is an anamorphic lens; the ratio of the width direction to the height direction of the image it acquires is 21:9, and the ratio of the width direction to the height direction of the image formed on the image sensor is 4:3.
In some embodiments, the electronic device may start a rear camera based on the user-selected resolution. For example, the electronic device may start the rear camera 801 when the aspect ratio of the user-selected resolution is 4:3. In some embodiments, the electronic device may also prompt the user which rear camera has been started. For example, the electronic device may prompt the user by voice that the lens of the rear camera currently in use is a standard lens. As another example, the electronic device may display a prompt box on the display screen 151, where the prompt box indicates that the lens of the camera currently in use is a standard lens. For example, as shown in fig. 9, the electronic device displays a prompt box 900 on the display screen 151, and the prompt box 900 includes the prompt message "the lens of the rear camera that has been started is a standard lens". In other embodiments, the electronic device starts the rear camera 802 when the aspect ratio of the user-selected resolution is 21:9. This helps enlarge the field of view of the 21:9 image captured by the electronic device. Illustratively, the electronic device starts the rear camera 802 in response to the user selecting a resolution with an aspect ratio of 21:9; in this case, the electronic device may prompt the user that the lens of the rear camera currently in use is an anamorphic lens, and the specific prompting manner may refer to the manner of prompting that the lens of the rear camera currently in use is a standard lens.
It should be noted that the aspect ratios of the resolutions supported by the electronic device may include many types, for example 16:9, 4:3, 2:1, 21:9, 2.37:1 (64:27), 2.39:1 (12:5), 2.55:1 (23:9), 2.59:1 (13:5), 2.66:1 (8:3, 24:9), etc., while the number of cameras configured on the electronic device is limited (currently at most 3 rear cameras are configured on an electronic device). Therefore, in some embodiments, the electronic device may start a camera whose lens is a standard lens when the aspect ratio of the user-selected resolution is smaller than a first threshold, and start a camera whose lens is an anamorphic lens when the aspect ratio of the user-selected resolution is greater than or equal to the first threshold. The first threshold may be set as needed; for example, it may be set to 2 or to 7/3. Optionally, the lenses of the rear cameras configured on the electronic device include a first lens, and when the ratio of the width direction to the height direction of the image acquired by the first lens is the same as the aspect ratio of the user-selected resolution, the rear camera including the first lens is started. This helps simplify subsequent image processing and improve image processing efficiency.
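The threshold rule in the preceding paragraph can be sketched as follows; the value 2 for the first threshold is one of the example values mentioned above, and the function returns only which kind of lens to start, not a specific camera.

```python
# Hedged sketch of the threshold rule: below the first threshold, start a
# standard-lens camera; at or above it, start an anamorphic-lens camera.
# FIRST_THRESHOLD = 2 is one of the example values given in the text.

FIRST_THRESHOLD = 2.0

def lens_for_selected_resolution(width: int, height: int) -> str:
    aspect_ratio = width / height
    return "standard lens" if aspect_ratio < FIRST_THRESHOLD else "anamorphic lens"

print(lens_for_selected_resolution(4000, 3000))   # 4:3   -> standard lens
print(lens_for_selected_resolution(2560, 1080))   # ~2.37 -> anamorphic lens
```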
It should also be noted that, in some embodiments, the electronic device may also turn on different cameras in response to different modes. Take as an example that the electronic device shoots with rear cameras and that the rear cameras included in the electronic device are as shown in fig. 8. For example, when the aspect ratio of the user-selected resolution is 21:9 and the video mode selected by the user is the portrait mode, the electronic device may turn on the rear camera 802 and the rear camera 801, where the lens of the rear camera 802 is an anamorphic lens and the lens of the rear camera 801 is a standard lens. When shooting, the electronic device may process the image collected by the rear camera 802 to obtain an image with the aspect ratio of the user-selected resolution, and then synthesize the processed image with the image collected by the rear camera 801, so that the person in the captured video is more prominent. The portrait mode is used here only as an example of the manner of turning on the rear cameras; for different video modes, the manner of turning on the cameras may be set based on actual conditions, which is not limited herein.
In addition, when the electronic device uses the front camera to shoot, the shooting method can refer to the shooting mode of the rear camera.
The embodiment of the present application can also be applied to a photographing scenario. Take the user interface 510 shown in fig. 5 as an example. The electronic device displays the image captured by the camera on the user interface 510 based on the resolution set by the user. For example, the user may set the image resolution on a photographing settings interface. In some embodiments, the user interface 520 includes a setting button 521, and the electronic device may display the photographing settings interface in response to the user's operation of the setting button 515. In addition, the electronic device may also display the photographing settings interface in response to a leftward or rightward sliding operation on the photographing preview interface. In this embodiment of the application, the electronic device may further display the photographing settings interface in response to other operations (for example, a pull-up or pull-down operation). For example, the photographing settings interface may be the user interface 1010 shown in fig. 10. The user interface 1010 includes a resolution setting button 1011. As indicated by the resolution setting button 1011, the image resolution currently selected by the user is 3968 × 2976, whose aspect ratio is 4:3. The user interface 510 accordingly displays a preview image with an aspect ratio of 4:3.
The electronic device may display a user interface 1020 on the display screen 151 in response to operation of the resolution setting button 1011. The user interface 1020 includes options for the multiple image resolutions supported by the electronic device, and the user can select a resolution as needed. For example, the electronic device may set the image resolution for photographing to 2560 × 1080 in response to the user selecting the 2560 × 1080 resolution option, and may then display a photographing preview interface, such as the user interface shown in fig. 11, in response to that selection.
The method for taking a picture by the electronic device may specifically refer to a method for taking a video by the electronic device.
It should be understood that each of the embodiments of the present application can be used alone or in combination with each other.
With reference to the foregoing embodiments and the accompanying drawings, embodiments of the present application provide a shooting method, which can be implemented in an electronic device having a hardware structure shown in fig. 1.
As shown in fig. 12, which is a schematic flowchart of a shooting method provided in an embodiment of the present application, the method includes the steps illustrated in the figure.
For a specific implementation of the shooting method shown in fig. 12, reference may be made to the related descriptions of the above embodiments.
In the embodiments provided in the present application, the method provided in the embodiments of the present application is described from the perspective of an electronic device as an execution subject. In order to implement the functions in the method provided by the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, and the functions are implemented in the form of a hardware structure, a software module, or a hardware structure and a software module. Whether any of the above-described functions is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends upon the particular application and design constraints imposed on the technical solution.
As shown in fig. 13, an embodiment of the present application discloses an electronic device 1300. The electronic device 1300 may include: a display screen 1301, a camera 1302, one or more processors 1303, a memory 1304, a plurality of applications 1305, and one or more computer programs 1306, which may be connected via one or more communication buses 1307. The one or more computer programs 1306 are stored in the memory 1304 and configured to be executed by the one or more processors 1303 to implement the shooting method shown in fig. 12 in the embodiment of the present application.
The processors referred to in the various embodiments above may be general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software modules may be located in a Random Access Memory (RAM), a flash memory, a read-only memory (ROM), a programmable ROM, an electrically erasable programmable memory, a register, or other storage media that are well known in the art. The storage medium is located in a memory, and a processor reads instructions in the memory and combines hardware thereof to complete the steps of the method.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
In short, the above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modifications, equivalents, improvements and the like made in accordance with the disclosure of the present application are intended to be included within the scope of the present application.
Claims (8)
1. A shooting method, applied to an electronic device, wherein the electronic device comprises a first camera, a second camera, and a display screen, and the method comprises:
displaying a first video image acquired by the first camera, wherein the resolution of the first video image is a first resolution; displaying a preview interface of a first video, wherein the resolution of the first video is the first resolution;
setting the resolution of the video image from the first resolution to a second resolution in response to a user operation, wherein the aspect ratio of the second resolution is greater than the aspect ratio of the first resolution, the aspect ratio of the second resolution is greater than a preset threshold, and the aspect ratio of the second resolution is a preset one of 21:9, 2.37:1, 2.39:1, 2.55:1, 2.59:1, or 2.66:1;
and displaying a second video image acquired by the second camera, wherein the resolution of the second video image is the second resolution, the second video image is obtained by cropping or stretching the image acquired by the second camera according to the second resolution, and the range of the image acquired by the second camera is larger than that of the image acquired by the first camera.
2. The method of claim 1, wherein, when the aspect ratio of the first resolution is less than or equal to the preset threshold, the first video is obtained by processing the image captured by the first camera based on the first resolution.
3. The method of claim 2, wherein after setting the resolution of the video to the second resolution, the method further comprises:
switching from the first camera to the second camera.
4. The method of claim 3, wherein the method further comprises:
the display screen displays prompt information, and the prompt information is used for prompting a user to switch the camera from the second camera to the first camera.
5. The method of any of claims 1 to 4, wherein the aspect ratio of the first resolution is 4:3, 16:9, or 2:1.
6. An electronic device, comprising: a display screen, one or more processors, a memory, and a camera;
a plurality of application programs;
and one or more computer programs;
the display screen is used for displaying a user interface;
the camera is used for collecting images;
wherein the one or more computer programs are stored in the memory and, when executed by the electronic device, cause the electronic device to implement the method of any of claims 1 to 5.
7. A chip, characterized in that the chip is coupled with a memory in an electronic device, such that the chip, when running, invokes a computer program stored in the memory, implementing the method according to any of claims 1 to 5.
8. A computer storage medium, characterized in that the computer-readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the method of any of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210022150.7A CN114390199A (en) | 2018-10-15 | 2018-10-15 | Shooting method and electronic equipment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811198702.XA CN111050062B (en) | 2018-10-15 | 2018-10-15 | Shooting method and electronic equipment |
CN202210022150.7A CN114390199A (en) | 2018-10-15 | 2018-10-15 | Shooting method and electronic equipment |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811198702.XA Division CN111050062B (en) | 2018-10-15 | 2018-10-15 | Shooting method and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114390199A true CN114390199A (en) | 2022-04-22 |
Family
ID=70230387
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210022150.7A Pending CN114390199A (en) | 2018-10-15 | 2018-10-15 | Shooting method and electronic equipment |
CN201811198702.XA Active CN111050062B (en) | 2018-10-15 | 2018-10-15 | Shooting method and electronic equipment |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811198702.XA Active CN111050062B (en) | 2018-10-15 | 2018-10-15 | Shooting method and electronic equipment |
Country Status (2)
Country | Link |
---|---|
CN (2) | CN114390199A (en) |
WO (1) | WO2020078273A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115022545A (en) * | 2022-06-02 | 2022-09-06 | 北京字跳网络技术有限公司 | Method, apparatus, device and storage medium for content shooting |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114363820A (en) * | 2020-09-29 | 2022-04-15 | 华为终端有限公司 | Electronic equipment searching method and electronic equipment |
CN115484385B (en) * | 2021-06-16 | 2023-12-08 | 荣耀终端有限公司 | Video shooting method and electronic equipment |
CN117609547A (en) * | 2021-08-12 | 2024-02-27 | 荣耀终端有限公司 | Video thumbnail display method, apparatus and storage medium |
CN116709016B (en) * | 2022-02-24 | 2024-06-18 | 荣耀终端有限公司 | Multiplying power switching method and multiplying power switching device |
CN115361468B (en) * | 2022-10-21 | 2023-02-28 | 荣耀终端有限公司 | Display optimization method and device during screen rotation and storage medium |
CN117692693A (en) * | 2023-06-09 | 2024-03-12 | 荣耀终端有限公司 | Multi-screen display method and related equipment |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050146631A1 (en) * | 2004-01-07 | 2005-07-07 | Shelton Michael J. | In-camera cropping to standard photo sizes |
US20060221198A1 (en) * | 2005-03-31 | 2006-10-05 | Jared Fry | User established variable image sizes for a digital image capture device |
US9113043B1 (en) * | 2011-10-24 | 2015-08-18 | Disney Enterprises, Inc. | Multi-perspective stereoscopy from light fields |
CN105100607A (en) * | 2015-06-26 | 2015-11-25 | 努比亚技术有限公司 | Shooting device and method |
CN105141841A (en) * | 2015-08-25 | 2015-12-09 | 上海兆芯集成电路有限公司 | Camera equipment and method therefor |
CN105637855A (en) * | 2013-08-22 | 2016-06-01 | 高途乐公司 | Conversion between aspect ratios in camera |
WO2016145831A1 (en) * | 2015-09-08 | 2016-09-22 | 中兴通讯股份有限公司 | Image acquisition method and device |
CN108616741A (en) * | 2016-12-06 | 2018-10-02 | 深圳市谛源光科有限公司 | A kind of 3D filming apparatus and method applied in mobile terminal |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8561106B1 (en) * | 2007-12-21 | 2013-10-15 | Google Inc. | Video advertisement placement |
CN101217643B (en) * | 2007-12-26 | 2010-12-29 | 广东威创视讯科技股份有限公司 | A method and corresponding device for dynamic capture and collection, display of images with different sizes and resolution |
US8185822B2 (en) * | 2008-02-11 | 2012-05-22 | Apple Inc. | Image application performance optimization |
CN102984494B (en) * | 2012-12-06 | 2015-11-25 | 小米科技有限责任公司 | A kind of video communication method and device |
US20150348325A1 (en) * | 2014-05-27 | 2015-12-03 | Thomson Licensing | Method and system for stabilization and reframing |
CN106293589B (en) * | 2016-08-31 | 2019-04-12 | 宇龙计算机通信科技(深圳)有限公司 | A kind of method, apparatus and terminal that preview resolution is set |
CN107395970A (en) * | 2017-07-27 | 2017-11-24 | 深圳市泰衡诺科技有限公司 | A kind of photographic method and camera arrangement for intelligent terminal |
CN108040204B (en) * | 2017-12-05 | 2020-06-02 | 北京小米移动软件有限公司 | Image shooting method and device based on multiple cameras and storage medium |
CN108513067B (en) * | 2018-03-29 | 2021-01-08 | 维沃移动通信有限公司 | Shooting control method and mobile terminal |
- 2018-10-15 CN CN202210022150.7A patent/CN114390199A/en active Pending
- 2018-10-15 CN CN201811198702.XA patent/CN111050062B/en active Active
- 2019-10-11 WO PCT/CN2019/110618 patent/WO2020078273A1/en active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050146631A1 (en) * | 2004-01-07 | 2005-07-07 | Shelton Michael J. | In-camera cropping to standard photo sizes |
US20060221198A1 (en) * | 2005-03-31 | 2006-10-05 | Jared Fry | User established variable image sizes for a digital image capture device |
US9113043B1 (en) * | 2011-10-24 | 2015-08-18 | Disney Enterprises, Inc. | Multi-perspective stereoscopy from light fields |
CN105637855A (en) * | 2013-08-22 | 2016-06-01 | 高途乐公司 | Conversion between aspect ratios in camera |
CN105100607A (en) * | 2015-06-26 | 2015-11-25 | 努比亚技术有限公司 | Shooting device and method |
CN105141841A (en) * | 2015-08-25 | 2015-12-09 | 上海兆芯集成电路有限公司 | Camera equipment and method therefor |
WO2016145831A1 (en) * | 2015-09-08 | 2016-09-22 | 中兴通讯股份有限公司 | Image acquisition method and device |
CN108616741A (en) * | 2016-12-06 | 2018-10-02 | 深圳市谛源光科有限公司 | A kind of 3D filming apparatus and method applied in mobile terminal |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115022545A (en) * | 2022-06-02 | 2022-09-06 | 北京字跳网络技术有限公司 | Method, apparatus, device and storage medium for content shooting |
WO2023231901A1 (en) * | 2022-06-02 | 2023-12-07 | 北京字跳网络技术有限公司 | Method and apparatus for content photographing, and device and storage medium |
CN115022545B (en) * | 2022-06-02 | 2024-04-02 | 北京字跳网络技术有限公司 | Method, apparatus, device and storage medium for content shooting |
US12114060B2 (en) | 2022-06-02 | 2024-10-08 | Beijing Zitiao Network Technology Co., Ltd. | Method, apparatus, device and storage medium for content capturing |
Also Published As
Publication number | Publication date |
---|---|
CN111050062B (en) | 2022-01-14 |
CN111050062A (en) | 2020-04-21 |
WO2020078273A1 (en) | 2020-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110072070B (en) | Multi-channel video recording method, equipment and medium | |
CN111050062B (en) | Shooting method and electronic equipment | |
US11832022B2 (en) | Framing method for multi-channel video recording, graphical user interface, and electronic device | |
KR102385841B1 (en) | shooting mobile terminal | |
CN113727016A (en) | Shooting method and electronic equipment | |
WO2020073959A1 (en) | Image capturing method, and electronic device | |
CN112449099B (en) | Image processing method, electronic equipment and cloud server | |
CN112425156B (en) | Method for selecting images based on continuous shooting and electronic equipment | |
CN114205515B (en) | Anti-shake processing method for video and electronic equipment | |
CN113596316B (en) | Photographing method and electronic equipment | |
CN113596319A (en) | Picture-in-picture based image processing method, apparatus, storage medium, and program product | |
CN114489533A (en) | Screen projection method and device, electronic equipment and computer readable storage medium | |
CN105744170A (en) | Picture photographing device and method | |
CN113596321A (en) | Transition dynamic effect generation method, apparatus, storage medium, and program product | |
CN113965693B (en) | Video shooting method, device and storage medium | |
CN112637481B (en) | Image scaling method and device | |
CN114500901A (en) | Double-scene video recording method and device and electronic equipment | |
CN112422805B (en) | Shooting method and electronic equipment | |
CN114363678A (en) | Screen projection method and equipment | |
CN113923351B (en) | Method, device and storage medium for exiting multi-channel video shooting | |
CN116782024A (en) | Shooting method and electronic equipment | |
CN116069156A (en) | Shooting parameter adjusting method, electronic equipment and storage medium | |
CN111294509A (en) | Video shooting method, device, terminal and storage medium | |
CN117714849A (en) | Image shooting method and related equipment | |
CN116055853A (en) | Shooting method and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20220422 |