CN112532832A - Imaging device and electronic equipment - Google Patents
- Publication number: CN112532832A
- Application number: CN202011323886.5A
- Authority: CN (China)
- Prior art keywords: bands, pixel, sub, filter, main
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
The present disclosure provides an imaging device and an electronic device, relating to the technical field of optical imaging. The imaging device comprises a beam splitter and an image detector. The beam splitter has n main wavelength bands and is configured to separate a light beam of one main band from the incident light beam and project it to the image detector, where n is a positive integer greater than or equal to 2. The image detector comprises a pixel filter and an image sensor. The pixel filter is formed by an arrangement of m kinds of pixel filter units, where m is a positive integer greater than or equal to 2. Each pixel filter unit has n sub-bands, where the i-th sub-band lies within the i-th of the n main bands, the i-th sub-bands of any two pixel filter units differ, and i is any positive integer in [1, n]. The pixel filter filters the light beam of the one main band into light beams of m sub-bands and projects them to the image sensor, and the image sensor outputs m sub-band images corresponding to the m sub-band light beams. A multispectral image can thus be acquired quickly and effectively.
Description
Technical Field
The present disclosure relates to the field of optical imaging technologies, and in particular, to an imaging device and an electronic apparatus.
Background
Multispectral imaging divides an incident full-band or wide-band optical signal into several narrow-band light beams and images each of them on a corresponding detector, thereby obtaining images of different spectral bands. As demand for multispectral imaging grows, finer spectral resolution is required, so how to divide the spectrum into narrower bands and obtain images of more bands has become a widely studied problem.
In the prior art, a multispectral image is usually obtained by combining a plurality of lenses or cameras fitted with different optical filters. However, because of this hardware structure, it is often difficult to align the separate images of the same photographed object; the registration accuracy is poor and the imaging quality suffers.
Disclosure of Invention
The present disclosure provides an imaging device and an electronic apparatus, so as to mitigate, at least to some extent, the difficulty prior-art multispectral imaging devices have in acquiring a multispectral image accurately and efficiently.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an imaging apparatus comprising a beam splitter and an image detector. The beam splitter has n main wavelength bands and is configured to separate a light beam of one main band from the incident light beam and project it to the image detector, where n is a positive integer greater than or equal to 2. The image detector comprises a pixel filter and an image sensor. The pixel filter is formed by an arrangement of m kinds of pixel filter units, where m is a positive integer greater than or equal to 2. Each pixel filter unit has n sub-bands, where the i-th sub-band lies within the i-th of the n main bands, the i-th sub-bands of any two pixel filter units differ, and i is any positive integer in [1, n]. The pixel filter filters the light beam of the one main band into light beams of m sub-bands and projects them to the image sensor, and the image sensor outputs m sub-band images corresponding to the m sub-band light beams.
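The band structure defined in this aspect can be sketched in code. The following is an illustrative model (all numeric band values, unit names, and the helper functions are assumptions, not taken from the patent) that checks the two stated constraints: the i-th sub-band of every unit lies inside the i-th main band, and the i-th sub-bands of any two units differ.

```python
# Illustrative model of the claimed band structure (all numeric band
# values and unit names below are assumptions, not from the patent).
# A band is a (low_nm, high_nm) wavelength interval.

def within(sub, main):
    """True if the sub-band lies entirely inside the main band."""
    return main[0] <= sub[0] and sub[1] <= main[1]

# n = 2 main bands of the beam splitter
main_bands = [(350, 900), (550, 1100)]

# m = 3 kinds of pixel filter units, each with n = 2 sub-bands
pixel_units = {
    "unit_1": [(350, 390), (720, 760)],
    "unit_2": [(400, 440), (780, 820)],
    "unit_3": [(450, 490), (840, 880)],
}

def validate(main_bands, pixel_units):
    """Check the two constraints stated in the first aspect."""
    for name, subs in pixel_units.items():
        # the i-th sub-band must lie within the i-th main band
        for i, sub in enumerate(subs):
            assert within(sub, main_bands[i]), (name, i)
    # the i-th sub-bands of any two units must differ
    for i in range(len(main_bands)):
        ith = [tuple(subs[i]) for subs in pixel_units.values()]
        assert len(set(ith)) == len(ith), i

validate(main_bands, pixel_units)

# with one main band selected, the sensor outputs m sub-band images
m = len(pixel_units)
print(m)  # → 3
```

With one main band selected by the splitter, the sensor yields one image per kind of pixel filter unit, i.e. m sub-band images per main band.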
According to a second aspect of the present disclosure, there is provided an electronic apparatus including the above imaging device.
The technical scheme of the disclosure has the following beneficial effects:
according to the imaging device and the electronic equipment above, the imaging device comprises a beam splitter having n main wavelength bands and an image detector whose pixel filter is formed by m kinds of pixel filter units, each having n sub-bands with the i-th sub-band lying within the i-th main band and the i-th sub-bands of any two units differing; the image sensor outputs m sub-band images corresponding to the m sub-band light beams. In one aspect, this exemplary embodiment provides a new imaging apparatus: by combining a splitter having n main bands with pixel filter units having n sub-bands, m sub-band images can be obtained from an arrangement of m kinds of pixel filter units without configuring a plurality of filters or cameras, which simplifies the hardware structure. In another aspect, the imaging device avoids the poor image-registration accuracy and poor imaging quality caused by the difficulty of aligning captured images with the photographed object when a plurality of filters or cameras are arranged. In yet another aspect, images of more spectral bands can be obtained simply by switching the main band and the sub-bands, so the imaging process is simple and convenient and the range of application is wide.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 is a schematic structural diagram of an imaging apparatus in the prior art;
Fig. 2 is a waveform diagram of the transmission band of an optical filter in the prior art;
Fig. 3 is a schematic diagram of a pixel filter in the prior art;
Fig. 4 is a waveform diagram of the transmission bands of the pixel filter units in the prior art;
Fig. 5 is a schematic diagram of the electronic device of the present exemplary embodiment;
Fig. 6 is a schematic structural diagram of an imaging apparatus in the present exemplary embodiment;
Fig. 7 is a waveform diagram of the transmission bands of a prefilter in the present exemplary embodiment;
Fig. 8 is a schematic diagram of a pixel filter in the present exemplary embodiment;
Fig. 9 is a waveform diagram of the transmission bands of a pixel filter unit in the present exemplary embodiment;
Fig. 10 is a waveform diagram of the total transmission bands of the imaging apparatus in the present exemplary embodiment;
Fig. 11 is a schematic structural diagram of another imaging apparatus in the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In a prior-art imaging apparatus, as shown in Fig. 1, the imaging apparatus 100 includes a lens 110, an optical filter 120, and an image detector 130, where the image detector 130 includes a pixel filter 131 and an image sensor 132. The lens 110 is disposed between the object and the filter 120 and transmits a polychromatic light beam from the object. The optical filter 120 splits the transmitted beam so as to pass only the light of one main wavelength band; Fig. 2 shows the waveform of the main band transmitted by the optical filter 120, with wavelength on the abscissa, in nanometers (nm), and transmittance on the ordinate. The pixel filter 131 is composed of different pixel filter units: as shown in Fig. 3, it includes three kinds of units, Green (G), Red (R), and Blue (B), which filter light beams of three sub-bands. Fig. 4 shows the waveform of each sub-band: the green unit corresponds to sub-band 2, the red unit to sub-band 3, and the blue unit to sub-band 1. The image sensor 132 outputs a spectral image corresponding to the filtered light beams. In the imaging apparatus of Fig. 1, the three sub-bands lie within the main band, and an image containing three spectral bands is finally obtained.
Exemplary embodiments of the present disclosure provide an electronic device for implementing an imaging apparatus. The electronic device may be implemented in various forms, and may include, for example, a mobile device such as a mobile phone, a tablet computer, a notebook computer, a Personal Digital Assistant (PDA), a navigation device, a wearable device, an unmanned aerial vehicle, and a stationary device such as a desktop computer and a smart television.
The following takes the mobile terminal 500 of Fig. 5 as an example to illustrate the configuration of the electronic device. It will be appreciated by those skilled in the art that, apart from the components specifically intended for mobile use, the configuration of Fig. 5 can also be applied to fixed-type devices. In other embodiments, the mobile terminal 500 may include more or fewer components than shown, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is shown schematically and does not constitute a structural limitation of the mobile terminal 500. In other embodiments, the mobile terminal 500 may adopt an interfacing arrangement different from that of Fig. 5, or a combination of multiple interfacing arrangements.
As shown in Fig. 5, the mobile terminal 500 may specifically include: a processor 510, an internal memory 521, an external memory interface 522, a USB interface 530, a charging management module 540, a power management module 541, a battery 542, an antenna 1, an antenna 2, a mobile communication module 550, a wireless communication module 560, an audio module 570, a speaker 571, a receiver 572, a microphone 573, an earphone interface 574, a sensor module 580, a display screen 590, a camera module 591, an indicator 592, a motor 593, keys 594, a Subscriber Identity Module (SIM) card interface 595, and the like.
Processor 510 may include one or more processing units, such as: processor 510 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, an encoder, a decoder, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural Network Processor (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors. The encoder may encode (i.e., compress) the image or video data to form code stream data; the decoder may decode (i.e., decompress) the codestream data of the image or video to restore the image or video data.
In some implementations, processor 510 may include one or more interfaces. The interfaces may include an Inter-Integrated Circuit (I2C) interface, an Inter-Integrated Circuit Sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver/Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General-Purpose Input/Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc. Connections with the other components of the mobile terminal 500 are made through these different interfaces.
The USB interface 530 is an interface conforming to the USB standard specification, and may specifically be a Mini-USB interface, a Micro-USB interface, a USB Type-C interface, or the like. The charging management module 540 is configured to receive charging input from a charger. The charging management module 540 may also provide power to the device via the power management module 541 while charging the battery 542. The power management module 541 connects the battery 542, the charging management module 540, and the processor 510. The power management module 541 receives input from the battery 542 and/or the charging management module 540, provides power to the various parts of the mobile terminal 500, and may also be used to monitor the status of the battery.
The wireless communication function of the mobile terminal 500 may be implemented by the antenna 1, the antenna 2, the mobile communication module 550, the wireless communication module 560, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 550 may provide a solution including 2G/3G/4G/5G wireless communication applied on the mobile terminal 500.
The Wireless communication module 560 may provide Wireless communication solutions including a Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), a Global Navigation Satellite System (GNSS), Frequency Modulation (FM), etc., which are applied to the mobile terminal 500. The wireless communication module 560 may be one or more devices integrating at least one communication processing module. The wireless communication module 560 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 510. The wireless communication module 560 may also receive a signal to be transmitted from the processor 510, frequency-modulate it, amplify it, and convert it into electromagnetic waves via the antenna 2 to radiate it.
In some embodiments, the antenna 1 of the mobile terminal 500 is coupled to the mobile communication module 550 and the antenna 2 is coupled to the wireless communication module 560, such that the mobile terminal 500 can communicate with networks and other devices through wireless communication technologies. The wireless communication technology may include Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), WLAN, FM, and the like.
The mobile terminal 500 implements a display function through the GPU, the display screen 590, the application processor, and the like. The GPU is used to perform mathematical and geometric calculations to achieve graphics rendering and to connect the display screen 590 with the application processor. Processor 510 may include one or more GPUs that execute program instructions to generate or alter display information. The mobile terminal 500 may include one or more display screens 590 for displaying images, videos, and the like.
The mobile terminal 500 may implement a photographing function through the ISP, the camera module 591, the encoder, the decoder, the GPU, the display screen 590, the application processor, and the like.
The camera module 591 is configured to capture still images or videos, collect optical signals through the photosensitive element, and convert the optical signals into electrical signals. The ISP is used for processing the data fed back by the camera module 591 and converting the electric signal into a digital image signal.
The external memory interface 522 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the mobile terminal 500.
The internal memory 521 may be used to store computer-executable program code, which includes instructions. The internal memory 521 may include a program storage area and a data storage area.
The mobile terminal 500 may implement audio functions through the audio module 570, the speaker 571, the receiver 572, the microphone 573, the earphone interface 574, the application processor, and the like. The audio module 570 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 570 may also be used to encode and decode audio signals. A speaker 571 for converting the audio electric signal into a sound signal. The receiver 572 is configured to convert the audio electrical signal into an acoustic signal. The microphone 573 converts a sound signal into an electric signal. The earphone interface 574 is used to connect wired earphones.
The sensor module 580 may include a touch sensor 5801, a pressure sensor 5802, a gyro sensor 5803, an air pressure sensor 5804, and the like. In addition, sensors with other functions, such as a depth sensor, an acceleration sensor, a distance sensor, etc., may be disposed in the sensor module 580 according to actual needs.
Indicator 592 can be an indicator light that can be used to indicate a charge status, a charge change, a message, a missed call, a notification, etc.
The motor 593 may generate a vibration prompt, such as a vibration prompt for incoming calls, alarm clocks, receiving messages, etc., and may also be used for touch vibration feedback, etc.
The keys 594 include a power-on key, a volume key, and the like. The keys 594 may be mechanical keys. Or may be touch keys. The mobile terminal 500 may receive a key input, and generate a key signal input related to user setting and function control of the mobile terminal 500.
The mobile terminal 500 may support one or more SIM card interfaces 595 for connecting to a SIM card, so that the mobile terminal 500 interacts with a network through the SIM card to implement functions such as communication and data communication.
The following describes in detail an imaging apparatus according to an exemplary embodiment of the present disclosure. Application scenarios of this exemplary embodiment include, but are not limited to: obtaining a multispectral image and analyzing changes in its spectral curves so as to assist automatic white balance, identify gems or medicinal materials, manage skin health, and the like.
Fig. 6 shows a schematic configuration diagram of an image forming apparatus in the present exemplary embodiment. The apparatus 600 may include a beam splitter 610 and an image detector 620.
The beam splitter 610 performs splitting processing on polychromatic light: passing the incident beam through the splitter yields a light beam of a specific wavelength band. For example, the splitter may be an optical filter, and different optical filters pass light of different spectral bands. In the imaging apparatus shown in Fig. 1, for instance, the filter 120 can only transmit one fixed band, whose waveform is shown in Fig. 2; that is, the filter 120 has a single main wavelength band.
In the present exemplary embodiment, the beam splitter may have n main wavelength bands, and a light beam of one main band can be separated from the incident beam by the splitter and projected to the image detector, where n is a positive integer greater than or equal to 2. That is, the splitter can pass light of n different spectral bands and, under different conditions, separates the light beam of one of those main bands.
In an exemplary embodiment, the splitter may have n splitting states, corresponding one-to-one to the n main bands; when the splitter is in the j-th splitting state, the light beam of the j-th main band is separated from the incident beam, where j is any positive integer in [1, n].
Since the splitter may have n main bands, in practical application a light beam of one of the n main bands can be separated under different conditions; that is, the splitter performs splitting of different main bands in different splitting states. A splitting state is a preset condition under which a particular main band is separated. Specifically, a voltage condition may serve as the splitting state: different main bands are separated by setting different voltage values or voltage ranges. For example, when a 10 V voltage is supplied, a main band of shorter wavelengths may pass; when a 15 V voltage is supplied, a main band of longer wavelengths may pass. Supplying 10 V and supplying 15 V are then two splitting states: in the 1st splitting state the light beam of the 1st main band is separated, and in the 2nd splitting state the light beam of the 2nd main band is separated.
In addition, the splitting state may be set based on other conditions, for example the specific application scenario. Suppose application scenario 1 (assisting automatic white balance) corresponds to the 1st splitting state and application scenario 2 (gem identification) corresponds to the 2nd. When scenario 1 is determined to be current, the light beam of the 1st main band is separated; when scenario 2 is current, the light beam of the 2nd main band is separated; and so on. The switching of the splitting state for each scenario may be preset, e.g. by configuring corresponding voltage values, voltage ranges, or other parameters for scenario 1 and scenario 2 respectively; the present disclosure does not specifically limit this.
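The state-selection logic described above can be sketched as a simple lookup. The voltage values, scenario names, and the function interface below are illustrative assumptions, not specified by the patent:

```python
# Hypothetical lookup tables (voltage values, scenario names, and this
# interface are illustrative assumptions, not specified by the patent).

SPLIT_STATES = {      # splitting state j -> drive voltage (V)
    1: 10.0,          # passes the shorter-wavelength main band
    2: 15.0,          # passes the longer-wavelength main band
}

SCENARIO_TO_STATE = { # application scenario -> splitting state j
    "auto_white_balance": 1,
    "gem_identification": 2,
}

def select_main_band(scenario):
    """Return (j, voltage): the state and voltage separating the j-th main band."""
    j = SCENARIO_TO_STATE[scenario]
    return j, SPLIT_STATES[j]

print(select_main_band("auto_white_balance"))  # → (1, 10.0)
```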
In an exemplary embodiment, the beam splitter may be a pre-filter.
The prefilter may be configured as a filter that can switch among different transmitted main bands. For example, a Fabry-Perot interferometer may be used as the prefilter, separating one main band from the light beam by switching the applied voltage; a diffraction grating may be used instead, dispersing the light by diffraction to separate the main bands; or electrochromic sheets may be stacked to perform the splitting, and so on. The present disclosure does not specifically limit this.
In particular, in the present exemplary embodiment, the prefilter may be a Fabry-Perot interference layer with electrodes located on its two sides; the n splitting states are n different voltages applied to the electrodes, which cause the Fabry-Perot interference layer to filter light beams of different main bands.
That is, the present exemplary embodiment may use a Fabry-Perot interferometer as the prefilter, which is then switched among different transmitted main bands by supplying different voltages. Taking the switching of two main bands as an example, Fig. 7 shows a schematic waveform diagram of the separated main bands when a Fabry-Perot interferometer serves as the prefilter: when a 10 V voltage is supplied, light of shorter wavelengths passes, with waveform A; when a 15 V voltage is supplied, light of longer wavelengths passes, with waveform B. In practical application, the user can adjust the supply voltage of the Fabry-Perot interferometer according to actual requirements, for example by communicating directly with a power-supply chip or by configuring the feedback resistance of the power supply through a switch, thereby changing the voltage and obtaining different waveforms. The trigger condition for voltage switching can also be customized to the user's needs, exposing according to whichever bands are required. The above voltages are merely exemplary; other voltage values or ranges may be supplied as needed, a boost chip may be used, and conditions for passing more main bands may be set for different application scenarios.
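The tuning principle behind this can be illustrated with general Fabry-Perot etalon physics (this is textbook optics, not the patent's specific design): at normal incidence, a cavity of gap d and refractive index n transmits wavelengths satisfying 2nd = kλ for integer order k, so changing the gap (here, via the electrode voltage) shifts the transmitted band.

```python
# Sketch of Fabry-Perot tuning (general etalon physics, not the patent's
# specific design): transmission maxima occur where 2 * index * gap = k * lambda.

def transmission_peaks_nm(gap_nm, index=1.0, lo=350, hi=1100):
    """Wavelengths (nm) of FP transmission maxima inside [lo, hi]."""
    peaks = []
    k = 1
    while True:
        lam = 2 * index * gap_nm / k   # wavelength of the k-th order peak
        if lam < lo:                   # all further orders are shorter still
            break
        if lam <= hi:
            peaks.append(round(lam, 1))
        k += 1
    return peaks

# A larger gap (e.g. driven by a different voltage) shifts the peaks to
# longer wavelengths -- the main-band switching described above.
print(transmission_peaks_nm(400))  # → [800.0, 400.0]
```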
The image detector 620 includes a pixel filter 621 and an image sensor 622.
The pixel filter is formed by an arrangement of m kinds of pixel filter units, where m is a positive integer greater than or equal to 2. The pixel filter, i.e. the filtering microlens layer in the detector, may be divided into a plurality of filter units: in the schematic of Fig. 3, the pixel filter 131 includes 3 kinds of pixel filter units. In the present exemplary embodiment, the pixel filter may have more kinds of units; the pixel filter shown in Fig. 8 may include 9 kinds of pixel filter units.
Each pixel filter unit has n sub-bands, where the i-th sub-band lies within the i-th of the n main bands, the i-th sub-bands of any two pixel filter units differ, and i is any positive integer in [1, n]. The pixel filter filters the light beam of one main band into light beams of m sub-bands and projects them to the image sensor, and the image sensor outputs m sub-band images corresponding to the m sub-band light beams.
The sub-bands are similar to the main bands: both are bands passed by a filter, except that a main band is transmitted by the prefilter while a sub-band is transmitted by a pixel filter unit. In the present exemplary embodiment, each pixel filter unit has several sub-bands; that is, each unit can pass light of n different spectral bands, and the transmitted sub-bands correspond to the main bands. For example, in the main-band waveform diagram of Fig. 7, the 1st main band (waveform A) spans approximately 350-900 nm and the 2nd main band (waveform B) approximately 550-1100 nm. Each pixel filter unit of the pixel filter shown in Fig. 9 can transmit two bands and thus has two peaks: the 1st sub-band a lies approximately within 350-390 nm, inside the 1st main band A, and the 2nd sub-band b lies within 720-760 nm, inside the 2nd main band B. That the i-th sub-bands of any two pixel filter units are different means that pixel filter units of different colors transmit sub-bands of different wavelengths.
Further, combined with the m kinds of pixel filter units in the pixel filter, m × n bands can be determined. With the 2 main bands and 9 kinds of pixel filter units shown in Figs. 7-9, each unit transmitting two sub-bands, the total-transmission waveform diagram of Fig. 10 is obtained, containing 18 bands. When waveform A of the 1st main band is selected, the image of the 1st sub-band of each pixel filter unit, i.e. the first 9 bands, can be output; when waveform B of the 2nd main band is selected, the image of the 2nd sub-band of each unit, i.e. the last 9 bands, can be output. On this basis, light-intensity information of 18 spectral bands can be obtained by the present exemplary embodiment.
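The band count worked out above can be sketched numerically. The unit counts follow Figs. 7-9; the image-index convention is an illustrative assumption:

```python
# Band-count sketch following Figs. 7-9 (the image-index convention
# below is an illustrative assumption, not from the patent).

n_main = 2    # main bands A and B of the prefilter
m_units = 9   # kinds of pixel filter units in the pixel filter

total_bands = m_units * n_main
print(total_bands)  # → 18

def output_images(main_band_index, m_units):
    """Indices of the sub-band images output while one main band is selected."""
    base = main_band_index * m_units
    return list(range(base, base + m_units))

print(output_images(0, m_units))  # first 9 band images (main band A selected)
print(output_images(1, m_units))  # last 9 band images (main band B selected)
```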
In an exemplary embodiment, each of the pixel filtering units described above has an independent control circuit switch, which can be used to control the operating state of each of the pixel filtering units.
The operating state may refer to whether a pixel filtering unit transmits its corresponding sub-bands. For example, when red, green, and blue pixel filtering units are included, if the red pixel filtering unit is switched off while the green and blue pixel filtering units remain on, only the sub-bands transmitted by the green and blue units pass through. This allows customized adjustment of the output bands, so that light intensity information of the desired spectral bands is finally obtained.
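The per-unit switch control can be sketched as follows; the unit names and band values are hypothetical, chosen only to mirror the red/green/blue example above:

```python
# Each pixel filtering unit has an independent switch controlling whether its
# sub-band reaches the sensor (band values in nm, invented for illustration).
sub_bands = {"red": (620, 660), "green": (520, 560), "blue": (450, 490)}
switch_on = {"red": False, "green": True, "blue": True}  # red unit switched off

# Only units whose switch is on contribute their sub-bands to the output.
transmitted = {name: band for name, band in sub_bands.items() if switch_on[name]}
print(sorted(transmitted))  # ['blue', 'green']
```

Toggling the switch dictionary thus customizes which spectral sections appear in the final output, as the paragraph above describes.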
In an exemplary embodiment, the pixel filtering unit is coated with a color-changeable film layer. In the present exemplary embodiment, this coating enables the pixel filtering unit to have two or more sub-bands.
Further, the pixel filter is formed by repeatedly arranging a filter array, and the filter array comprises the m kinds of pixel filtering units.
The filter array refers to a unit array constituting a pixel filter, for example, in the pixel filter 131 shown in fig. 4, the filter array is composed of three kinds of pixel filter units of red, green, and blue, and in the pixel filter 621 shown in fig. 8, the filter array is composed of 9 kinds of pixel filter units.
It should be noted that the pixel filter units in the filter array may be the same or different, that is, the filter array may include repeated pixel filter units, for example, the repeated green pixel filter unit in fig. 4, or may include non-repeated pixel filter units, for example, 9 pixel filter units in the pixel filter shown in fig. 8 are not repeated. In other words, when there are k pixel filter units in the filter array, the number m of the types of the pixel filter units may be less than or equal to k, which is not specifically limited by the present disclosure.
In particular, in order to enable the image sensor to output images of more sub-bands, in an exemplary embodiment, the filter array may be configured with exactly m pixel filtering units, i.e., such that m is equal to k.
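The two array cases discussed above can be sketched in code. This is an illustrative model (array layouts and unit labels are hypothetical): a Bayer-like array repeats the green unit so m < k, while a 3×3 array of distinct units, as in fig. 8, gives m = k.

```python
# Filter arrays as grids of pixel-filtering-unit labels (labels hypothetical).
bayer_array = [["R", "G"],
               ["G", "B"]]            # k = 4 cells, m = 3 kinds (G repeats)
nine_array = [["1", "2", "3"],
              ["4", "5", "6"],
              ["7", "8", "9"]]        # k = 9 cells, m = 9 kinds, no repeats

def kinds_and_cells(array):
    """Return (m, k): number of unit kinds and number of cells."""
    cells = [unit for row in array for unit in row]
    return len(set(cells)), len(cells)

print(kinds_and_cells(bayer_array))  # (3, 4): m < k
print(kinds_and_cells(nine_array))   # (9, 9): m == k

# The pixel filter is formed by tiling the filter array across the sensor:
def tile(array, rows, cols):
    h, w = len(array), len(array[0])
    return [[array[r % h][c % w] for c in range(cols)] for r in range(rows)]

assert tile(bayer_array, 4, 4)[2][3] == "G"  # array repeats with period 2x2
```

When m = k, every cell of the repeated array contributes a distinct sub-band, which is what allows the sensor to output the maximum number of sub-band images.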
In an exemplary embodiment, the imaging device 600 may further include an injector 630, which may be located at the front end of the optical path of the beam splitter 610 or between the beam splitter 610 and the image detector 620. Fig. 11 shows a schematic structural diagram of the imaging device when the injector 630 is located at the front end of the optical path of the beam splitter 610. In this position, the injector can contract the received divergent light beam to some extent, so that the incident angle of the beam is not too large when it reaches the beam splitter, thereby avoiding an adverse influence on the imaging effect.
In an exemplary embodiment, the above-mentioned injector may include an objective lens and an entrance slit. The entrance slit can be used to block light outside the slit during filtering, so as to ensure that the incident beam is a narrow beam.
In an exemplary embodiment, the beam splitter may be located on a housing of the imaging device.
In practical applications, the imaging device is usually configured in a terminal device, such as a mobile phone or a tablet computer. To simplify the hardware of the terminal device, the beam splitter may be disposed on a housing of the imaging device; for example, the pre-filter may be disposed on the rear cover of a mobile phone, so that the phone can be made thinner and a better use experience is provided for the user.
In summary, the present exemplary embodiment provides an imaging device including a beam splitter and an image detector. The beam splitter has n main bands and is configured to separate a light beam of one main band from incident light and project it to the image detector, n being a positive integer greater than or equal to 2. The image detector includes a pixel filter and an image sensor. The pixel filter is formed by arranging m kinds of pixel filtering units, m being a positive integer greater than or equal to 2; each pixel filtering unit has n sub-bands, wherein the ith sub-band is within the ith main band of the n main bands, the ith sub-bands of any two pixel filtering units are different, and i is any positive integer within [1, n]. The pixel filter filters the light beam of one main band into light beams of m sub-bands and projects them to the image sensor, and the image sensor outputs m sub-band images corresponding to the m sub-band light beams. In one aspect, this provides a new imaging device: using a beam splitter having n main bands and pixel filtering units each having n sub-bands, m sub-band images can be obtained per main band without configuring a plurality of filters or cameras, thereby simplifying the hardware structure. In another aspect, the imaging device avoids the poor image-registration accuracy and poor imaging quality caused by the difficulty of aligning images of the photographed object when a plurality of filters or cameras are arranged. In yet another aspect, images of more spectral bands can be obtained simply by switching the main band and sub-band, so the imaging process is simple and convenient and the application range is wide.
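The overall acquisition scheme summarized above can be sketched end to end. All hardware-facing functions below are hypothetical placeholder stubs, not the patent's interfaces: for each of the n main-band states of the beam splitter, one raw frame is read and demultiplexed into m sub-band images, giving n × m images in total.

```python
def capture_all_bands(n_main_bands, m_unit_kinds):
    """Collect n * m sub-band images by stepping through splitter states."""
    images = []
    for j in range(n_main_bands):
        set_splitter_state(j)        # e.g. apply the j-th electrode voltage
        frame = read_sensor_frame()  # one raw frame through the pixel filter
        # Demultiplex the frame into one image per unit kind: each kind's
        # pixels carry its j-th sub-band, inside the j-th main band.
        images.extend(split_by_unit_kind(frame, m_unit_kinds))
    return images

# Placeholder stubs so the sketch runs standalone:
def set_splitter_state(j): pass
def read_sensor_frame(): return "frame"
def split_by_unit_kind(frame, m): return [f"{frame}-unit{k}" for k in range(m)]

print(len(capture_all_bands(2, 9)))  # 18
```

With n = 2 splitter states and m = 9 unit kinds, the loop yields the 18 sub-band images described in the embodiment, without any additional filters or cameras.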
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a program product for implementing the above method, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++, as well as conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the following claims.
Claims (12)
1. An imaging device, comprising a beam splitter and an image detector;
the beam splitter has n main bands and is configured to separate a light beam of one main band from incident light and project it to the image detector, n being a positive integer greater than or equal to 2;
the image detector comprises a pixel filter and an image sensor; the pixel filter is formed by arranging m kinds of pixel filtering units, m being a positive integer greater than or equal to 2; each pixel filtering unit has n sub-bands, wherein the ith sub-band is within the ith main band of the n main bands, the ith sub-bands of any two pixel filtering units are different, and i is any positive integer within [1, n];
the pixel filter is used for filtering the light beam of the main wave band into light beams of m sub wave bands so as to project the light beams to the image sensor;
the image sensor is used for outputting m sub-wave band images corresponding to the m sub-wave band light beams.
2. The imaging apparatus according to claim 1, wherein the beam splitter includes n splitting states corresponding to the n main bands, respectively;
when the optical splitter is in a jth light splitting state, a jth main wave band light beam is separated from the light beam; j is any positive integer in [1, n ].
3. The imaging apparatus of claim 2, wherein the beam splitter comprises a pre-filter.
4. The imaging device of claim 3, wherein the pre-filter comprises a Fabry-Perot interference layer and electrodes on both sides of the Fabry-Perot interference layer; the n splitting states are n different voltages applied to the electrodes, and are used for enabling the Fabry-Perot interference layer to filter light beams in different main wave bands.
5. The imaging device of claim 1, wherein the pixel filter unit is coated with a color-changeable film layer.
6. The imaging device of claim 1, wherein the pixel filter is formed by repeatedly arranging a filter array, the filter array including the m kinds of pixel filtering units.
7. The imaging device of claim 6, wherein the filter array comprises m pixel filtering units.
8. The imaging apparatus of claim 1, further comprising an injector located at an optical path front end of the beam splitter or between the beam splitter and the image detector.
9. The imaging apparatus of claim 8, wherein the injector comprises an objective lens and an entrance slit.
10. The imaging apparatus of claim 1, wherein the beam splitter is located on a housing of the imaging apparatus.
11. The imaging apparatus of claim 1, wherein each pixel filter unit has an independent control circuit switch for controlling the operating state of each pixel filter unit.
12. An electronic device characterized by comprising the imaging apparatus according to any one of claims 1 to 11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011323886.5A CN112532832B (en) | 2020-11-23 | 2020-11-23 | Imaging device and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011323886.5A CN112532832B (en) | 2020-11-23 | 2020-11-23 | Imaging device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112532832A true CN112532832A (en) | 2021-03-19 |
CN112532832B CN112532832B (en) | 2022-04-12 |
Family
ID=74992788
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011323886.5A Active CN112532832B (en) | 2020-11-23 | 2020-11-23 | Imaging device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112532832B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113784043A (en) * | 2021-08-26 | 2021-12-10 | 昆山丘钛微电子科技股份有限公司 | Camera module control circuit, camera module control method, camera module and electronic equipment |
CN115242949A (en) * | 2022-07-21 | 2022-10-25 | Oppo广东移动通信有限公司 | Camera module and electronic equipment |
WO2022233055A1 (en) * | 2021-05-07 | 2022-11-10 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Camera system, method for skin characteristics analysis and non-transitory computer-readable medium storing program instructions for implementing the method |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070247615A1 (en) * | 2006-04-21 | 2007-10-25 | Faro Technologies, Inc. | Camera based six degree-of-freedom target measuring and target tracking device with rotatable mirror |
US20090027518A1 (en) * | 2007-07-24 | 2009-01-29 | Casio Computer Co., Ltd. | Image pick-up apparatus and method of controlling the image pick-up apparatus |
CN102116673A (en) * | 2011-01-27 | 2011-07-06 | 北京空间机电研究所 | Catadioptric hybrid multispectral imaging system |
CN103091258A (en) * | 2013-01-29 | 2013-05-08 | 中国科学院光电研究院 | Multispectral imager based on liquid zooming technology |
US20130188023A1 (en) * | 2012-01-23 | 2013-07-25 | Omnivision Technologies, Inc. | Image sensor with optical filters having alternating polarization for 3d imaging |
CN103234527A (en) * | 2013-04-07 | 2013-08-07 | 南京理工大学 | Multispectral light-field camera |
CN105157835A (en) * | 2015-09-15 | 2015-12-16 | 中国科学院光电研究院 | Snapshot-type multispectral image multiple-splitting spectral imaging method and spectral imager |
US20160061661A1 (en) * | 2014-08-29 | 2016-03-03 | Seiko Epson Corporation | Spectroscopic image acquiring apparatus and spectroscopic image acquiring method |
CN107121191A (en) * | 2016-12-23 | 2017-09-01 | 中国电子科技集团公司信息科学研究院 | A kind of self-adapting tuning infrared multispectral detects micro-system |
CN108844630A (en) * | 2018-04-16 | 2018-11-20 | Oppo广东移动通信有限公司 | Imaging device, control method, electronic device, storage medium and computer equipment |
CN108965665A (en) * | 2018-07-19 | 2018-12-07 | 维沃移动通信有限公司 | A kind of imaging sensor and mobile terminal |
CN111272687A (en) * | 2020-03-27 | 2020-06-12 | 东北大学 | Hazardous gas real-time detection device based on infrared multispectral imaging |
CN111458051A (en) * | 2020-03-09 | 2020-07-28 | 西安电子科技大学 | Three-dimensional temperature field measuring system and method based on pixel-level spectral photodetector |
CN211481355U (en) * | 2020-03-11 | 2020-09-11 | Oppo广东移动通信有限公司 | Multispectral sensing structure, sensor and camera |
Non-Patent Citations (2)
Title |
---|
FENG Shan et al.: "Design of a Four-Channel Visible Spectral Camera", Journal of Applied Optics *
CAO Congfeng et al.: "Development of a UAV-Borne Multispectral Camera Based on Filter-Array Light Splitting", Optical Technique *
Also Published As
Publication number | Publication date |
---|---|
CN112532832B (en) | 2022-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112532832B (en) | Imaging device and electronic equipment | |
CN111179282B (en) | Image processing method, image processing device, storage medium and electronic apparatus | |
CN107018298B (en) | Imaging apparatus, electronic apparatus, and method for obtaining image by the same | |
CN112954251B (en) | Video processing method, video processing device, storage medium and electronic equipment | |
CN111953899B (en) | Image generation method, image generation device, storage medium, and electronic apparatus | |
CN113810598A (en) | Photographing method and device | |
CN111741303B (en) | Deep video processing method and device, storage medium and electronic equipment | |
US20220214539A1 (en) | Camera Module and Terminal Device | |
CN113473013A (en) | Display method and device for beautifying effect of image and terminal equipment | |
CN110764249B (en) | Image sensor, camera module and terminal equipment | |
EP4206780A1 (en) | Lens assembly and electronic device comprising same | |
US20230308530A1 (en) | Data Transmission Method and Electronic Device | |
EP4227905A1 (en) | Method and apparatus for eliminating interference pattern from image | |
CN112217996B (en) | Image processing method, image processing apparatus, storage medium, and electronic device | |
CN113409205B (en) | Image processing method, image processing device, storage medium and electronic apparatus | |
CN111245551B (en) | Signal processing method, signal processing device, mobile terminal and storage medium | |
CN111294905B (en) | Image processing method, image processing device, storage medium and electronic apparatus | |
EP4249979A1 (en) | Lens assembly and electronic device comprising same | |
CN117440194A (en) | Method and related device for processing screen throwing picture | |
CN115412678A (en) | Exposure processing method and device and electronic equipment | |
CN111626931A (en) | Image processing method, image processing apparatus, storage medium, and electronic device | |
US20130277788A1 (en) | Imaging unit and imaging device | |
CN111626929B (en) | Depth image generation method and device, computer readable medium and electronic equipment | |
CN113740997A (en) | Lens assembly, camera module and electronic equipment | |
CN117639820B (en) | Wi-Fi device and radio frequency control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |