CN111563950B - Texture mapping strategy determination method, device and computer readable storage medium - Google Patents

Texture mapping strategy determination method, device and computer readable storage medium

Info

Publication number
CN111563950B
Authority
CN
China
Prior art keywords
depth
area
depth value
determining
projection area
Prior art date
Legal status
Active
Application number
CN202010379743.XA
Other languages
Chinese (zh)
Other versions
CN111563950A (en)
Inventor
程谟方
赵靖
Current Assignee
You Can See Beijing Technology Co ltd AS
Original Assignee
You Can See Beijing Technology Co ltd AS
Priority date
Filing date
Publication date
Application filed by You Can See Beijing Technology Co ltd AS
Priority to CN202010379743.XA
Publication of CN111563950A
Application granted
Publication of CN111563950B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/04 Texture mapping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

Embodiments of the present disclosure disclose a texture mapping policy determination method, apparatus, and computer-readable storage medium. The method comprises the following steps: acquiring a three-dimensional model, N captured images of the same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images, wherein any two of the N captured images have different shooting parameters; selecting, from the N depth maps, a depth map for projection of a triangular patch on the three-dimensional model; determining a projection area of the triangular patch on the selected depth map; acquiring an effective area ratio and a depth value error of the projection area; determining, according to the area of the projection area, the effective area ratio and the depth value error, an energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map; and determining a texture mapping strategy between the three-dimensional model and the N captured images according to the energy equation data items. The embodiments of the disclosure can improve texture mapping precision and the texture mapping effect.

Description

Texture mapping strategy determination method, device and computer readable storage medium
Technical Field
The present disclosure relates to the field of texture mapping technologies, and in particular, to a method and apparatus for determining a texture mapping policy, and a computer readable storage medium.
Background
In some cases, to improve the display effect of a three-dimensional model, texture mapping may be performed between the three-dimensional model and a plurality of captured images corresponding to the same scene. To implement texture mapping, a specific texture mapping strategy needs to be determined first, that is, it needs to be determined which of the plurality of captured images each triangular patch on the three-dimensional model is mapped with. At present, the texture mapping strategy is determined as follows: a triangular patch is projected onto a captured image, the gradient magnitude is integrated over the projection area on that captured image to obtain the energy equation data item associated with the triangular patch and the captured image, and after a large number of energy equation data items have been obtained, an energy equation is solved to obtain the final texture mapping strategy.
However, practical experience shows that the texture mapping strategy determined in this way is often unsuitable and prone to texture mapping errors; for example, the texture of a staircase may be mistakenly mapped onto a wall, as shown in the box in fig. 1. The texture mapping effect in the related art is therefore poor.
Disclosure of Invention
The present disclosure has been made in order to solve the above technical problems. Embodiments of the present disclosure provide a texture mapping policy determination method, apparatus, and computer-readable storage medium.
According to an aspect of an embodiment of the present disclosure, there is provided a texture mapping policy determining method, including:
acquiring a three-dimensional model, N captured images of the same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images; wherein N is an integer greater than or equal to 2, and any two of the N captured images have different shooting parameters;
selecting, from the N depth maps, a depth map for projection of a triangular patch on the three-dimensional model;
determining a projection area of the triangular patch on the selected depth map;
acquiring an effective area ratio and a depth value error of the projection area;
determining, according to the area of the projection area, the effective area ratio and the depth value error, an energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map;
and determining a texture mapping strategy between the three-dimensional model and the N captured images according to the energy equation data items.
In an optional example,
the energy equation data item is positively correlated with the area of the projection region;
the energy equation data item is positively correlated with the effective area ratio;
the energy equation data item is inversely related to the depth value error.
In an optional example, the determining, according to the area of the projection area, the effective area ratio and the depth value error, the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map includes:
calculating the sum of the depth value error and a preset value; wherein the preset value is a non-zero value;
calculating the product of the area of the projection area and the effective area ratio;
calculating the ratio of the product to the sum;
and taking the calculated ratio as the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map.
In an optional example, the acquiring the effective area ratio and the depth value error of the projection area includes:
determining the pixel points in the projection area for which a measured depth value is recorded in the selected depth map;
performing depth value interpolation on the pixel points in the projection area according to the real depth values of the vertices of the triangular patch in the three-dimensional model, to obtain an interpolation result;
obtaining the real depth values of the determined pixel points from the interpolation result;
and calculating the depth value error of the projection area according to the real depth values and the measured depth values of the determined pixel points.
In an optional example, the acquiring the effective area ratio and the depth value error of the projection area includes:
determining the total number of pixel points in the projection area;
determining the number of pixel points in the projection area for which a measured depth value is recorded in the selected depth map;
calculating the ratio of the number to the total number;
and taking the calculated ratio as the effective area ratio of the projection area.
In an optional example, the selecting, from the N depth maps, a depth map for projection of a triangular patch on the three-dimensional model includes:
selecting, from the N captured images, a captured image for the triangular patch on the three-dimensional model; wherein the triangular patch is visible in the selected captured image;
determining, among the N depth maps, the depth map corresponding to the selected captured image;
and taking the determined depth map as the depth map selected for the triangular patch.
According to another aspect of the embodiments of the present disclosure, there is provided a texture mapping policy determining apparatus, including:
the first acquisition module is used for acquiring a three-dimensional model, N captured images of the same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images; wherein N is an integer greater than or equal to 2, and any two of the N captured images have different shooting parameters;
the selection module is used for selecting a depth map for triangular patch projection on the three-dimensional model from the N depth maps;
the first determining module is used for determining a projection area of the triangular patch on the selected depth map;
the second acquisition module is used for acquiring an effective area ratio and a depth value error of the projection area;
the second determining module is used for determining, according to the area of the projection area, the effective area ratio and the depth value error, an energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map;
And the third determining module is used for determining a texture mapping strategy between the three-dimensional model and the N photographed images according to the energy equation data item.
In an optional example,
the energy equation data item is positively correlated with the area of the projection region;
the energy equation data item is positively correlated with the effective area ratio;
the energy equation data item is inversely related to the depth value error.
In an alternative example, the second determining module includes:
the first calculation unit is used for calculating the sum of the depth value error and a preset value; wherein the preset value is a non-zero value;
a second calculation unit, configured to calculate the product of the area of the projection area and the effective area ratio;
a third calculation unit, configured to calculate the ratio of the product to the sum;
and a first determining unit, configured to take the calculated ratio as the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map.
In an alternative example, the second acquisition module includes:
a second determining unit, configured to determine the pixel points in the projection area for which a measured depth value is recorded in the selected depth map;
a first acquisition unit, configured to perform depth value interpolation on the pixel points in the projection area according to the real depth values of the vertices of the triangular patch in the three-dimensional model, to obtain an interpolation result;
a second acquisition unit, configured to obtain the real depth values of the determined pixel points from the interpolation result;
and a fourth calculation unit, configured to calculate the depth value error of the projection area according to the real depth values and the measured depth values of the determined pixel points.
In an alternative example, the second acquisition module includes:
a third determining unit configured to determine a total number of pixel points in the projection area;
a fourth determining unit, configured to determine the number of pixel points in the projection area for which a measured depth value is recorded in the selected depth map;
a fifth calculation unit, configured to calculate the ratio of the number to the total number;
and a fifth determining unit, configured to take the calculated ratio as the effective area ratio of the projection area.
In an alternative example, the selection module includes:
a selection unit, configured to select, from the N captured images, a captured image for the triangular patch on the three-dimensional model; wherein the triangular patch is visible in the selected captured image;
A sixth determining unit, configured to determine a depth map corresponding to the selected captured image from the N depth maps;
and a seventh determining unit, configured to use the determined depth map as the depth map selected for the triangular patch.
According to still another aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium storing a computer program for performing the above-described texture mapping policy determination method.
According to still another aspect of the embodiments of the present disclosure, there is provided an electronic device including:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the texture mapping policy determination method described above.
In the embodiments of the disclosure, to implement texture mapping, a three-dimensional model, N captured images of the same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images may be acquired; a triangular patch on the three-dimensional model is projected onto the depth map corresponding to a suitable captured image to determine the corresponding projection area, and the corresponding energy equation data item is determined according to the area of the projection area, the effective area ratio and the depth value error. The area of the projection area is related to the resolution of the image, the effective area ratio reflects the degree to which the image is affected by strong light (strong light tends to cause missing depth), and the depth value error reflects how well the triangular patch suits the captured image corresponding to the depth map it is projected onto. The energy equation data item determined from these three quantities can therefore effectively predict the mapping effect (for example, whether occlusion exists, whether the occlusion is severe, and whether the resolution is high) when the triangular patch is mapped with the captured image corresponding to the projected depth map. A suitable texture mapping strategy can then be determined from the energy equation data items, so that image areas with higher resolution are selected for mapping while strong-light areas and image areas with erroneous depth information are avoided.
Therefore, in the embodiments of the disclosure, the energy equation data item is determined by introducing the area of the projection area, the effective area ratio and the depth value error, which makes the data item more reasonable, ensures the reliability of the determined texture mapping strategy, and can effectively improve texture mapping precision and the texture mapping effect. Even if the three-dimensional model is inaccurate (for example, in the case of missing model geometry or missing depth), the embodiments of the disclosure can still achieve a good texture mapping effect.
The technical scheme of the present disclosure is described in further detail below through the accompanying drawings and examples.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing embodiments thereof in more detail with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the disclosure, and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure, without limitation to the disclosure. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1 is a schematic diagram of texture mapping effects in the related art.
Fig. 2 is a flow chart illustrating a method for determining texture mapping policy according to an exemplary embodiment of the present disclosure.
Fig. 3 is a schematic diagram of texture mapping effects in an embodiment of the present disclosure.
FIG. 4 is a schematic diagram of a graph for constructing an energy equation.
Fig. 5 is a schematic diagram of a texture mapping strategy.
FIG. 6 is a flow chart of determining energy equation data items in an exemplary embodiment of the present disclosure.
Fig. 7 is a schematic structural diagram of a texture mapping policy determining apparatus according to an exemplary embodiment of the present disclosure.
Fig. 8 is a block diagram of an electronic device provided in an exemplary embodiment of the present disclosure.
Detailed Description
Hereinafter, example embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present disclosure and not all of the embodiments of the present disclosure, and that the present disclosure is not limited by the example embodiments described herein.
It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
It will be appreciated by those of skill in the art that the terms "first," "second," etc. in embodiments of the present disclosure are used merely to distinguish between different steps, devices or modules, etc., and do not represent any particular technical meaning nor necessarily logical order between them.
It should also be understood that in embodiments of the present disclosure, "plurality" may refer to two or more, and "at least one" may refer to one, two or more.
It should also be appreciated that any component, data, or structure referred to in the presently disclosed embodiments may be generally understood as one or more without explicit limitation or the contrary in the context.
In addition, the term "and/or" in this disclosure is merely an association relationship describing an association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" in the present disclosure generally indicates that the front and rear association objects are an or relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and that the same or similar features may be referred to each other, and for brevity, will not be described in detail.
Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
Embodiments of the present disclosure may be applicable to electronic devices such as terminal devices, computer systems, servers, etc., which may operate with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with the terminal device, computer system, server, or other electronic device include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network personal computers, minicomputer systems, mainframe computer systems, and distributed cloud computing technology environments that include any of the above systems, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc., that perform particular tasks or implement particular abstract data types. The computer system/server may be implemented in a distributed cloud computing environment in which tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computing system storage media including memory storage devices.
Exemplary method
Fig. 2 is a flow chart illustrating a method for determining texture mapping policy according to an exemplary embodiment of the present disclosure. The method shown in fig. 2 includes step 201, step 202, step 203, step 204, step 205, and step 206, and each step is described below.
Step 201, acquiring a three-dimensional model, N captured images of the same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images; wherein N is an integer greater than or equal to 2, and any two of the N captured images have different shooting parameters.
Here, N may be 5, 8, 10, 12 or other values, and the photographing parameters may include photographing angle, photographing distance or other parameters, which are not listed here.
In step 201, modeling may be performed by using three-dimensional software to obtain a three-dimensional model, or the depth acquisition device may be used to acquire the real size of the physical space, and the acquired real size may be used in a three-dimensional modeling manner to generate a model of the physical space. The present disclosure does not limit the manner of generating the three-dimensional model. In particular, the three-dimensional model may be the three-dimensional house model shown in fig. 1 or 3.
It should be noted that, the same scene corresponding to the three-dimensional model and the N photographed images may be understood as: in the case where the three-dimensional model is the three-dimensional house model shown in fig. 1 or 3, each of the N photographed images may be an image obtained by performing indoor live-action photographing on a real house corresponding to the three-dimensional house model.
Optionally, each of the N photographed images may be a color image, and each photographed image may be photographed by using a color camera located at a different position in the real house, so as to ensure that photographing angles and/or photographing distances of different photographed images are different; the color image may be an RGB image, R is Red and represents Red, G is Green and represents Green, and B is Blue and represents Blue.
Alternatively, depth cameras corresponding to the color cameras one by one may be configured, and calibration and alignment are performed in advance between each depth camera and the corresponding color camera. When each color camera is used for obtaining a corresponding shooting image, a corresponding depth image can be obtained by using a corresponding depth camera, so that N depth images are obtained.
Step 202, selecting a depth map for triangular patch projection on a three-dimensional model from N depth maps.
It should be noted that there may be a plurality of triangular patches on the three-dimensional model, and the triangular patches mentioned in the embodiments of the present disclosure may be any triangular patches of the plurality of triangular patches.
In step 202, the number of depth maps selected from the N depth maps may be one, two, three, four, etc., which are not listed here.
In step 203, a projected area of the triangular patch on the selected depth map is determined.
Here, the projection area of the triangular patch on the selected depth map may be a triangular area.
In step 203, the three-dimensional coordinates of the three vertices of the triangular patch may be obtained from the three-dimensional model, and a three-dimensional to two-dimensional coordinate conversion may then be performed to determine the two-dimensional coordinates (which may also be referred to as projection coordinates) of the projection points of the three vertices on the selected depth map. Once the two-dimensional coordinates of the three projection points corresponding to the three vertices are determined, the whole projection area is determined accordingly.
Optionally, to obtain the two-dimensional coordinates of the three projection points, the intrinsic matrix of the depth camera used to obtain the selected depth map may be determined, and the three-dimensional coordinates of each vertex may be multiplied by this intrinsic matrix to obtain the two-dimensional coordinates of the corresponding projection point.
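For concreteness, the following Python sketch shows this projection under simplifying assumptions: the vertices are taken to be already expressed in the depth camera's coordinate frame (the extrinsic transform applied beforehand), the intrinsic matrix is a standard 3x3 pinhole matrix, and the perspective divide is written out explicitly. All names and example numbers are illustrative, not taken from the patent; the area helper anticipates the quantity project_area used below.

```python
import numpy as np

def project_vertex(intrinsics: np.ndarray, vertex_cam: np.ndarray) -> np.ndarray:
    """Project one 3D vertex (in the depth camera's frame) onto the depth map."""
    p = intrinsics @ vertex_cam        # homogeneous image coordinates
    return p[:2] / p[2]                # perspective divide -> (u, v)

def project_triangle(intrinsics: np.ndarray, vertices_cam: np.ndarray) -> np.ndarray:
    """2D projection coordinates of the three vertices of a triangular patch."""
    return np.stack([project_vertex(intrinsics, v) for v in vertices_cam])

def projection_area(tri2d: np.ndarray) -> float:
    """Area of the projected triangle (shoelace formula); this is project_area."""
    (x1, y1), (x2, y2), (x3, y3) = tri2d
    return 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))

# Illustrative pinhole intrinsics and a triangle roughly 2 m in front of the camera.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
tri_cam = np.array([[0.10, 0.20, 2.0],
                    [0.30, 0.10, 2.2],
                    [0.20, 0.40, 2.1]])
tri2d = project_triangle(K, tri_cam)   # 3x2 array of projection points
area = projection_area(tri2d)
```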
Step 204, obtaining the effective area duty ratio and depth value error of the projection area.
It should be noted that, due to the hardware of the depth camera and the environment in which it is used, depth is often missing from the depth map obtained by the depth camera, and the depth information reflected on the depth map (i.e., the measured depth values below) may not completely match the actual situation. Therefore, in step 204, the effective area ratio and the depth value error of the projection area may be acquired. Here, the effective area ratio characterizes the proportion of the projection area in which no depth is missing, and the depth value error characterizes the difference between the depth information reflected on the depth map and the actual situation; it may specifically be the root mean square error of the depth values.
In step 205, the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map is determined according to the area of the projection area, the effective area ratio and the depth value error.
Here, the area of the projection area may be denoted project_area and may be calculated from the two-dimensional coordinates of the three projection points corresponding to the three vertices of the triangular patch. In addition, the energy equation data item may be denoted data_term, and the energy equation may be denoted energy_function.
In one embodiment,
the energy equation data item is positively correlated with the area of the projection region;
the energy equation data item is positively correlated with the effective area ratio;
the energy equation data item is inversely related to the depth value error.
Optionally, determining, according to the area of the projection area, the effective area ratio and the depth value error, the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map includes:
calculating the sum of the depth value error and a preset value; wherein the preset value is a non-zero value;
calculating the product of the area of the projection area and the effective area ratio;
calculating the ratio of the product to the sum;
and taking the calculated ratio as the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map.
Here, the preset value may be 0.0001, 0.0002, 0.0005 or other non-zero value, which is not listed here.
Assuming that the depth value error is denoted rmse, the preset value is denoted delta, the area of the projection area is denoted project_area, the effective area ratio is denoted valid_ratio, and the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map is denoted data_term, then:
data_term = project_area * valid_ratio / (delta + rmse)
In this way, the energy equation data item can be conveniently calculated with simple operations on the depth value error, the preset value, the area of the projection area and the effective area ratio, while ensuring that the data item is positively correlated with the area of the projection area and the effective area ratio and negatively correlated with the depth value error.
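As a minimal illustration, the formula above can be transcribed directly into code; the function name and the default delta (one of the example preset values mentioned earlier) are only assumptions made for this sketch.

```python
def data_term(project_area: float, valid_ratio: float, rmse: float,
              delta: float = 0.0001) -> float:
    """Energy equation data item for one (triangular patch, captured image) pair.

    Positively correlated with the projection area and the effective area ratio,
    negatively correlated with the depth value error; the non-zero preset value
    delta keeps the denominator away from zero when rmse is 0.
    """
    return project_area * valid_ratio / (delta + rmse)

# Example: a 120-pixel projection area, 75% of it with valid depth, rmse of 0.02.
term = data_term(project_area=120.0, valid_ratio=0.75, rmse=0.02)
```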
Of course, the manner of determining the energy equation data item is not limited to this; other operations may be performed on the depth value error, the preset value, the area of the projection area and the effective area ratio, as long as the energy equation data item remains positively correlated with the area of the projection area and the effective area ratio, and negatively correlated with the depth value error.
Step 206, determining a texture mapping strategy between the three-dimensional model and the N photographed images according to the energy equation data item.
It should be noted that, in the case where the number of depth maps for triangular patch projection selected in step 202 is at least two, by executing step 205, at least two energy equation data items corresponding to the at least two depth maps may be obtained. In a similar manner, for each triangular patch on the three-dimensional model, a number of energy equation data items corresponding thereto may be obtained, respectively.
Next, the graph shown in fig. 4 may be constructed, which may be denoted as G = {V, E}. V represents the vertices, which fall into two types: the triangular patch set (i.e., the set of triangular patches on the three-dimensional model) and the captured image set (i.e., the set of N captured images; each captured image may be represented by a label, so the captured image set may also be called the label set). The triangular patch set may be written as
F = {F_1, F_2, ..., F_M}
and the label set may be written as
I = {I_1, I_2, ..., I_N}
Each small rectangle in fig. 4 represents a triangular patch, and each circle in fig. 4 represents a label. E represents the edges, which fall into two classes: edges between adjacent triangular patches (denoted n-link) and edges between a triangular patch and a label (denoted t-link). Specifically, the t-link between label 1 and triangular patch 1 in fig. 4 corresponds to a target energy equation data item, namely the energy equation data item associated with triangular patch 1 and the captured image corresponding to label 1.
In addition, an energy equation corresponding to fig. 4 may be established, which may be expressed as:
E = Σ_i E_data(F_i, I_i) + Σ_(i,j) E_smooth(F_i, F_j, I_i, I_j)
where E_data(F_i, I_i) is the energy equation data item, and E_smooth(F_i, F_j, I_i, I_j) is the energy equation smoothing term, summed over pairs of adjacent triangular patches F_i and F_j.
Optionally, the Potts model (a common variational model for multi-phase image segmentation) may be chosen as the energy equation smoothing term, namely:
E_smooth(F_i, F_j, I_i, I_j) = [I_i ≠ I_j]
that is, a constant penalty is incurred whenever two adjacent triangular patches are assigned different labels.
The energy equation may then be minimized in the manner of image segmentation (for example, by a graph cut) to obtain the texture mapping strategy between the three-dimensional model and the N captured images; the texture mapping strategy indicates which of the N captured images each triangular patch on the three-dimensional model is mapped with.
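To make the structure of the equation concrete, the sketch below evaluates the energy of one candidate labeling with a Potts smoothing term. It is only an evaluation helper, not the minimization itself (which would typically use a multi-label graph-cut method), and the sign convention (using the negative of the data item as a cost so that larger data items are preferred) is an assumption, since the text does not state it explicitly.

```python
from typing import Dict, List, Tuple

def total_energy(labels: List[int],
                 data_items: List[Dict[int, float]],
                 adjacency: List[Tuple[int, int]],
                 smooth_weight: float = 1.0) -> float:
    """Energy of one labeling of the triangular patches.

    labels[i]     -- label (captured image index) assigned to patch i
    data_items[i] -- maps a label to the data item of (patch i, that image);
                     its negative is used as the cost to be minimized
    adjacency     -- pairs (i, j) of adjacent patches (the n-links in fig. 4)
    The smoothing term is the Potts model: a constant penalty whenever two
    adjacent patches receive different labels.
    """
    data_cost = sum(-data_items[i][lab] for i, lab in enumerate(labels))
    smooth_cost = sum(smooth_weight
                      for i, j in adjacency if labels[i] != labels[j])
    return data_cost + smooth_cost

# Example: three patches, two candidate labels, patches 0-1 and 1-2 adjacent.
energy = total_energy(labels=[0, 0, 1],
                      data_items=[{0: 2.5, 1: 1.0},
                                  {0: 3.0, 1: 0.5},
                                  {0: 0.2, 1: 4.0}],
                      adjacency=[(0, 1), (1, 2)])
```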
Assuming that the texture mapping strategy corresponds to fig. 5, the texture mapping strategy indicates that: the four triangular patches in P_1 are mapped with the captured image corresponding to label 1; the one triangular patch in P_2 is mapped with the captured image corresponding to label 2; the three triangular patches in P_3 are mapped with the captured image corresponding to label 3; and the triangular patches in P_k are mapped with the captured image corresponding to label k.
The mapping effect obtained when texture mapping is performed according to the texture mapping strategy of the embodiments of the present disclosure can be seen in fig. 3. It is easy to see that, in the box in fig. 3, the texture of the staircase is no longer mistakenly mapped onto the wall; that is, the embodiments of the present disclosure can improve texture mapping precision and ensure the texture mapping effect.
In the embodiments of the disclosure, to implement texture mapping, a three-dimensional model, N captured images of the same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images may be acquired; a triangular patch on the three-dimensional model is projected onto the depth map corresponding to a suitable captured image to determine the corresponding projection area, and the corresponding energy equation data item is determined according to the area of the projection area, the effective area ratio and the depth value error. The area of the projection area is related to the resolution of the image, the effective area ratio reflects the degree to which the image is affected by strong light (strong light tends to cause missing depth), and the depth value error reflects how well the triangular patch suits the captured image corresponding to the depth map it is projected onto. The energy equation data item determined from these three quantities can therefore effectively predict the mapping effect (for example, whether occlusion exists, whether the occlusion is severe, and whether the resolution is high) when the triangular patch is mapped with the captured image corresponding to the projected depth map. A suitable texture mapping strategy can then be determined from the energy equation data items, so that image areas with higher resolution are selected for mapping while strong-light areas and image areas with erroneous depth information are avoided.
Therefore, in the embodiments of the disclosure, the energy equation data item is determined by introducing the area of the projection area, the effective area ratio and the depth value error, which makes the data item more reasonable, ensures the reliability of the determined texture mapping strategy, and can effectively improve texture mapping precision and the texture mapping effect. Even if the three-dimensional model is inaccurate (for example, in the case of missing model geometry or missing depth), the embodiments of the disclosure can still achieve a good texture mapping effect.
In an optional example, acquiring the effective area ratio and the depth value error of the projection area includes:
determining the total number of pixel points in the projection area;
determining the number of pixel points in the projection area for which a measured depth value is recorded in the selected depth map;
calculating the ratio of the number to the total number;
and taking the calculated ratio as the effective area ratio of the projection area.
After the projection area is determined, the total number of pixel points in the projection area (denoted S1) may be counted, and the S1 pixel points may be traversed to determine the number of pixel points for which a measured depth value is recorded in the selected depth map (denoted S2). The effective area ratio valid_ratio may then be calculated with the formula valid_ratio = S2 / S1.
In the embodiments of the disclosure, by calculating the proportion of pixel points in the projection area that have a measured depth value, the effective area ratio can be computed conveniently and reliably.
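A sketch of this computation is shown below; it assumes that missing depth is stored as 0 in the depth map and that membership of a pixel in the projection area is tested with simple edge functions on pixel centres. Names and conventions are illustrative only.

```python
import numpy as np

def effective_area_ratio(depth_map: np.ndarray, tri2d: np.ndarray) -> float:
    """Ratio S2/S1 of pixels in the projected triangle that have a measured depth."""
    h, w = depth_map.shape
    gx, gy = np.meshgrid(np.arange(w) + 0.5, np.arange(h) + 0.5)  # pixel centres

    def edge(a, b):
        # Sign of the cross product tells which side of edge a->b a pixel lies on.
        return (b[0] - a[0]) * (gy - a[1]) - (b[1] - a[1]) * (gx - a[0])

    e0 = edge(tri2d[0], tri2d[1])
    e1 = edge(tri2d[1], tri2d[2])
    e2 = edge(tri2d[2], tri2d[0])
    inside = ((e0 >= 0) & (e1 >= 0) & (e2 >= 0)) | ((e0 <= 0) & (e1 <= 0) & (e2 <= 0))

    s1 = int(inside.sum())                      # total pixels in the projection area
    if s1 == 0:
        return 0.0
    s2 = int((inside & (depth_map > 0)).sum())  # pixels with a measured depth value
    return s2 / s1
```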
Of course, the manner of determining the effective area ratio of the projection area is not limited to this. For example, in the case where the projection area is clearly divided into two sub-areas, every pixel point in one sub-area having a measured depth value and every pixel point in the other sub-area having none, the ratio of the area of the sub-area with measured depth values to the area of the projection area may be calculated and taken as the effective area ratio.
In an optional example, acquiring the effective area ratio and the depth value error of the projection area includes:
determining the pixel points in the projection area for which a measured depth value is recorded in the selected depth map;
performing depth value interpolation on the pixel points in the projection area according to the real depth values of the vertices of the triangular patch in the three-dimensional model, to obtain an interpolation result;
obtaining the real depth values of the determined pixel points from the interpolation result;
and calculating the depth value error of the projection area according to the real depth values and the measured depth values of the determined pixel points.
Here, after the projection area is determined, the pixel points in the projection area for which a measured depth value is recorded in the selected depth map may first be identified.
After the projection area is determined, the real depth values of the three vertices of the triangular patch in the three-dimensional model may also be obtained and used as the real depth values of the corresponding three projection points. With the real depth values of the three projection points known, depth value interpolation may be performed using interpolation algorithms such as bilinear interpolation or cubic interpolation to obtain an interpolation result, which may include the real depth value of every pixel point in the projection area.
Then, the real depth values of the identified pixel points may be obtained from the interpolation result, their measured depth values may be obtained from the selected depth map, and, with both known, the depth value error of the projection area, for example the root mean square error of the depth values, may be calculated.
For example, suppose there are 100 pixel points in the projection area and only the first 80 of them have measured depth values, which are D1, D2, ..., D80 in sequence. After the depth value interpolation is performed, the real depth values of these 80 pixel points can be obtained from the interpolation result, and are D1', D2', ..., D80' in sequence. The 80 measured depth values D1, D2, ..., D80 and the 80 real depth values D1', D2', ..., D80' are in one-to-one correspondence, so the root mean square error rmse of the depth values can be calculated from them with the root-mean-square formula and taken as the depth value error of the projection area.
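The following sketch reproduces this computation, with a few simplifications that should be noted: the "real" depths are obtained by linear (barycentric) interpolation of the three vertex depths rather than the bilinear or cubic interpolation mentioned above, missing depth is assumed to be stored as 0, and the behaviour when no pixel has a measured depth is an arbitrary choice.

```python
import numpy as np

def depth_rmse(depth_map: np.ndarray, tri2d: np.ndarray,
               vertex_depths: np.ndarray) -> float:
    """Root mean square error between measured depths and interpolated real depths.

    tri2d         -- 3x2 projection coordinates of the triangle's vertices
    vertex_depths -- real depth values of the three vertices from the model
    """
    h, w = depth_map.shape
    gx, gy = np.meshgrid(np.arange(w) + 0.5, np.arange(h) + 0.5)

    (x1, y1), (x2, y2), (x3, y3) = tri2d
    denom = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    # Barycentric coordinates of every pixel centre with respect to the triangle.
    l1 = ((y2 - y3) * (gx - x3) + (x3 - x2) * (gy - y3)) / denom
    l2 = ((y3 - y1) * (gx - x3) + (x1 - x3) * (gy - y3)) / denom
    l3 = 1.0 - l1 - l2

    inside = (l1 >= 0) & (l2 >= 0) & (l3 >= 0)
    valid = inside & (depth_map > 0)             # pixels with a measured depth value
    if not valid.any():
        return float("inf")                      # arbitrary: no usable depth at all

    real = (l1 * vertex_depths[0] + l2 * vertex_depths[1]
            + l3 * vertex_depths[2])[valid]      # interpolated real depth values
    measured = depth_map[valid]                  # measured depth values D1..Dk
    return float(np.sqrt(np.mean((measured - real) ** 2)))
```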
Of course, the depth value error may also be measured with other types of error metrics, which are not listed here.
In the embodiment of the disclosure, the depth value error of the projection area can be conveniently and reliably calculated by using the measured depth value recorded in the depth map and the real depth value obtained by interpolation according to the real depth value in the three-dimensional model, so as to effectively represent the difference between the depth information reflected on the depth map and the actual situation.
In an optional example, selecting, from the N depth maps, a depth map for projection of a triangular patch on the three-dimensional model includes:
selecting, from the N captured images, a captured image for the triangular patch on the three-dimensional model; wherein the triangular patch is visible in the selected captured image;
determining, among the N depth maps, the depth map corresponding to the selected captured image;
and taking the determined depth map as the depth map selected for the triangular patch.
In the embodiments of the disclosure, the intrinsic and extrinsic parameters of the color camera used to obtain each captured image may be acquired. For any captured image, whether the rays emitted from the three vertices of the triangular patch towards the camera are blocked by an object may be judged according to the intrinsic and extrinsic parameters of the color camera that obtained the captured image; if so, the triangular patch may be judged invisible in that captured image, and otherwise it may be judged visible in that captured image. In this way, a captured image satisfying the visibility requirement can be selected for the triangular patch on the three-dimensional model from among the N captured images. Then, among the N depth maps, the depth map corresponding to the selected captured image may be determined according to the correspondence between captured images and depth maps, and the determined depth map may be used as the depth map for projecting the triangular patch.
In general, only when a triangular patch is visible in a certain captured image is it necessary to evaluate whether mapping the triangular patch with that captured image is appropriate.
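The text above describes an explicit ray-occlusion test against the model. A lighter-weight substitute that is often used in practice, shown here only as an illustrative sketch, is a depth-buffer style comparison: project each vertex into the view and treat it as occluded if it lies noticeably behind the depth recorded at that pixel. The function names, the 4x4 world-to-camera extrinsics convention, and the tolerance value are assumptions, not details taken from the patent.

```python
import numpy as np

def patch_visible(intrinsics: np.ndarray, extrinsics: np.ndarray,
                  depth_map: np.ndarray, vertices_world: np.ndarray,
                  tol: float = 0.05) -> bool:
    """Rough visibility test of one triangular patch in one view."""
    h, w = depth_map.shape
    for v in vertices_world:
        v_cam = (extrinsics @ np.append(v, 1.0))[:3]   # world -> camera frame
        if v_cam[2] <= 0:                              # behind the camera
            return False
        p = intrinsics @ v_cam
        u, vv = int(p[0] / p[2]), int(p[1] / p[2])
        if not (0 <= u < w and 0 <= vv < h):           # projects outside the image
            return False
        d = depth_map[vv, u]
        if d > 0 and v_cam[2] > d + tol:               # occluded by nearer geometry
            return False
    return True

def select_depth_map_indices(patch_vertices_world, views):
    """Indices of the views (captured image / depth map pairs) that see the patch;
    each entry of `views` is a tuple (intrinsics, extrinsics, depth_map)."""
    return [i for i, (K, T, D) in enumerate(views)
            if patch_visible(K, T, D, patch_vertices_world)]
```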
In an optional example, as shown in fig. 6, the depth maps in one-to-one correspondence with the color images may be input first. Next, the real depth values of the three vertices of the triangular patch on the three-dimensional model may be obtained, as well as the projection coordinates of the three vertices on the visible depth map (corresponding to the depth map selected for projection above).
Then, the measured depth values of the pixel points in the projection area of the triangular patch on the depth map may be obtained, and depth value interpolation may be performed using the real depth values of the three vertices to obtain the real depth values of the pixel points in the projection area, so that the root mean square error between the measured depth values and the real depth values in the projection area can be calculated and recorded as rmse. In addition, the effective depth ratio within the projection area (corresponding to the effective area ratio above) may be recorded as valid_ratio, and the area of the projection area may be calculated and recorded as project_area.
Then, the energy equation data item data_term can be calculated as data_term = project_area * valid_ratio / (delta + rmse), and the energy equation can be solved to obtain the final texture mapping strategy.
It should be noted that mapping errors are mainly caused by missing model geometry or missing depth acquisition, which makes triangular patches with erroneous depth on the three-dimensional model correspond to the captured picture. Therefore, in the embodiments of the disclosure, the design of the energy equation data item introduces a depth error term, and the depth error is negatively correlated with the data item. The size of the projection area of the triangular patch directly determines the resolution of the selected picture, so the projected area is positively correlated with the energy equation data item. In addition, because of hardware limitations and the scanning environment, depth may be missing from the depth map, so the effective depth ratio of the projection area is also positively correlated with the energy equation data item. Designing the energy equation data item in this way makes it more reasonable, so texture mapping precision and the texture mapping effect can be effectively improved.
Any of the texture mapping policy determination methods provided by the embodiments of the present disclosure may be performed by any suitable device with data processing capability, including, but not limited to, terminal devices, servers, and the like. Alternatively, any of the texture mapping policy determination methods provided by the embodiments of the present disclosure may be executed by a processor; for example, the processor may execute any of the texture mapping policy determination methods mentioned in the embodiments of the present disclosure by invoking corresponding instructions stored in a memory. This will not be repeated below.
Exemplary apparatus
Fig. 7 is a schematic structural diagram of a texture mapping policy determining apparatus according to an exemplary embodiment of the present disclosure. The apparatus shown in fig. 7 includes a first acquisition module 701, a selection module 702, a first determination module 703, a second acquisition module 704, a second determination module 705, and a third determination module 706.
A first acquisition module 701, configured to obtain a three-dimensional model, N captured images of the same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images; wherein N is an integer greater than or equal to 2, and any two of the N captured images have different shooting parameters;
a selection module 702, configured to select a depth map for triangular patch projection on the three-dimensional model from the N depth maps;
a first determining module 703, configured to determine a projection area of the triangular patch on the selected depth map;
a second acquisition module 704, configured to acquire an effective area ratio and a depth value error of the projection area;
a second determining module 705, configured to determine, according to the area of the projection area, the effective area ratio and the depth value error, an energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map;
A third determining module 706 is configured to determine a texture mapping policy between the three-dimensional model and the N captured images according to the energy equation data item.
In an optional example,
the energy equation data item is positively correlated with the area of the projection region;
the energy equation data item is positively correlated with the effective area ratio;
the energy equation data item is inversely related to the depth value error.
In an alternative example, the second determining module 705 includes:
the first calculation unit, configured to calculate the sum of the depth value error and a preset value; wherein the preset value is a non-zero value;
a second calculation unit, configured to calculate the product of the area of the projection area and the effective area ratio;
a third calculation unit, configured to calculate the ratio of the product to the sum;
and a first determining unit, configured to take the calculated ratio as the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map.
In an alternative example, the second acquisition module 704 includes:
a second determining unit, configured to determine the pixel points in the projection area for which a measured depth value is recorded in the selected depth map;
a first acquisition unit, configured to perform depth value interpolation on the pixel points in the projection area according to the real depth values of the vertices of the triangular patch in the three-dimensional model, to obtain an interpolation result;
a second acquisition unit, configured to obtain the real depth values of the determined pixel points from the interpolation result;
and a fourth calculation unit, configured to calculate the depth value error of the projection area according to the real depth values and the measured depth values of the determined pixel points.
In an alternative example, the second acquisition module 704 includes:
a third determining unit, configured to determine the total number of pixel points in the projection area;
a fourth determining unit, configured to determine the number of pixel points in the projection area for which a measured depth value is recorded in the selected depth map;
a fifth calculation unit, configured to calculate the ratio of the number to the total number;
and a fifth determining unit, configured to take the calculated ratio as the effective area ratio of the projection area.
In an alternative example, the selection module 702 includes:
a selection unit, configured to select, from the N captured images, a captured image for the triangular patch on the three-dimensional model; wherein the triangular patch is visible in the selected captured image;
a sixth determining unit configured to determine a depth map corresponding to the selected captured image from among the N depth maps;
and a seventh determining unit, configured to use the determined depth map as the depth map selected for the triangular patch.
Exemplary electronic device
Next, an electronic device according to an embodiment of the present disclosure is described with reference to fig. 8. The electronic device may be either or both of the first device and the second device, or a stand-alone device independent thereof, which may communicate with the first device and the second device to receive the acquired input signals therefrom.
Fig. 8 illustrates a block diagram of an electronic device 80 according to an embodiment of the present disclosure.
As shown in fig. 8, the electronic device 80 includes one or more processors 81 and memory 82.
Processor 81 may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities and may control other components in electronic device 80 to perform desired functions.
Memory 82 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random Access Memory (RAM) and/or cache memory (cache), and the like. The non-volatile memory may include, for example, read Only Memory (ROM), hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer readable storage medium that can be executed by the processor 81 to implement the texture mapping policy determination method and/or other desired functions of the various embodiments of the present disclosure described above. Various contents such as an input signal, a signal component, a noise component, and the like may also be stored in the computer-readable storage medium.
In one example, the electronic device 80 may further include: an input device 83 and an output device 84, which are interconnected by a bus system and/or other forms of connection mechanisms (not shown).
For example, where the electronic device 80 is a first device or a second device, the input means 83 may be a microphone or an array of microphones. When the electronic device 80 is a stand-alone device, the input means 83 may be a communication network connector for receiving the acquired input signals from the first device and the second device.
In addition, the input device 83 may also include, for example, a keyboard, a mouse, and the like.
The output device 84 can output various information to the outside. The output device 84 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, etc.
Of course, only some of the components of the electronic device 80 relevant to the present disclosure are shown in fig. 8, with components such as buses, input/output interfaces, etc. omitted for simplicity. In addition, the electronic device 80 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer readable storage Medium
In addition to the methods and apparatus described above, embodiments of the present disclosure may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps in a texture mapping policy determination method according to various embodiments of the present disclosure described in the above "exemplary methods" section of this specification.
The computer program product may write program code for performing the operations of embodiments of the present disclosure in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server.
Embodiments of the present disclosure may also be a computer-readable storage medium, having stored thereon computer program instructions, which when executed by a processor, cause the processor to perform the steps in a texture mapping policy determination method according to various embodiments of the present disclosure described in the above "exemplary methods" section of the present description.
The computer readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present disclosure have been described above in connection with specific embodiments, however, it should be noted that the advantages, benefits, effects, etc. mentioned in the present disclosure are merely examples and not limiting, and these advantages, benefits, effects, etc. are not to be considered as necessarily possessed by the various embodiments of the present disclosure. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, since the disclosure is not necessarily limited to practice with the specific details described.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another. Since the apparatus embodiments substantially correspond to the method embodiments, their description is relatively brief, and reference may be made to the description of the method embodiments for relevant details.
The block diagrams of the devices, apparatuses, and systems referred to in the present disclosure are merely illustrative examples and are not intended to require or imply that the connections, arrangements, or configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, such devices, apparatuses, and systems may be connected, arranged, or configured in any manner. Words such as "including," "comprising," and "having" are open-ended words meaning "including but not limited to" and may be used interchangeably therewith. The terms "or" and "and" as used herein refer to, and may be used interchangeably with, the term "and/or," unless the context clearly indicates otherwise.
The methods and apparatus of the present disclosure may be implemented in many ways, for example by software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order of the steps of the methods is intended to be illustrative only, and the steps of the methods of the present disclosure are not limited to the order specifically described above unless otherwise specifically stated. Furthermore, in some embodiments, the present disclosure may also be implemented as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the methods according to the present disclosure.
It should also be noted that, in the apparatuses, devices, and methods of the present disclosure, the components or steps may be decomposed and/or recombined. Such decompositions and/or recombinations should be regarded as equivalent solutions of the present disclosure.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit the embodiments of the disclosure to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.

Claims (14)

1. A method for determining a texture mapping strategy, comprising:
acquiring a three-dimensional model, N captured images of a same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images, wherein N is an integer greater than or equal to 2, and shooting parameters of any two of the N captured images are different;
selecting, from the N depth maps, a depth map onto which a triangular patch on the three-dimensional model is to be projected;
determining a projection area of the triangular patch on the selected depth map;
acquiring an effective area ratio and a depth value error of the projection area;
determining, according to an area of the projection area, the effective area ratio, and the depth value error, an energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map;
and determining a texture mapping strategy between the three-dimensional model and the N captured images according to the energy equation data item.
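For illustration only and not as claim language: the selection logic of claim 1 can be sketched in a few lines of Python. The sketch below scores every (triangular patch, captured image) pair with the data item defined in claims 2 and 3 and, ignoring any smoothness coupling between neighbouring patches that a full energy equation may add, keeps the best-scoring image per patch. The helper names project_patch, effective_area_ratio, and depth_value_error are hypothetical stand-ins for the operations detailed in claims 4 to 6.

def data_item(area, ratio, depth_error, preset=1e-3):
    # Claim 3: (area of the projection area x effective area ratio) / (depth value error + non-zero preset value)
    return (area * ratio) / (depth_error + preset)

def choose_views(patches, depth_maps, project_patch, effective_area_ratio, depth_value_error):
    # Simplified strategy: for each triangular patch keep the view whose data item is largest.
    # depth_maps[i] is assumed to correspond to the i-th captured image.
    best = {}
    for p_idx, patch in enumerate(patches):
        best_score, best_view = float("-inf"), None
        for v_idx, dmap in enumerate(depth_maps):
            region = project_patch(patch, dmap)             # pixel points of the projection area, or None
            if region is None:                              # patch not visible in the corresponding captured image
                continue
            area = len(region)                              # area of the projection area in pixels
            ratio = effective_area_ratio(region, dmap)      # claim 5 (a sketch follows claim 5)
            error = depth_value_error(patch, region, dmap)  # claim 4 (a sketch follows claim 4)
            score = data_item(area, ratio, error)
            if score > best_score:
                best_score, best_view = score, v_idx
        best[p_idx] = best_view                             # index of the captured image chosen for this patch
    return best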
2. The method of claim 1, wherein:
the energy equation data item is positively correlated with the area of the projection area;
the energy equation data item is positively correlated with the effective area ratio;
and the energy equation data item is negatively correlated with the depth value error.
3. The method of claim 2, wherein the determining, according to the area of the projection area, the effective area ratio, and the depth value error, an energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map comprises:
calculating a sum of the depth value error and a preset value, wherein the preset value is a non-zero value;
calculating a product of the area of the projection area and the effective area ratio;
calculating a ratio of the product to the sum;
and taking the calculated ratio as the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map.
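Stated as a formula (an illustrative restatement, not claim language), with $A$ denoting the area of the projection area, $r$ the effective area ratio, $e$ the depth value error, and $c$ the preset non-zero value, the data item of claim 3 is

$$E_{\mathrm{data}} = \frac{A \cdot r}{e + c},$$

which is consistent with claim 2: the item grows with $A$ and $r$, decreases as $e$ grows, and the constant $c$ keeps the denominator from vanishing when the depth value error is zero.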
4. The method of claim 1, wherein the acquiring an effective area ratio and a depth value error of the projection area comprises:
determining a pixel point in the projection area for which a measured depth value is recorded in the selected depth map;
performing depth value interpolation processing on pixel points in the projection area according to true depth values of vertices of the triangular patch in the three-dimensional model, to obtain an interpolation processing result;
obtaining, from the interpolation processing result, the true depth value of the determined pixel point;
and calculating the depth value error of the projection area according to the true depth value and the measured depth value of the determined pixel point.
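As one possible reading of claim 4 (illustrative only): the true depth of each valid pixel can be interpolated from the three vertex depths with barycentric weights and compared with the measured depth recorded in the depth map. The barycentric interpolation and the mean absolute difference used below are assumptions of this sketch, not requirements of the claim; pixels without a recorded measured depth value are assumed to be stored as 0, and degenerate triangles are not handled.

import numpy as np

def barycentric_weights(p, a, b, c):
    # 2D barycentric coordinates of point p with respect to triangle (a, b, c)
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    w1 = (d11 * d20 - d01 * d21) / denom
    w2 = (d00 * d21 - d01 * d20) / denom
    return 1.0 - w1 - w2, w1, w2

def depth_value_error(tri_px, tri_depths, pixels, depth_map):
    # tri_px: projected 2D coordinates of the three vertices; tri_depths: their true depths;
    # pixels: (u, v) pixel points of the projection area; depth_map: measured depth values (0 = not recorded).
    a, b, c = (np.asarray(p, dtype=float) for p in tri_px)
    errors = []
    for u, v in pixels:
        measured = depth_map[v, u]
        if measured == 0:                          # keep only pixels with a recorded measured depth value
            continue
        w0, w1, w2 = barycentric_weights(np.array([u, v], dtype=float), a, b, c)
        true_depth = w0 * tri_depths[0] + w1 * tri_depths[1] + w2 * tri_depths[2]
        errors.append(abs(true_depth - measured))
    return float(np.mean(errors)) if errors else 0.0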
5. The method of claim 1, wherein the acquiring an effective area ratio and a depth value error of the projection area comprises:
determining a total number of pixel points in the projection area;
determining a number of pixel points in the projection area for which measured depth values are recorded in the selected depth map;
calculating a ratio of the number of pixel points for which measured depth values are recorded to the total number;
and taking the calculated ratio as the effective area ratio of the projection area.
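A minimal sketch of claim 5, under the same assumption that pixels of the selected depth map without a recorded measured depth value hold 0 (or NaN); the zero/NaN convention is an assumption of the sketch:

import numpy as np

def effective_area_ratio(pixels, depth_map):
    # Ratio of projection-area pixels carrying a measured depth value to the total pixel count.
    total = len(pixels)
    if total == 0:
        return 0.0
    values = np.array([depth_map[v, u] for u, v in pixels], dtype=float)
    valid = int(np.count_nonzero(~np.isnan(values) & (values > 0)))
    return valid / total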
6. The method of claim 1, wherein the selecting, from the N depth maps, a depth map onto which a triangular patch on the three-dimensional model is to be projected comprises:
selecting, from the N captured images, a captured image for the triangular patch on the three-dimensional model, wherein the triangular patch is visible in the selected captured image;
determining, from the N depth maps, the depth map corresponding to the selected captured image;
and taking the determined depth map as the depth map selected for the triangular patch.
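Claim 6 then reduces to an index lookup once visibility is known. In the sketch below, is_visible is a hypothetical predicate (for example, projecting the patch vertices into the captured image and checking them against the depth map within a tolerance); the claim itself does not prescribe how visibility is established.

def select_depth_map(patch, images, depth_maps, is_visible):
    # The i-th depth map is assumed to correspond to the i-th captured image.
    for idx, image in enumerate(images):
        if is_visible(patch, image):    # the triangular patch is visible in this captured image
            return depth_maps[idx]      # depth map selected for the triangular patch
    return None                         # no captured image sees the patch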
7. A texture mapping strategy determination apparatus, comprising:
a first acquisition module configured to acquire a three-dimensional model, N captured images of a same scene corresponding to the three-dimensional model, and N depth maps corresponding to the N captured images, wherein N is an integer greater than or equal to 2, and shooting parameters of any two of the N captured images are different;
a selection module configured to select, from the N depth maps, a depth map onto which a triangular patch on the three-dimensional model is to be projected;
a first determining module configured to determine a projection area of the triangular patch on the selected depth map;
a second acquisition module configured to acquire an effective area ratio and a depth value error of the projection area;
a second determining module configured to determine, according to an area of the projection area, the effective area ratio, and the depth value error, an energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map;
and a third determining module configured to determine a texture mapping strategy between the three-dimensional model and the N captured images according to the energy equation data item.
8. The apparatus of claim 7, wherein:
the energy equation data item is positively correlated with the area of the projection area;
the energy equation data item is positively correlated with the effective area ratio;
and the energy equation data item is negatively correlated with the depth value error.
9. The apparatus of claim 8, wherein the second determining module comprises:
a first calculating unit configured to calculate a sum of the depth value error and a preset value, wherein the preset value is a non-zero value;
a second calculating unit configured to calculate a product of the area of the projection area and the effective area ratio;
a third calculating unit configured to calculate a ratio of the product to the sum;
and a first determining unit configured to take the calculated ratio as the energy equation data item associated with the triangular patch and the captured image corresponding to the selected depth map.
10. The apparatus of claim 7, wherein the second acquisition module comprises:
a second determining unit configured to determine a pixel point in the projection area for which a measured depth value is recorded in the selected depth map;
a first acquisition unit configured to perform depth value interpolation processing on pixel points in the projection area according to true depth values of vertices of the triangular patch in the three-dimensional model, to obtain an interpolation processing result;
a second acquisition unit configured to obtain, from the interpolation processing result, the true depth value of the determined pixel point;
and a fourth calculating unit configured to calculate the depth value error of the projection area according to the true depth value and the measured depth value of the determined pixel point.
11. The apparatus of claim 7, wherein the second acquisition module comprises:
a third determining unit configured to determine a total number of pixel points in the projection area;
a fourth determining unit configured to determine a number of pixel points in the projection area for which measured depth values are recorded in the selected depth map;
a fifth calculating unit configured to calculate a ratio of the number of pixel points for which measured depth values are recorded to the total number;
and a fifth determining unit configured to take the calculated ratio as the effective area ratio of the projection area.
12. The apparatus of claim 7, wherein the selection module comprises:
a selection unit configured to select, from the N captured images, a captured image for the triangular patch on the three-dimensional model, wherein the triangular patch is visible in the selected captured image;
a sixth determining unit configured to determine, from the N depth maps, the depth map corresponding to the selected captured image;
and a seventh determining unit configured to take the determined depth map as the depth map selected for the triangular patch.
13. A computer-readable storage medium storing a computer program for executing the texture mapping strategy determination method according to any one of claims 1 to 6.
14. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to read the executable instructions from the memory and execute the instructions to implement the texture mapping strategy determination method according to any one of claims 1 to 6.
CN202010379743.XA 2020-05-07 2020-05-07 Texture mapping strategy determination method, device and computer readable storage medium Active CN111563950B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010379743.XA CN111563950B (en) 2020-05-07 2020-05-07 Texture mapping strategy determination method, device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111563950A CN111563950A (en) 2020-08-21
CN111563950B true CN111563950B (en) 2023-04-21

Family

ID=72074583

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010379743.XA Active CN111563950B (en) 2020-05-07 2020-05-07 Texture mapping strategy determination method, device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111563950B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112270737A (en) * 2020-11-25 2021-01-26 浙江商汤科技开发有限公司 Texture mapping method and device, electronic equipment and storage medium
CN112884898B (en) * 2021-03-17 2022-06-07 杭州思看科技有限公司 Reference device for measuring texture mapping precision
CN113223149B (en) * 2021-05-08 2024-07-02 中煤(西安)航测遥感研究院有限公司 Three-dimensional model texture generation method, device, equipment and storage medium
CN113126944B (en) * 2021-05-17 2021-11-09 北京的卢深视科技有限公司 Depth map display method, display device, electronic device, and storage medium
CN115205435B (en) * 2022-06-14 2023-06-20 中国科学院深圳先进技术研究院 Improved texture mapping method and device based on Markov random field
CN114757834B (en) * 2022-06-16 2022-09-27 北京建筑大学 Panoramic image processing method and panoramic image processing device
CN117173321B (en) * 2023-11-02 2024-02-23 埃洛克航空科技(北京)有限公司 Method and device for selecting three-dimensional reconstruction texture view

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6891548B2 (en) * 2002-08-23 2005-05-10 Hewlett-Packard Development Company, L.P. System and method for calculating a texture-mapping gradient
JP4255449B2 (en) * 2005-03-01 2009-04-15 株式会社ソニー・コンピュータエンタテインメント Drawing processing apparatus, texture processing apparatus, and tessellation method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104574501A (en) * 2014-12-19 2015-04-29 浙江大学 High-quality texture mapping method aiming at complicated three-dimensional scene
CN104732577A (en) * 2015-03-10 2015-06-24 山东科技大学 Building texture extraction method based on UAV low-altitude aerial survey system
CN105261064A (en) * 2015-10-10 2016-01-20 浙江工业大学 Three-dimensional cultural relic reconstruction system and method based on computer stereo vision
CN106204710A (en) * 2016-07-13 2016-12-07 四川大学 The method that texture block based on two-dimensional image comentropy is mapped to three-dimensional grid model
CN108062788A (en) * 2017-12-18 2018-05-22 北京锐安科技有限公司 A kind of three-dimensional rebuilding method, device, equipment and medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jiang Hanqing et al. High-quality texture mapping for complex three-dimensional scenes. 2015, Vol. 38, No. 12, pp. 2349-2360. *

Also Published As

Publication number Publication date
CN111563950A (en) 2020-08-21

Similar Documents

Publication Publication Date Title
CN111563950B (en) Texture mapping strategy determination method, device and computer readable storage medium
US11625896B2 (en) Face modeling method and apparatus, electronic device and computer-readable medium
US10529086B2 (en) Three-dimensional (3D) reconstructions of dynamic scenes using a reconfigurable hybrid imaging system
Mastin et al. Automatic registration of LIDAR and optical images of urban scenes
US20200058153A1 (en) Methods and Devices for Acquiring 3D Face, and Computer Readable Storage Media
JP6057298B2 (en) Rapid 3D modeling
US10846844B1 (en) Collaborative disparity decomposition
CN114399597B (en) Method and device for constructing scene space model and storage medium
US20190096092A1 (en) Method and device for calibration
WO2013009662A2 (en) Calibration between depth and color sensors for depth cameras
CN112102199B (en) Depth image cavity region filling method, device and system
US20160245641A1 (en) Projection transformations for depth estimation
CN113643414A (en) Three-dimensional image generation method and device, electronic equipment and storage medium
Stommel et al. Inpainting of missing values in the Kinect sensor's depth maps based on background estimates
CN114299156A (en) Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area
JP6579659B2 (en) Light source estimation apparatus and program
CN111742352A (en) 3D object modeling method and related device and computer program product
CN113129346B (en) Depth information acquisition method and device, electronic equipment and storage medium
Ylimäki et al. Accurate 3-d reconstruction with rgb-d cameras using depth map fusion and pose refinement
KR100933304B1 (en) An object information estimator using the single camera, a method thereof, a multimedia device and a computer device including the estimator, and a computer-readable recording medium storing a program for performing the method.
CN114004874B (en) Acquisition method and device of occupied grid map
Ashdown et al. Robust calibration of camera-projector system for multi-planar displays
KR102611481B1 (en) Method and apparatus for calculating actual distance between coordinates in iamge
CN111145268A (en) Video registration method and device
US20240338879A1 (en) Methods, storage media, and systems for selecting a pair of consistent real-world camera poses

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201028

Address after: 100085 Floor 102-1, Building No. 35, Xierqi West Road, Haidian District, Beijing

Applicant after: Seashell Housing (Beijing) Technology Co.,Ltd.

Address before: Unit 5, Room 112, Floor 1, Office Building C, Nangang Industrial Zone, Binhai New Area Economic and Technological Development Zone, Tianjin 300457

Applicant before: BEIKE TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20220324

Address after: 100085 8th floor, building 1, Hongyuan Shouzhu building, Shangdi 6th Street, Haidian District, Beijing

Applicant after: You Can See (Beijing) Technology Co., Ltd.

Address before: 100085 Floor 101, 102-1, Building No. 35, Courtyard No. 2, Xierqi West Road, Haidian District, Beijing

Applicant before: Seashell Housing (Beijing) Technology Co.,Ltd.

GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20200821

Assignee: Beijing Intellectual Property Management Co.,Ltd.

Assignor: You Can See (Beijing) Technology Co., Ltd.

Contract record no.: X2023110000092

Denomination of invention: Method, device, and computer-readable storage medium for determining texture mapping strategies

Granted publication date: 20230421

License type: Common License

Record date: 20230818