CN117609646A - Scene rendering method and device, electronic equipment and storage medium - Google Patents
Scene rendering method and device, electronic equipment and storage medium
- Publication number
- CN117609646A (Application CN202311587356.5A)
- Authority
- CN
- China
- Prior art keywords
- scene
- data
- thread
- canvas
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/957—Browsing optimisation, e.g. caching or content distillation
- G06F16/9574—Browsing optimisation, e.g. caching or content distillation of access to content, e.g. by caching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The disclosure provides a scene rendering method, a scene rendering device, electronic equipment and a storage medium, and relates to the technical field of software. The method comprises the following steps: acquiring scene data to be rendered through a main thread of a browser; creating a worker thread and a canvas in the main thread, and converting the canvas into off-screen canvas data in an object format; sending the off-screen canvas data and the scene data to the worker thread through the main thread; and rendering the three-dimensional scene corresponding to the scene data according to the off-screen canvas data through the worker thread. By having the worker thread render the three-dimensional scene according to the off-screen canvas data and the scene data, the computation-heavy three-dimensional rendering task is transferred to the worker thread and kept out of the main thread, which reduces browser blocking and page stuttering and improves user experience.
Description
Technical Field
The disclosure relates to the technical field of software, and in particular relates to a scene rendering method, a scene rendering device, electronic equipment and a storage medium.
Background
With the development of software technology, the demand for three-dimensional scene display has increased dramatically, and such display cannot be separated from the rendering of the three-dimensional scene. Three-dimensional scenes are typically developed with Three.js (a JavaScript 3D library) on top of WebGL (Web Graphics Library, a 3D drawing protocol), in combination with the JavaScript language.
In the related art, a three-dimensional scene is rendered by the main thread of the browser, and after a user adjusts the viewing angle, position or scale of the three-dimensional scene, the main thread updates and re-renders it.
However, rendering the three-dimensional scene occupies the main thread and consumes a great deal of time and performance, causing the main thread to block and subsequent tasks to queue up, which in turn causes the page to stutter.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure provides a scene rendering method, a scene rendering device, electronic equipment and a storage medium, which at least overcome, to a certain extent, the problem of page stuttering caused by rendering a three-dimensional scene in a browser page in the related art.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to one aspect of the present disclosure, there is provided a scene rendering method including: acquiring scene data to be rendered through a main thread of a browser; creating a worker thread and canvas in the main thread, and converting the canvas into off-screen canvas data in an object format; sending the off-screen canvas data and the scene data to the worker thread through the main thread; and rendering the three-dimensional scene corresponding to the scene data according to the off-screen canvas data through the worker thread.
In one embodiment of the present disclosure, the rendering, by the worker thread, the three-dimensional scene corresponding to the scene data according to the off-screen canvas data includes: analyzing the off-screen canvas data through the worker thread to obtain canvas parameters; creating an off-screen canvas corresponding to the canvas parameter in the worker thread; creating an initialized three-dimensional scene in the off-screen canvas through the worker thread; rendering the scene data in the initialized three-dimensional scene through the worker thread to obtain the three-dimensional scene.
In one embodiment of the present disclosure, the converting the canvas into off-screen canvas data in an object format includes: calling the offCanvas = canvas.value.transferControlToOffscreen() method through the main thread to convert the canvas into off-screen canvas data in an object format.
In one embodiment of the present disclosure, the sending, by the main thread, the off-screen canvas data and the scene data to the worker thread includes: calling a worker.postMessage() method through the main thread to send the off-screen canvas data and the scene data to the worker thread; before the three-dimensional scene corresponding to the scene data is rendered according to the off-screen canvas data by the worker thread, the method further includes: calling a worker.onmessage() method through the worker thread to receive the off-screen canvas data and the scene data.
In one embodiment of the present disclosure, the three-dimensional scene is a scene under a first camera, the method further comprising: converting the parameters of the first camera into first camera parameters of an object format in the worker thread, and sending the first camera parameters to the main thread; constructing a second camera corresponding to the first camera parameter in the main thread;
constructing a track controller OrbitControls corresponding to the second camera in the main thread; acquiring an adjustment operation on the three-dimensional scene through the OrbitControls in the main thread to obtain a third camera after the second camera is adjusted; in the main thread, converting the parameters of the third camera into second camera parameters in an object format, and sending the second camera parameters to the worker thread; and updating the three-dimensional scene by the worker thread according to the second camera parameters.
In one embodiment of the present disclosure, further comprising: configuring a monitoring method in the main thread by using a worker.onmessage () method; and calling the monitoring method in the main thread to receive the first camera parameter.
In one embodiment of the disclosure, the updating, by the worker thread, the three-dimensional scene according to the second camera parameter includes: the second camera parameter is analyzed by calling an updateCamera () method through the worker thread, and the first camera is updated according to the analyzed data to obtain a fourth camera; and re-rendering the scene data by the worker thread according to the fourth camera to obtain an updated three-dimensional scene.
According to another aspect of the present disclosure, there is provided a scene rendering device including: an acquisition module, configured to acquire scene data to be rendered through a main thread of a browser; a creating and converting module, configured to create a worker thread and a canvas in the main thread, and convert the canvas into off-screen canvas data in an object format; a sending module, belonging to a worker thread management module, configured to send the off-screen canvas data and the scene data to the worker thread through the main thread; and a rendering module, configured to render the three-dimensional scene corresponding to the scene data according to the off-screen canvas data through the worker thread.
In one embodiment of the disclosure, the rendering module is configured to parse the off-screen canvas data through the worker thread to obtain canvas parameters; creating an off-screen canvas corresponding to the canvas parameter in the worker thread; creating an initialized three-dimensional scene in the off-screen canvas through the worker thread; rendering the scene data in the initialized three-dimensional scene through the worker thread to obtain the three-dimensional scene.
In one embodiment of the disclosure, the creating and converting module is configured to convert the canvas into off-screen canvas data in an object format by calling the offCanvas = canvas.value.transferControlToOffscreen() method through the main thread.
In one embodiment of the present disclosure, the sending module is configured to call a worker.postMessage() method through the main thread to send the off-screen canvas data and the scene data to the worker thread; the apparatus further comprises: a receiving module, configured to receive the off-screen canvas data and the scene data by calling a worker.onmessage() method through the worker thread.
In one embodiment of the present disclosure, the three-dimensional scene is a scene under a first camera, and the apparatus further comprises: a conversion and sending module, configured to convert, in the worker thread, the parameters of the first camera into first camera parameters in an object format and send the first camera parameters to the main thread; a building module, configured to build a second camera corresponding to the first camera parameters in the main thread; a construction module, configured to construct a track controller OrbitControls corresponding to the second camera in the main thread; a recording and adjusting module, configured to acquire the adjustment operation on the three-dimensional scene through the OrbitControls in the main thread to obtain a third camera after the second camera is adjusted; the conversion and sending module is further configured to convert, in the main thread, parameters of the third camera into second camera parameters in an object format, and send the second camera parameters to the worker thread; the rendering module is further configured to update the three-dimensional scene according to the second camera parameters through the worker thread.
In one embodiment of the present disclosure, the apparatus further comprises: a configuration module, configured to configure a listening method in the main thread using a worker.onmessage () method; and calling the monitoring method in the main thread to receive the first camera parameter.
In one embodiment of the disclosure, the rendering module is configured to parse the second camera parameter by calling an updateCamera () method through the worker thread, and update the first camera according to data obtained by parsing to obtain a fourth camera; and re-rendering the scene data by the worker thread according to the fourth camera to obtain an updated three-dimensional scene.
According to still another aspect of the present disclosure, there is provided an electronic apparatus including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform any of the above described scene rendering methods via execution of the executable instructions.
According to yet another aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements any of the above-described scene rendering methods.
According to yet another aspect of the present disclosure, there is provided a computer program product comprising a computer program or computer instructions loaded and executed by a processor to cause a computer to implement any of the above described scene rendering methods.
The technical scheme provided by the embodiment of the disclosure at least comprises the following beneficial effects:
according to the technical scheme provided by the embodiment of the disclosure, the worker thread and the canvas are created in the main thread, the off-screen canvas data and the scene data of the canvas are sent to the worker thread, and the three-dimensional rendering task with large calculation amount is transferred to the worker thread in a mode that the worker thread renders the three-dimensional scene according to the off-screen canvas data and the scene data, so that the three-dimensional rendering task in the main thread is avoided, browser blocking and page stuttering are reduced, and user experience is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1 illustrates a flow diagram of a scene rendering method in one embodiment of the disclosure;
FIG. 2 illustrates a flow chart of a scene rendering method in another embodiment of the present disclosure;
FIG. 3 illustrates a scene rendering signaling diagram in one embodiment of the present disclosure;
FIG. 4 illustrates a schematic diagram of a scene rendering device in one embodiment of the disclosure;
FIG. 5 shows a schematic diagram of a scene rendering device in another embodiment of the disclosure;
fig. 6 shows a block diagram of an electronic device in one embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a", "an" and "a plurality" in this disclosure are intended to be illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
For ease of understanding, the following first explains the several terms involved in this disclosure as follows:
JavaScript: abbreviated as "JS", a lightweight, interpreted or just-in-time compiled programming language with first-class functions. One characteristic of the JavaScript language is that it is single-threaded, i.e. it can only handle one task at a time. As a browser scripting language, JavaScript is mainly used for interaction with the user, which also determines that it can only be single-threaded; otherwise it would bring complex synchronization problems.
Three.js: a WebGL engine based on JavaScript that can directly use the GPU (Graphics Processing Unit) to drive games and graphics applications in a browser. The properties and APIs (Application Programming Interface) provided by the library can render three-dimensional scenes in a browser.
Canvas (Canvas): the canvas element is a tag newly added in HTML5 (HyperText Markup Language 5) for generating images in real time on a web page; its content is a bitmap that can be manipulated in JavaScript through the Canvas API.
Worker threads: the browser execution environment is single-threaded, and once a time-consuming operation occurs on the main thread, the browser is blocked and situations such as user clicks getting no response occur. To take advantage of the computing power of multi-core CPUs (Central Processing Unit), HTML5 proposed the Web Worker standard, allowing JS scripts to create multiple threads; however, the child threads are fully controlled by the main thread and must not operate on the DOM (Document Object Model). The new standard does not change the single-threaded nature of JavaScript.
The present disclosure provides a scene rendering method, apparatus, electronic device, and storage medium, where the method may be applied to an electronic device with a browser, including but not limited to a smart phone, tablet computer, laptop, desktop computer, wearable device, augmented reality device, virtual reality device, etc.
The electronic device may also be a server that provides various services. In some embodiments, the server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs (Content Delivery Network), big data, and artificial intelligence platforms.
In a Three.js-based three-dimensional scene displayed by the browser, a newly created worker thread is used to load and run the scene, and an off-screen canvas created from the main thread is adopted so that the canvas can be used in the worker thread. Meanwhile, to ensure that user operations on the three-dimensional scene, such as zooming, rotating and moving the viewing angle, are displayed normally, the scene camera parameters are synchronized so that user operations are rendered synchronously in the off-screen canvas, thereby achieving smooth rendering of and interaction with large-scale three-dimensional scenes based on the Three.js technology. The method reduces the pressure on the browser thread, reduces blocking and stuttering, and keeps the code simple and easy to manage. The scene rendering method of the present disclosure is framework-independent and can be used in combination with mainstream frameworks such as React (a JavaScript framework for building user interfaces), Vue (a JavaScript framework for building user interfaces) and Angular (a Web front-end framework) to improve performance. Because an independent worker thread is used and rendering is done on the off-screen canvas, the main thread does not need to handle loading, operation or update rendering, which ensures page efficiency and smoothness and reduces page blocking.
The present exemplary embodiment will be described in detail below with reference to the accompanying drawings and examples.
Embodiments of the present disclosure provide a scene rendering method that may be performed by any electronic device with computing processing capabilities.
Fig. 1 illustrates a flowchart of a scene rendering method in one embodiment of the present disclosure, and as illustrated in fig. 1, the scene rendering method provided in the embodiment of the present disclosure includes the following S101 to S104.
S101, acquiring scene data to be rendered through a main thread of a browser.
Embodiments of the present disclosure do not limit what the scene data specifically is. For example, the scene data may be data corresponding to a three-dimensional picture in a 3D (Three-dimensional) game, or data corresponding to any other three-dimensional scene. For example, the scene data includes data describing the scene objects and data describing information such as the position, shape and size of each scene object.
After the electronic device acquires the scene data, the main thread can directly call the scene data through the storage address. Embodiments of the present disclosure are not limited with respect to how the electronic device obtains the scene data. For example, the scene data is directly created in the electronic device or acquired through a network.
S102, creating a worker thread and a canvas in the main thread, and converting the canvas into off-screen canvas data in an object format.
In one embodiment, creating a worker thread in the main thread may include: creating a worker thread by calling the new Worker() method in the main thread.
In one embodiment, converting the canvas into off-screen canvas data in an object format includes: converting the canvas into off-screen canvas data in an object format by calling the offCanvas = canvas.value.transferControlToOffscreen() method in the main thread.
The off-screen canvas data records parameter information of the canvas, such as its position, length and width.
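For illustration, the following is a minimal sketch of this main-thread setup. The worker file name render.worker.js, the canvas dimensions, and the use of a plain DOM canvas element (rather than the canvas.value ref written elsewhere in this disclosure) are assumptions, not part of the disclosed method.

```javascript
// Main thread: create the worker thread, the canvas, and the transferable off-screen canvas.
// A module worker is assumed so that the worker can import Three.js with ES module syntax.
const worker = new Worker('render.worker.js', { type: 'module' });

const canvas = document.createElement('canvas');   // visible canvas element on the page
canvas.width = 800;
canvas.height = 600;
document.body.appendChild(canvas);

// Transfer drawing control of the canvas; the returned OffscreenCanvas object carries the
// canvas parameters (width, height, etc.) and can be posted to the worker thread.
const offCanvas = canvas.transferControlToOffscreen();
```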
S103, the off-screen canvas data and the scene data are sent to the worker thread through the main thread.
In one embodiment, sending the off-screen canvas data and the scene data to the worker thread by the main thread may include: calling the worker.postMessage() method in the main thread to send the off-screen canvas data and the scene data to the worker thread.
Before the worker thread performs three-dimensional scene rendering by using canvas data and scene data, the worker thread can call a worker.onmessage () method to receive off-screen canvas data and scene data.
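A possible sketch of this message exchange is shown below, assuming the worker and offCanvas variables from the previous sketch and a structured-clonable sceneData object; the message shape ({ type, canvas, sceneData }) is an assumption.

```javascript
// Main thread: send the off-screen canvas data and the scene data to the worker thread.
// Listing the OffscreenCanvas as a transferable object moves its ownership to the worker.
worker.postMessage({ type: 'init', canvas: offCanvas, sceneData }, [offCanvas]);

// Worker thread (render.worker.js): receive the off-screen canvas data and the scene data.
// Inside the worker, the handler is registered on the worker global scope (self.onmessage).
self.onmessage = (event) => {
  const { type, canvas, sceneData } = event.data;
  if (type === 'init') {
    init(canvas, sceneData);   // worker-side initialization, sketched further below
  }
};
```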
S104, rendering the three-dimensional scene corresponding to the scene data according to the off-screen canvas data through the worker thread.
In one embodiment, rendering, by the worker thread, the three-dimensional scene corresponding to the scene data according to the off-screen canvas data may include: parsing the off-screen canvas data through the worker thread to obtain canvas parameters; creating an off-screen canvas corresponding to the canvas parameters in the worker thread; creating an initialized three-dimensional scene in the off-screen canvas through the worker thread; and rendering the scene data in the initialized three-dimensional scene through the worker thread to obtain the three-dimensional scene.
After parsing the off-screen canvas data, the worker thread obtains the specific parameters of the canvas, including its position, width, length, and so on. An off-screen canvas can then be created from the canvas parameters, and the three-dimensional scene is rendered in that off-screen canvas.
In one embodiment, creating an initialized three-dimensional scene in the off-screen canvas by the worker thread may include: creating the initialized three-dimensional scene using the init() method in the worker thread.
In one embodiment, creating an initialized three-dimensional scene may include: creating an initialized scene, light and camera.
In one embodiment, creating an initialized camera may include: setting an orthographic camera or a perspective camera; and setting various parameters of the camera, including position, length, width, upper boundary, lower boundary, far plane, near plane, field of view, orientation, and so on.
In one embodiment, creating an initialized light may include: setting the position and color of the light source in the scene, whether it casts shadows, and the shadow resolution.
In one embodiment, rendering the scene data in the initialized three-dimensional scene by the worker thread to obtain the three-dimensional scene may include: calling the render() function through the worker thread to render the scene data in the initialized three-dimensional scene and obtain the three-dimensional scene.
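The following is a minimal sketch, under the above assumptions, of the worker-side init() step with Three.js: reading the received OffscreenCanvas, creating the initialized scene, light and camera, and rendering the scene data. The camera, light and geometry parameters and the structure of sceneData are illustrative assumptions.

```javascript
// Worker thread: initialize scene, light and camera, and render into the off-screen canvas.
import * as THREE from 'three';

let scene, camera, renderer;

function init(offscreenCanvas, sceneData) {
  scene = new THREE.Scene();

  // First camera: a perspective camera here; an orthographic camera could be configured instead.
  camera = new THREE.PerspectiveCamera(45, offscreenCanvas.width / offscreenCanvas.height, 0.1, 1000);
  camera.position.set(0, 10, 20);

  // Light source: position, color and shadow settings.
  const light = new THREE.DirectionalLight(0xffffff, 1);
  light.position.set(10, 20, 10);
  light.castShadow = true;
  scene.add(light);

  // Three.js can render directly into an OffscreenCanvas.
  renderer = new THREE.WebGLRenderer({ canvas: offscreenCanvas, antialias: true });
  renderer.setSize(offscreenCanvas.width, offscreenCanvas.height, false);

  // Build scene objects from the scene data (the structure of sceneData is an assumption).
  for (const item of sceneData.objects ?? []) {
    const mesh = new THREE.Mesh(
      new THREE.BoxGeometry(item.size, item.size, item.size),
      new THREE.MeshStandardMaterial({ color: item.color })
    );
    mesh.position.set(...item.position);
    scene.add(mesh);
  }

  renderer.render(scene, camera);   // render the three-dimensional scene
}
```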
By creating the worker thread and the canvas in the main thread and sending the off-screen canvas data and the scene data of the canvas to the worker thread, the computation-heavy three-dimensional rendering task is transferred to the worker thread, which renders the three-dimensional scene according to the off-screen canvas data and the scene data. This keeps the three-dimensional rendering task out of the main thread, reduces browser blocking and page stuttering, and improves user experience.
In one embodiment of the present disclosure, after the rendering of the scene is completed, the rendering of the scene may also be updated by another scene rendering method as shown in fig. 2 after performing operations such as scaling, rotation, moving angles, etc. on the three-dimensional scene. As shown in fig. 2, the method includes the following S201 to S206.
S201, in the worker thread, converting the parameters of the first camera into the first camera parameters of the object format, and sending the first camera parameters to the main thread.
In one embodiment, sending the first camera parameters to the main thread may include: calling the worker.postMessage() method through the worker thread to send the first camera parameters to the main thread.
It should be noted that the worker.postMessage() method may also be used to trigger the main thread to call the createCamera() method, which then creates the second camera corresponding to the first camera parameters.
The three-dimensional scene is a scene rendered under the first camera. The first camera is an initialized camera created in the initialized three-dimensional scene.
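A possible sketch of the worker side of S201 is given below: converting the first camera's parameters into a plain object and posting them to the main thread. The message shape matches the main-thread sketch shown later and is an assumption; the call would typically be made at the end of init(), once the first camera exists.

```javascript
// Worker thread: post the first camera parameters to the main thread in an object format.
self.postMessage({
  type: 'camera',
  cameraParams: {
    fov: camera.fov,
    aspect: camera.aspect,
    near: camera.near,
    far: camera.far,
    position: camera.position.toArray(),
    quaternion: camera.quaternion.toArray(),
    zoom: camera.zoom,
  },
});
```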
S202, constructing a second camera corresponding to the first camera parameter in the main thread.
In one embodiment, before the second camera corresponding to the first camera parameter is built in the main thread, the method further includes: configuring a monitoring method in a main thread by using a worker.onmessage () method; and calling a monitoring method in the main thread to receive the first camera parameter.
In one embodiment, constructing a second camera corresponding to the first camera parameter in the main thread may include: and creating a second camera corresponding to the first camera parameter by calling a createCamera () method in the main thread.
S203, constructing an OrbitControls track controller corresponding to the second camera in the main thread.
Among them, orbitControls (track controllers) are used to provide interface operations such as zooming, rotating, moving viewing angles, etc. for adjusting three-dimensional scenes, thereby recording the adjustment operations.
In one embodiment, constructing the OrbitControls corresponding to the second camera in the main thread may include: calling the new OrbitControls() method in the main thread to construct the OrbitControls corresponding to the second camera.
S204, acquiring an adjustment operation on the three-dimensional scene through the OrbitControls in the main thread to obtain a third camera after the second camera is adjusted.
After the OrbitControls provides the interface operations, the main thread records the adjustment operation on the three-dimensional scene, and the second camera in the main thread can be updated and adjusted according to that record, thereby obtaining the adjusted third camera.
In one embodiment, the adjustment operation may be an operator performing some zoom, rotate, move view, etc. of the three-dimensional scene. For example, the operator may input the adjustment operation through an adjustment operation input port provided in the three-dimensional scene, and the input port may input the adjustment operation through a mouse, a keyboard, a sliding screen, or the like.
S205, in the main thread, converting the parameters of the third camera into second camera parameters of an object format, and sending the second camera parameters to the worker thread.
In one embodiment, sending the second camera parameters to the worker thread may include: calling the worker.postMessage() method in the main thread to send the second camera parameters to the worker thread.
The specific content included in the second camera parameter may be referred to the description of creating the initialized camera in the embodiment corresponding to fig. 1, which is not described herein.
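A possible sketch of the main-thread side of S202-S205 is shown below, assuming the worker and canvas variables from the earlier sketches and the message shapes introduced above. The helper name createCamera comes from the disclosure; its body and the forwarded parameter set are assumptions.

```javascript
// Main thread: receive the first camera parameters, build the second (proxy) camera and its
// OrbitControls, and forward adjusted (third) camera parameters back to the worker thread.
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';

let proxyCamera, controls;

worker.onmessage = (event) => {
  const { type, cameraParams } = event.data;
  if (type === 'camera') {
    proxyCamera = createCamera(cameraParams);            // second camera built from the first camera parameters
    controls = new OrbitControls(proxyCamera, canvas);   // track controller bound to the visible canvas

    // Each zoom / rotate / pan yields an adjusted camera; send its parameters to the worker.
    controls.addEventListener('change', () => {
      worker.postMessage({
        type: 'updateCamera',
        cameraParams: {
          position: proxyCamera.position.toArray(),
          quaternion: proxyCamera.quaternion.toArray(),
          zoom: proxyCamera.zoom,
        },
      });
    });
  }
};

// Illustrative createCamera(): rebuild a perspective camera from the posted parameters.
function createCamera(p) {
  const cam = new THREE.PerspectiveCamera(p.fov, p.aspect, p.near, p.far);
  cam.position.fromArray(p.position);
  cam.quaternion.fromArray(p.quaternion);
  cam.zoom = p.zoom;
  cam.updateProjectionMatrix();
  return cam;
}
```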
S206, updating the three-dimensional scene according to the second camera parameters through the worker thread.
In one embodiment, updating, by the worker thread, the three-dimensional scene according to the second camera parameter may include: calling an updateCamera () method through a worker thread to analyze the second camera parameters, and updating the first camera according to the data obtained by analysis to obtain a fourth camera; and re-rendering the scene data through the worker thread according to the fourth camera to obtain an updated three-dimensional scene.
The first camera in the worker thread is updated to obtain the fourth camera, and the scene data is then re-rendered according to the fourth camera and the rest of the initialized three-dimensional scene to obtain a re-rendered three-dimensional scene.
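A minimal worker-side sketch of updateCamera() under the same assumptions follows; the 'updateCamera' branch would be added to the worker's existing onmessage handler.

```javascript
// Worker thread: parse the second camera parameters, update the camera (yielding the fourth
// camera), and re-render the scene into the off-screen canvas.
function updateCamera(cameraParams) {
  camera.position.fromArray(cameraParams.position);
  camera.quaternion.fromArray(cameraParams.quaternion);
  camera.zoom = cameraParams.zoom;
  camera.updateProjectionMatrix();   // keep the projection matrix consistent with the new parameters

  renderer.render(scene, camera);    // re-render to obtain the updated three-dimensional scene
}
// Called from the worker's onmessage handler when a message of type 'updateCamera' arrives.
```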
According to this method, after the worker thread renders the three-dimensional scene, the main thread receives and records the adjustment operation on the three-dimensional scene and passes it to the worker thread through the updated camera parameters (the second camera parameters); the worker thread adjusts the first camera according to the second camera parameters to obtain the fourth camera, and then re-renders the scene data based on that camera, thereby obtaining the three-dimensional scene processed by the adjustment operation. In this way, re-rendering the three-dimensional scene in the main thread is avoided, and the stuttering that would result from the adjustment operation having to occupy the main thread for rendering is avoided, which improves page smoothness, reduces page blocking, and improves user experience.
To facilitate understanding of the scene rendering method provided by the embodiments of the present disclosure, the following will describe with reference to fig. 3. As shown in fig. 3, the process of scene rendering may include S301 to S308. The specific implementation of S301 to S308 may be referred to in the embodiments corresponding to fig. 1 and fig. 2.
S301, creating a worker thread, creating canvas and setting a monitoring method in the main thread.
S302, the main thread sends off-screen canvas data to the worker thread.
S303, the worker thread creates an initialization scene, light and a camera according to the received off-screen canvas data, and creates and renders the off-screen canvas.
S304, the worker thread sends camera parameters to the main thread.
S305, the main thread sets the camera in the main thread according to the received camera parameters, and creates a track controller.
S306, the main thread monitors adjustment operation on the three-dimensional scene.
S307, the main thread sends the adjusted camera parameters to the worker thread.
And S308, the worker thread re-renders the three-dimensional scene according to the received camera parameters.
By creating the worker thread and the canvas in the main thread and sending the off-screen canvas data and the scene data of the canvas to the worker thread, the computation-heavy three-dimensional rendering task is transferred to the worker thread, which renders the three-dimensional scene according to the off-screen canvas data and the scene data. This keeps the three-dimensional rendering task out of the main thread and reduces browser blocking and page stuttering.
The main thread receives and records the adjustment operation on the three-dimensional scene and passes it to the worker thread through the updated camera parameters (the second camera parameters); the worker thread adjusts the first camera according to the second camera parameters to obtain the fourth camera, and then re-renders the scene data based on that camera, thereby obtaining the three-dimensional scene processed by the adjustment operation. In this way, re-rendering the three-dimensional scene in the main thread is avoided, and the stuttering that would result from the adjustment operation having to occupy the main thread for rendering is avoided, which improves page smoothness, reduces page blocking, and improves user experience.
Based on the same inventive concept, a scene rendering device is also provided in the embodiments of the present disclosure, as described in the following embodiments. Since the principle of solving the problem of the embodiment of the device is similar to that of the embodiment of the method, the implementation of the embodiment of the device can be referred to the implementation of the embodiment of the method, and the repetition is omitted.
Fig. 4 shows a schematic diagram of a scene rendering device in one embodiment of the disclosure, as shown in fig. 4, the device comprising: an obtaining module 401, configured to obtain scene data to be rendered through a main thread of a browser; the creating and converting module 402 is configured to create a worker thread and canvas in the main thread, and convert the canvas into off-screen canvas data in an object format; the sending module 403 is configured to send, by using a main thread, off-screen canvas data and scene data to a worker thread; and the rendering module 404 is configured to render, by using a worker thread, a three-dimensional scene corresponding to the scene data according to the off-screen canvas data.
In one embodiment of the present disclosure, the rendering module 404 is configured to parse the off-screen canvas data through the worker thread to obtain canvas parameters; create an off-screen canvas corresponding to the canvas parameters in the worker thread; create an initialized three-dimensional scene in the off-screen canvas through the worker thread; and render the scene data in the initialized three-dimensional scene through the worker thread to obtain the three-dimensional scene.
In one embodiment of the present disclosure, the creating and converting module 402 is configured to convert the canvas into off-screen canvas data in an object format by calling the offCanvas = canvas.value.transferControlToOffscreen() method through the main thread.
In one embodiment of the present disclosure, the sending module 403 is configured to send the off-screen canvas data and the scene data to the worker thread by calling a worker.postMessage() method through the main thread. The apparatus further comprises: a receiving module, configured to receive the off-screen canvas data and the scene data by calling a worker.onmessage() method through the worker thread.
In one embodiment of the present disclosure, the three-dimensional scene is a scene under a first camera, and the apparatus further comprises: a conversion and sending module 405, configured to convert, in the worker thread, parameters of the first camera into first camera parameters in an object format, and send the first camera parameters to the main thread; a building module 406, configured to build a second camera corresponding to the first camera parameters in the main thread; a construction module 407, configured to construct an OrbitControls corresponding to the second camera in the main thread; a recording and adjusting module 408, configured to acquire an adjustment operation on the three-dimensional scene through the OrbitControls in the main thread and obtain a third camera after the second camera is adjusted; the conversion and sending module 405 is further configured to convert, in the main thread, parameters of the third camera into second camera parameters in the object format, and send the second camera parameters to the worker thread; the rendering module 404 is further configured to update the three-dimensional scene according to the second camera parameters through the worker thread.
In one embodiment of the present disclosure, the apparatus further comprises: a configuration module 409, configured to configure a listening method in a main thread using a worker.onmessage () method; and calling a monitoring method in the main thread to receive the first camera parameter.
In one embodiment of the present disclosure, the rendering module 404 is configured to parse the second camera parameter by calling an updateCamera () method through a worker thread, and update the first camera according to the parsed data to obtain a fourth camera; and re-rendering the scene data through the worker thread according to the fourth camera to obtain an updated three-dimensional scene.
Based on the same inventive concept, another scene rendering device is also provided in the embodiments of the present disclosure, as described in the following embodiments. Since the principle of solving the problem of the embodiment of the device is similar to that of the embodiment of the method, the implementation of the embodiment of the device can be referred to the implementation of the embodiment of the method, and the repetition is omitted.
Fig. 5 shows a schematic view of a scene rendering device in another embodiment of the disclosure, as shown in fig. 5, the device comprising: a worker thread management module 501, an off-screen canvas initialization module 502, an interaction monitor module 503, and an off-screen canvas update rendering module 504.
The worker thread management module 501 is configured to be responsible for maintaining worker threads, and includes: newly creating a worker thread; a monitoring method is arranged in the main thread and used for monitoring data returned by the worker thread; and sending data to the worker thread, wherein the data comprises off-screen canvas data and camera parameters of the main thread after zooming and rotating the scene.
An off-screen canvas initialization module 502, configured to create an off-screen canvas in the worker thread during initialization; the worker thread receives and parses the off-screen canvas data and the scene data transmitted by the main thread, and reads parameters such as the length and width of the canvas from the data; it then initializes and creates the scene, light and camera, and starts canvas rendering.
And the interaction monitoring module 503 is configured to monitor the operation of the user on the three-dimensional scene and send the operation to the worker thread.
The off-screen canvas update rendering module 504 is used in the worker thread to adjust the position, orientation, scaling and other parameters of the worker thread's camera according to the new camera parameters received from the main thread; to set the camera in the worker thread according to the new camera parameters; to update the projection matrix of the camera; and to re-render the off-screen canvas at the same time, so that the adjustment operation is rendered with synchronous updating.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.) or an embodiment combining hardware and software aspects may be referred to herein as a "circuit," module "or" system.
An electronic device 600 according to such an embodiment of the present disclosure is described below with reference to fig. 6. The electronic device 600 shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way.
As shown in fig. 6, the electronic device 600 is in the form of a general purpose computing device. Components of electronic device 600 may include, but are not limited to: the at least one processing unit 610, the at least one memory unit 620, and a bus 630 that connects the various system components, including the memory unit 620 and the processing unit 610.
Wherein the storage unit stores program code that is executable by the processing unit 610 such that the processing unit 610 performs steps according to various exemplary embodiments of the present disclosure described in the section "detailed description" above of the present specification.
The storage unit 620 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 6201 and/or cache memory unit 6202, and may further include Read Only Memory (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 630 may be a local bus representing one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 640 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 600, and/or any device (e.g., router, modem, etc.) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 650. Also, electronic device 600 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 660. As shown in fig. 6, network adapter 660 communicates with other modules of electronic device 600 over bus 630. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 600, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium, which may be a readable signal medium or a readable storage medium, is also provided. On which a program product is stored which enables the implementation of the method described above of the present disclosure. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the section "detailed description" above of the disclosure, when the program product is run on the terminal device.
More specific examples of the computer readable storage medium in the present disclosure may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In this disclosure, a computer readable storage medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Alternatively, the program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
In particular implementations, the program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
In an exemplary embodiment of the present disclosure, there is also provided a computer program product including a computer program or computer instructions loaded and executed by a processor to cause the computer to carry out the steps according to the various exemplary embodiments of the present disclosure described in the section "detailed description" above.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Furthermore, although the steps of the methods in the present disclosure are depicted in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order or that all illustrated steps be performed in order to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
From the description of the above embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a mobile terminal, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope of the disclosure being indicated by the following claims.
Claims (10)
1. A method of scene rendering, comprising:
acquiring scene data to be rendered through a main thread of a browser;
creating a worker thread and canvas in the main thread, and converting the canvas into off-screen canvas data in an object format;
sending the off-screen canvas data and the scene data to the worker thread through the main thread;
and rendering the three-dimensional scene corresponding to the scene data according to the off-screen canvas data through the worker thread.
2. The method according to claim 1, wherein the rendering, by the worker thread, the three-dimensional scene corresponding to the scene data according to the off-screen canvas data comprises:
Analyzing the off-screen canvas data through the worker thread to obtain canvas parameters;
creating an off-screen canvas corresponding to the canvas parameter in the worker thread;
creating an initialized three-dimensional scene in the off-screen canvas through the worker thread;
rendering the scene data in the initialized three-dimensional scene through the worker thread to obtain the three-dimensional scene.
3. The method of claim 1, wherein the converting the canvas to off-screen canvas data in object format comprises:
calling the offCanvas = canvas.value.transferControlToOffscreen() method through the main thread to convert the canvas into off-screen canvas data in an object format.
4. The method of claim 1, wherein said sending the off-screen canvas data and the scene data to the worker thread by the main thread comprises:
calling a worker.postMessage() method through the main thread to send the off-screen canvas data and the scene data to the worker thread;
before the three-dimensional scene corresponding to the scene data is rendered according to the off-screen canvas data by the worker thread, the method further comprises the following steps:
And calling a worker.onmessage () method through the worker thread to receive off-screen canvas data and the scene data.
5. The method of claim 1, wherein the three-dimensional scene is a scene under a first camera, the method further comprising:
converting the parameters of the first camera into first camera parameters of an object format in the worker thread, and sending the first camera parameters to the main thread;
constructing a second camera corresponding to the first camera parameter in the main thread;
constructing a track controller OrbitControls corresponding to the second camera in the main thread;
acquiring an adjustment operation on the three-dimensional scene through the OrbitControls in the main thread to obtain a third camera after the second camera is adjusted;
in the main thread, converting the parameters of the third camera into second camera parameters of an object format, and sending the second camera parameters to the worker thread;
and updating the three-dimensional scene by the worker thread according to the second camera parameter.
6. The method as recited in claim 5, further comprising:
configuring a monitoring method in the main thread by using a worker.onmessage () method;
And calling the monitoring method in the main thread to receive the first camera parameter.
7. The method of claim 5, wherein the updating, by the worker thread, the three-dimensional scene according to the second camera parameter comprises:
the second camera parameter is analyzed by calling an updateCamera () method through the worker thread, and the first camera is updated according to the analyzed data to obtain a fourth camera;
and re-rendering the scene data by the worker thread according to the fourth camera to obtain an updated three-dimensional scene.
8. A scene rendering device, comprising:
the acquisition module is used for acquiring scene data to be rendered through a main thread of the browser;
the creating and converting module is used for creating a worker thread and canvas in the main thread and converting the canvas into off-screen canvas data in an object format;
a sending module, belonging to a worker thread management module, further used for sending the off-screen canvas data and the scene data to the worker thread through the main thread;
and the rendering module is used for rendering the three-dimensional scene corresponding to the scene data according to the off-screen canvas data through the worker thread.
9. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the scene rendering method of any of claims 1-7 via execution of the executable instructions.
10. A computer readable storage medium having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements the scene rendering method according to any of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311587356.5A CN117609646A (en) | 2023-11-24 | 2023-11-24 | Scene rendering method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311587356.5A CN117609646A (en) | 2023-11-24 | 2023-11-24 | Scene rendering method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117609646A true CN117609646A (en) | 2024-02-27 |
Family
ID=89954365
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311587356.5A Pending CN117609646A (en) | 2023-11-24 | 2023-11-24 | Scene rendering method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117609646A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118247403A (en) * | 2024-03-31 | 2024-06-25 | 司空定制家居科技有限公司 | Rendering method and device applied to browser, electronic equipment and storage medium |
CN118317143A (en) * | 2024-06-05 | 2024-07-09 | 北京蓝色星河软件技术发展有限公司 | Video fusion method and device, electronic equipment and storage medium |
- 2023-11-24: Application CN202311587356.5A filed in China; published as CN117609646A (en); status: active, Pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |