CN117315494A - Collaborative concurrency labeling method and system based on regional association - Google Patents

Collaborative concurrency labeling method and system based on regional association

Info

Publication number
CN117315494A
CN117315494A (application CN202311609019.1A)
Authority
CN
China
Prior art keywords
labeling
remote sensing
sensing image
concurrency
collaborative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311609019.1A
Other languages
Chinese (zh)
Inventor
孙显
陈佳良
张文凯
闫志远
田宇洁
皮誉洋
张东旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerospace Information Research Institute of CAS
Original Assignee
Aerospace Information Research Institute of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerospace Information Research Institute of CAS filed Critical Aerospace Information Research Institute of CAS
Priority to CN202311609019.1A priority Critical patent/CN117315494A/en
Publication of CN117315494A publication Critical patent/CN117315494A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/22Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G06V10/225Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition based on a marking or identifier characterising the area
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a collaborative concurrency labeling method and system based on regional association, relates to the technical field of remote sensing image interpretation, and aims to solve the technical problem that large-scale remote sensing images are difficult to label with prior-art tools. The method comprises the following steps: acquiring a remote sensing image to be labeled, and generating a task creation instruction according to the remote sensing image to be labeled; in response to the task creation instruction, dividing the remote sensing image to be labeled into labeling regions, and determining a corresponding number of annotators according to the plurality of labeling regions obtained by the division; and sharing labeling information with each annotator in real time, where the labeling information is the content that the annotators produce on the remote sensing image to be labeled using a labeling tool.

Description

Collaborative concurrency labeling method and system based on regional association
Technical Field
The invention relates to the technical field of remote sensing image interpretation, in particular to a collaborative concurrency labeling method and system based on regional association.
Background
With the emergence of pre-trained large models, the development threshold of artificial intelligence has been lowered substantially: problems such as the complexity and generalizability of application scenes are better addressed, the applicable range of artificial intelligence has been greatly extended, and new opportunities and challenges have arisen for its practical deployment. However, the parameters of a large model number in the billions or tens of billions, and massive data is an indispensable part of large-model training and optimization, since the quantity and quality of the training data determine the upper limit of the model's quality. Massive, high-quality labeled data is therefore a basic material of large-model construction, and fast, high-quality data labeling is an indispensable link in building such a model.
At present, sample sets for training on remote sensing image data are mainly produced by independent manual labeling. Such datasets involve many label types, large image sizes and many uses, and labor costs are high; compared with sample data of natural scenes, high-quality remote sensing sample data is very scarce. How to label and establish a large-scale remote sensing image dataset is therefore an urgent problem to be solved.
Disclosure of Invention
In view of the above problems, the invention provides a collaborative concurrency labeling method and a collaborative concurrency labeling system based on regional association.
According to a first aspect of the present invention, there is provided a collaborative concurrency labeling method based on region association, including: acquiring a remote sensing image to be annotated, and generating a task creation instruction according to the remote sensing image to be annotated; responding to a task creation instruction, dividing a labeling area of a remote sensing image to be labeled, and determining a plurality of labeling personnel with corresponding quantity according to a plurality of labeling areas obtained by dividing; and sharing the labeling information to each labeling person in real time, wherein the labeling information is content of labeling the remote sensing image to be labeled by a plurality of labeling persons by using a labeling tool.
According to an embodiment of the present invention, sharing the labeling information to each labeling person in real time includes: acquiring the labeling content of each labeling person in real time; and sharing the labeling content to each labeling person based on the Web service, wherein the Web service is configured with a visual interface, and each labeling person can acquire the labeling content of other labeling persons through the visual interface.
According to an embodiment of the invention, obtaining the remote sensing image to be labeled and generating a task creation instruction according to it comprises: determining a task type according to user requirements; determining the labeling mode corresponding to the task type; and generating the task creation instruction according to the task type and the labeling mode.
According to an embodiment of the invention, the task types include: one of object detection recognition, semantic segmentation, and change detection.
According to an embodiment of the present invention, an annotation tool comprises: one of a rectangular marking tool, a polygonal marking tool, a straight line marking tool, a point marking tool and a circular marking tool.
According to the embodiment of the invention, the collaborative concurrency labeling method based on the region association further comprises the following steps: checking the marked information; and warehousing and storing the marked information after the verification is passed.
According to the embodiment of the invention, before the label information passing the verification is put into storage and stored, the method further comprises the following steps: and performing sample clipping, sample normalization and truth value visualization processing on the labeling information.
According to the embodiment of the invention, under the condition that the task type is change detection, the first image and the second image are simultaneously displayed on the visual interface of the Web service, and the first image and the second image can be linked to display the annotation content, wherein the second image is an image which changes relative to the first image.
According to an embodiment of the present invention, obtaining a remote sensing image to be annotated includes: and carrying out image adjustment, band selection and pyramid generation on the remote sensing image to be marked in advance.
A second aspect of the present invention provides a collaborative concurrency annotation system based on regional federation, comprising: the acquisition module is used for acquiring the remote sensing image to be marked and generating a task creation instruction according to the remote sensing image to be marked; the division module is used for responding to the task creation instruction, dividing the labeling areas of the remote sensing image to be labeled, and determining a plurality of labeling personnel with corresponding quantity according to the plurality of labeling areas obtained by division; and the sharing module is used for sharing the labeling information to each labeling person in real time, wherein the labeling information is the content of labeling the remote sensing image to be labeled by a plurality of labeling persons by using a labeling tool.
According to the collaborative concurrency labeling method and system based on regional association provided by the invention, the labeling mode of multi-region division and task allocation over the remote sensing image to be labeled allows a plurality of annotators to label collaboratively and concurrently. At the same time, task visualization is realized on a Web service, and artificial-intelligence-assisted labeling greatly improves labeling efficiency and quality, solving the technical problem that large-scale remote sensing images are difficult to label in the prior art.
Drawings
The foregoing and other objects, features and advantages of the invention will be apparent from the following description of embodiments of the invention with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates a flow chart of a collaborative concurrency labeling method based on regional association in accordance with an embodiment of the invention;
FIG. 2 schematically illustrates a flow frame diagram for implementing the method based on a Web service according to an embodiment of the invention;
FIG. 3 schematically illustrates a block diagram of a collaborative concurrency labeling system based on regional association in accordance with an embodiment of the invention.
Detailed Description
The present invention will be further described in detail below with reference to specific embodiments and with reference to the accompanying drawings, in order to make the objects, technical solutions and advantages of the present invention more apparent. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The terms "comprises," "comprising," and/or the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; may be mechanically connected, may be electrically connected or may communicate with each other; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the description of the present invention, it should be understood that the terms "longitudinal," "length," "circumferential," "front," "rear," "left," "right," "top," "bottom," "inner," "outer," and the like indicate an orientation or a positional relationship based on that shown in the drawings, merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the subsystem or element in question must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention.
Like elements are denoted by like or similar reference numerals throughout the drawings. Conventional structures or constructions will be omitted when they may cause confusion in the understanding of the invention. And the shape, size and position relation of each component in the figure do not reflect the actual size, proportion and actual position relation.
Similarly, in the description of exemplary embodiments of the invention above, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. The description of the terms "one embodiment," "some embodiments," "example," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present invention, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Where expressions like "at least one of A, B and C" are used, they should generally be interpreted according to the meaning commonly understood by those skilled in the art (e.g., "a system having at least one of A, B and C" shall include, but not be limited to, systems having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
In the technical scheme of the invention, the collection, storage, use, processing, transmission, provision, disclosure and application of the relevant data (including but not limited to personal information of users) all comply with the requirements of relevant laws and regulations, necessary security measures have been taken, and public order and good morals are not violated.
FIG. 1 schematically illustrates a flow chart of a collaborative concurrency labeling method based on regional association according to an embodiment of the present invention.
As shown in FIG. 1, the collaborative concurrency labeling method based on region association in this embodiment includes operations S1-S3.
In operation S1, a remote sensing image to be marked is obtained, and a task creation instruction is generated according to the remote sensing image to be marked.
In this embodiment, firstly, a remote sensing image dataset is obtained from a database, a remote sensing image to be marked is determined, and image adjustment, band selection and pyramid generation processing are performed on the remote sensing image to be marked in advance. Before image annotation, the remote sensing image is adjusted to improve the visual effect of the image, so that the remote sensing image is more suitable for annotation and analysis, for example, the brightness, contrast, tone and the like are adjusted to enable the target to be more clear and visible, the interference of background noise is reduced, and the accuracy of the annotation is improved. The wave band selection is aimed at the condition of multispectral or hyperspectral remote sensing images, interested wave bands are selected for display and analysis according to task requirements, the characteristics of the targets can be highlighted, and labeling personnel can be helped to better understand and label the targets in the images. The pyramid generation is to improve the display performance and interactivity of the image, and the image pyramid is generated before labeling, so that images can be loaded and displayed according to detail requirements of different levels, and the labeling efficiency and user experience are improved.
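The pyramid generation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it only computes the sequence of downsampled level dimensions (halving each step until the scene fits a display tile), and the function name, tile size and halving scheme are assumptions.

```python
def pyramid_levels(width, height, tile=256):
    """Return (width, height) for each pyramid level, halving each step,
    until the image fits within a single display tile on its long side."""
    levels = [(width, height)]
    w, h = width, height
    while max(w, h) > tile:
        w, h = max(1, w // 2), max(1, h // 2)
        levels.append((w, h))
    return levels

# A 4096x2048 scene produces a chain of progressively coarser levels,
# so the viewer can load only the detail the current zoom requires.
print(pyramid_levels(4096, 2048))
```

A real annotation platform would additionally resample and tile the pixel data for each level (e.g. with a raster library), but the level chain above is what lets images "be loaded and displayed according to detail requirements of different levels."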
The task type is then determined according to the user's needs, which may be, for example, one of object detection recognition, semantic segmentation, and change detection.
Further, a corresponding labeling mode is determined according to the task type, for example, a rectangular bounding box is usually used for target detection and identification to represent the position of a target, a color mask or pixel-level labeling is usually used for semantic segmentation to represent the category of each pixel, and a change detection focuses on a region which changes in a remote sensing image, and remote sensing images at different time points need to be compared for labeling. And finally, generating a task creation instruction according to the determined task type and the labeling mode.
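The mapping from task type to labeling mode and the resulting task-creation instruction can be sketched as a small lookup. All names here (the mode strings, the dictionary shape) are illustrative assumptions, not the patent's data model.

```python
# Hypothetical mapping from task type to its default labeling mode,
# following the correspondence described in the text.
TASK_MODES = {
    "object_detection": "bounding_box",     # rectangular boxes for targets
    "semantic_segmentation": "pixel_mask",  # per-pixel category mask
    "change_detection": "linked_pair",      # compare two epochs side by side
}

def create_task_instruction(task_type, image_id):
    """Bundle the task type and its labeling mode into a task-creation
    instruction (operation S1)."""
    if task_type not in TASK_MODES:
        raise ValueError(f"unsupported task type: {task_type}")
    return {"image": image_id, "type": task_type, "mode": TASK_MODES[task_type]}

print(create_task_instruction("change_detection", "scene_042"))
```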
In operation S2, in response to the task creation instruction, labeling areas of the remote sensing image to be labeled are divided, and a corresponding number of labeling personnel are determined according to the multiple labeling areas obtained by the division.
In this embodiment, a task is created in response to the task creation instruction, the remote sensing image to be labeled is divided into regions according to its size, and a corresponding number of annotators is allocated according to the labeling regions obtained by the division. For example, if the image is divided into 9 regions in a 3×3 grid, 9 annotators are allocated, one per region, to label concurrently.
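The 3×3 division in the example above can be sketched as computing one bounding box per region; edge regions absorb any remainder when the image size is not evenly divisible. The function name and box convention are assumptions for illustration.

```python
def divide_regions(width, height, rows, cols):
    """Split an image into rows x cols labeling regions, returning
    (x0, y0, x1, y1) pixel boxes; edge regions absorb the remainder."""
    boxes = []
    for r in range(rows):
        for c in range(cols):
            x0 = c * width // cols
            y0 = r * height // rows
            x1 = (c + 1) * width // cols
            y1 = (r + 1) * height // rows
            boxes.append((x0, y0, x1, y1))
    return boxes

# 3x3 division of a 900x900 image -> 9 regions, one per annotator.
regions = divide_regions(900, 900, 3, 3)
print(len(regions), regions[0])
```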
In operation S3, the labeling information is shared to each labeling person in real time, where the labeling information is content that a plurality of labeling persons label the remote sensing image to be labeled by using a labeling tool.
In the embodiment, the labeling content of each labeling person is acquired in real time, and then the labeling content is shared to each labeling person based on the Web service, wherein the Web service is configured with a visual interface, and each labeling person can acquire the labeling content of other labeling persons through the visual interface.
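The real-time sharing of operation S3 can be sketched as a shared board that pushes every submitted annotation to every annotator's view. This is a deliberately minimal in-memory stand-in: a real deployment would broadcast over the Web service (e.g. WebSockets), and the class and field names are assumptions.

```python
class AnnotationBoard:
    """Minimal in-memory stand-in for the Web service's shared state:
    each annotation an annotator submits is pushed to all annotators'
    views, so everyone sees everyone else's labeling content."""
    def __init__(self, annotators):
        self.views = {name: [] for name in annotators}

    def submit(self, author, annotation):
        record = {"author": author, **annotation}
        for view in self.views.values():  # share with everyone, author included
            view.append(record)
        return record

board = AnnotationBoard(["alice", "bob"])
board.submit("alice", {"tool": "rectangle", "box": (10, 10, 50, 50)})
print(board.views["bob"])
```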
According to an embodiment of the present invention, the collaborative concurrency labeling method based on region association in this embodiment may further include operations S4 to S5:
in operation S4, the labeling information is audited.
In this embodiment, the labeled remote sensing image is reviewed; if a problem is found, the reviewer communicates with the annotator to correct it.
And in operation S5, warehousing and storing the marked information after the verification is passed.
In this embodiment, sample clipping, sample normalization and truth-value visualization are performed on the labeling information once the verification is passed. As required, the labeling result can be clipped to extract regions of interest as samples; for example, a remote sensing image can be clipped along the labeled building regions to obtain independent building samples. To ensure consistency and comparability of the samples, normalization may be applied; for image samples, operations such as size unification, brightness adjustment and color-space conversion ensure that the samples have similar feature representations. To let the user inspect and verify the labeling result, it can be overlaid on or visualized with the original image, intuitively displaying the labeled regions and labels and helping the user understand and use the sample dataset. Finally, the processed remote sensing image is stored in a vector database, which holds the geometric and attribute information of the labeled regions, for subsequent remote sensing data analysis and application.
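The clipping and normalization steps above can be sketched on a plain 2D pixel grid. This is a toy illustration under stated assumptions: real samples would be raster arrays, and min-max scaling stands in here for the "size unification, brightness adjustment" family of normalizations.

```python
def clip_sample(image, box):
    """Crop a labeled region (x0, y0, x1, y1) out of a 2D pixel grid."""
    x0, y0, x1, y1 = box
    return [row[x0:x1] for row in image[y0:y1]]

def normalize(sample):
    """Min-max normalize pixel values to [0, 1] so clipped samples
    have comparable feature ranges."""
    flat = [p for row in sample for p in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1  # avoid division by zero on constant patches
    return [[(p - lo) / span for p in row] for row in sample]

image = [[0, 10, 20, 30],
         [40, 50, 60, 70],
         [80, 90, 100, 110]]
patch = clip_sample(image, (1, 1, 3, 3))  # 2x2 "building" sample
print(normalize(patch))
```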
According to the embodiment of the invention, under the condition that the task type is change detection, the first image and the second image are simultaneously displayed on the visual interface of the Web service, and the first image and the second image can be linked to display the annotation content, wherein the second image is an image which changes relative to the first image.
In the real-time linkage display labeling process, when a labeling person labels one image, the other image can correspondingly display a labeling result, so that the labeling person can intuitively observe the difference between the two images and accurately label a changed region. The real-time linkage display labeling mode can improve labeling efficiency and accuracy, so that a labeling person can better understand the change between images and label the changed region more accurately.
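The linked display for change detection can be sketched as one shared viewport that both image panes render from, so a pan or zoom on either image moves the other to the same ground area. The class and its fields are illustrative assumptions, not the patent's interface.

```python
class LinkedViews:
    """Sketch of linked display for change detection: pan/zoom applied
    in either pane updates a single shared viewport, so both epochs
    always show the same ground area."""
    def __init__(self):
        self.center = (0.0, 0.0)  # viewport center in image coordinates
        self.zoom = 1.0

    def pan(self, dx, dy):
        # Screen-space drag is scaled by zoom before moving the center.
        cx, cy = self.center
        self.center = (cx + dx / self.zoom, cy + dy / self.zoom)

    def set_zoom(self, zoom):
        self.zoom = zoom

views = LinkedViews()  # one shared viewport drives both panes
views.set_zoom(2.0)
views.pan(10, -4)
print(views.center, views.zoom)
```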
Fig. 2 schematically shows a flow-chart framework for implementing the method based on a Web service according to an embodiment of the invention.
As shown in fig. 2, existing remote sensing image labeling systems are designed around a single client and developed for the labeling needs of small amounts of sample data; they cannot effectively organize and distribute large-scale data, which seriously limits labeling efficiency. This embodiment instead adopts a Web-service-based labeling method with real-time multi-person region-division task distribution, and uses an artificial-intelligence-assisted labeling process to improve labeling efficiency and quality.
The user logs in to the system with a user name and password to obtain access rights. A task is then created according to the user's needs; in response to the task-creation instruction, the server divides the remote sensing image to be labeled into labeling regions and determines a corresponding number of annotators according to the regions obtained.

The platform administrator pushes the labeling task to the annotators, who label the remote sensing image with the labeling tools according to the task's requirements and deadline. The visual interface of the Web service displays each annotator's labeling content to the other annotators working on the same image. An annotator can also handle several labeling tasks at once, switching between them through the visual interface, which further improves labeling efficiency. The labeling tools available to the annotators include one of a rectangular labeling tool, a polygonal labeling tool, a straight-line labeling tool, a point labeling tool and a circular labeling tool. When the task type is change detection, linked real-time display of the two images is supported.

After an annotator completes labeling, the result is submitted to the server for review. Quality inspection and review of the labeling results may be performed by the platform administrator or by a dedicated auditor, who can randomly sample a portion of the labeled results to check labeling quality and consistency. If a problem is found, the auditor communicates with the annotator to correct it.
The audited labeling results are stored in a vector database for subsequent remote sensing data analysis and application, and the vector database can store geometric information and attribute information of the labeling area.
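The random spot-check step in the review workflow can be sketched as drawing a reproducible sample of the submitted annotations. The function name, fraction and fixed seed are assumptions for illustration.

```python
import random

def sample_for_audit(annotations, fraction=0.1, seed=0):
    """Randomly draw a fraction of submitted annotations for quality
    inspection, since the reviewer spot-checks rather than re-checking
    every label."""
    k = max(1, int(len(annotations) * fraction))
    rng = random.Random(seed)  # fixed seed keeps an audit reproducible
    return rng.sample(annotations, k)

annotations = [f"ann_{i}" for i in range(50)]
audit_batch = sample_for_audit(annotations, fraction=0.2)
print(len(audit_batch))
```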
FIG. 3 schematically illustrates a block diagram of a collaborative concurrency labeling system based on regional association in accordance with an embodiment of the invention.
As shown in fig. 3, the collaborative concurrency annotation system based on region association of this embodiment includes: an acquisition module 301, a division module 302 and a sharing module 303.
The obtaining module 301 is configured to obtain a remote sensing image to be annotated, and generate a task creation instruction according to the remote sensing image to be annotated.
The dividing module 302 is configured to divide a labeling area of a remote sensing image to be labeled in response to a task creation instruction, and determine a corresponding number of labeling personnel according to the multiple labeling areas obtained by the division.
The sharing module 303 is configured to share labeling information to each labeling person in real time, where the labeling information is content that a plurality of labeling persons label a remote sensing image to be labeled by using a labeling tool.
Any of the plurality of modules of the acquisition module 301, the division module 302, and the sharing module 303 may be combined in one module to be implemented, or any of the plurality of modules may be split into a plurality of modules according to an embodiment of the present invention. Alternatively, at least some of the functionality of one or more of the modules may be combined with at least some of the functionality of other modules and implemented in one module. According to an embodiment of the present invention, at least one of the acquisition module 301, the partitioning module 302, and the sharing module 303 may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or in hardware or firmware in any other reasonable way of integrating or packaging the circuits, or in any one of or a suitable combination of any of the three. Alternatively, at least one of the acquisition module 301, the division module 302 and the sharing module 303 may be at least partially implemented as a computer program module, which when executed may perform the respective functions.
According to the collaborative concurrency labeling system based on regional association, provided by the invention, a plurality of labeling personnel can be used for collaborative concurrency labeling by performing the labeling mode of multi-region division task allocation on the remote sensing image to be labeled, meanwhile, task visualization is realized based on Web service, the labeling efficiency and quality can be greatly improved by utilizing artificial intelligence to assist labeling, and the technical problem that the large-scale remote sensing image is difficult to label in the prior art is solved.
It should be noted that, in the embodiment of the present invention, the collaborative concurrency labeling system based on region association corresponds to the collaborative concurrency labeling method based on region association in the embodiment of the present invention, and the specific implementation details and the technical effects brought by the same are the same, which are not described herein again.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments of the invention can be combined in a variety of ways, even if such combinations are not explicitly recited herein. In particular, the features recited in the various embodiments can be combined without departing from the spirit and teachings of the invention, and all such combinations fall within the scope of the invention.
The embodiments of the present invention are described above. These embodiments are for illustrative purposes only and are not intended to limit the scope of the present invention. Although the embodiments are described separately, this does not mean that the measures in different embodiments cannot be used advantageously in combination. Various alternatives and modifications can be made by those skilled in the art without departing from the scope of the invention, and such alternatives and modifications are intended to fall within that scope.

Claims (10)

1. A collaborative concurrency labeling method based on regional association, characterized by comprising the following steps:
acquiring a remote sensing image to be labeled, and generating a task creation instruction according to the remote sensing image to be labeled;
in response to the task creation instruction, dividing the remote sensing image to be labeled into a plurality of labeling areas, and determining a corresponding number of labeling personnel according to the plurality of labeling areas obtained by the division; and
sharing labeling information to each labeling person in real time, wherein the labeling information is the content produced by the plurality of labeling personnel labeling the remote sensing image to be labeled using a labeling tool.
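By way of illustration, the division and assignment steps of claim 1 can be sketched as a grid partition of the image extent with one annotator per tile. The fixed tile size and the round-robin assignment policy below are assumptions for illustration only; the claim does not specify either.

```python
def divide_labeling_areas(width, height, tile_size):
    """Partition an image extent into rectangular labeling areas (x0, y0, x1, y1)."""
    areas = []
    for y in range(0, height, tile_size):
        for x in range(0, width, tile_size):
            areas.append((x, y, min(x + tile_size, width), min(y + tile_size, height)))
    return areas

def assign_annotators(areas, annotators):
    """Round-robin assignment of one annotator per labeling area (assumed policy)."""
    return {area: annotators[i % len(annotators)] for i, area in enumerate(areas)}

# A 1024x512 scene split into 512-px tiles yields two labeling areas.
areas = divide_labeling_areas(1024, 512, 512)
assignment = assign_annotators(areas, ["alice", "bob"])
```

In practice the number of annotators would be derived from the number of areas, as the claim states; here the mapping direction is simply inverted for brevity.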
2. The collaborative concurrency labeling method based on regional association of claim 1, wherein sharing labeling information to each labeling person in real time comprises:
acquiring the labeling content of each labeling person in real time; and
sharing the labeling content to each labeling person based on a Web service, wherein the Web service is configured with a visual interface through which each labeling person can view the labeling content of the other labeling persons.
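The real-time sharing of claim 2 is typically realized by a publish/subscribe hub behind the Web service (for example over WebSocket, as in the cited CN114386152A). The in-memory hub below is a hypothetical sketch of the broadcast logic only; the network transport and the visual interface are omitted.

```python
class AnnotationHub:
    """Broadcasts each annotator's labeling content to all other annotators."""

    def __init__(self):
        self.inboxes = {}          # annotator name -> updates received from others

    def join(self, name):
        self.inboxes[name] = []

    def publish(self, sender, annotation):
        # Every participant except the sender sees the update immediately.
        for name, inbox in self.inboxes.items():
            if name != sender:
                inbox.append((sender, annotation))

hub = AnnotationHub()
for person in ("alice", "bob", "carol"):
    hub.join(person)
hub.publish("alice", {"tool": "rectangle", "bbox": [10, 10, 50, 40]})
```

In a Web deployment each inbox would correspond to an open WebSocket connection rather than a Python list.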
3. The collaborative concurrency labeling method based on regional association of claim 2, wherein acquiring a remote sensing image to be labeled and generating a task creation instruction according to the remote sensing image to be labeled comprises:
determining a task type according to user requirements;
determining a labeling mode corresponding to the task type; and
generating the task creation instruction according to the task type and the labeling mode.
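The task-type-to-labeling-mode step of claim 3 amounts to a lookup table. The mode names below are illustrative assumptions; the claims fix only the task types (see claim 4), not the modes.

```python
# Hypothetical mapping: claim 4 enumerates the task types, the modes are assumed.
LABELING_MODES = {
    "object_detection": "rectangle",       # bounding boxes
    "semantic_segmentation": "polygon",    # region outlines
    "change_detection": "linked_polygon",  # paired before/after outlines
}

def create_task_instruction(task_type):
    """Build a task creation instruction from a user-selected task type."""
    if task_type not in LABELING_MODES:
        raise ValueError(f"unsupported task type: {task_type}")
    return {"task_type": task_type, "labeling_mode": LABELING_MODES[task_type]}
```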
4. The collaborative concurrency labeling method based on regional association of claim 3, wherein the task type is one of object detection and recognition, semantic segmentation, and change detection.
5. The collaborative concurrency labeling method based on regional association of claim 1, wherein the labeling tool is one of a rectangle labeling tool, a polygon labeling tool, a straight-line labeling tool, a point labeling tool, and a circle labeling tool.
6. The collaborative concurrency labeling method based on regional association of claim 1, further comprising:
auditing the labeling information; and
storing the labeling information in a database after the audit is passed.
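The audit-then-store flow of claim 6 is a gate in front of the database write. The sketch below uses an in-memory list as the "warehouse" and a hypothetical audit rule (non-empty geometry and a known class label); the actual audit criteria are not specified by the claim.

```python
VALID_CLASSES = {"building", "road", "water"}   # assumed label set

def audit(annotation):
    """Hypothetical audit: reject empty geometry or unknown class labels."""
    return bool(annotation.get("geometry")) and annotation.get("label") in VALID_CLASSES

def store_if_audited(annotation, warehouse):
    """Write the annotation to storage only if the audit passes."""
    if audit(annotation):
        warehouse.append(annotation)
        return True
    return False

warehouse = []
store_if_audited({"label": "road", "geometry": [(0, 0), (5, 1)]}, warehouse)  # accepted
store_if_audited({"label": "cat", "geometry": [(0, 0)]}, warehouse)           # rejected
```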
7. The collaborative concurrency labeling method based on regional association of claim 6, wherein before the audited labeling information is stored in the database, the method further comprises:
performing sample clipping, sample normalization, and truth-value visualization on the labeling information.
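The sample clipping and normalization of claim 7 can be sketched on a plain 2-D pixel grid. The fixed-size crop anchored at the annotation and the min-max scaling are assumptions about details the claim leaves open.

```python
def clip_sample(image, x, y, size):
    """Crop a size x size sample whose top-left corner is (x, y)."""
    return [row[x:x + size] for row in image[y:y + size]]

def normalize(sample):
    """Min-max normalize pixel values to the range [0, 1]."""
    flat = [v for row in sample for v in row]
    lo, hi = min(flat), max(flat)
    span = hi - lo or 1                      # guard against constant samples
    return [[(v - lo) / span for v in row] for row in sample]

image = [[0, 50, 100], [150, 200, 250], [10, 20, 30]]
sample = clip_sample(image, 1, 0, 2)         # -> [[50, 100], [200, 250]]
norm = normalize(sample)                     # values rescaled to [0, 1]
```

Truth-value visualization would additionally render the labeled geometry over the clipped sample; that rendering step is omitted here.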
8. The collaborative concurrency labeling method based on regional association of claim 4, wherein, in the case that the task type is change detection, the visual interface of the Web service simultaneously displays a first image and a second image whose labeling content is displayed in linkage, the second image being an image that has changed relative to the first image.
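The linked display of claim 8 means that an annotation drawn on either temporal image is mirrored on the other. The observer-style sketch below assumes a simple event model; the claim requires only the linkage itself.

```python
class LinkedView:
    """One of two synchronized change-detection image panes."""

    def __init__(self, name):
        self.name = name
        self.peer = None
        self.annotations = []

    def link(self, other):
        self.peer, other.peer = other, self

    def add_annotation(self, shape, _from_peer=False):
        self.annotations.append(shape)
        if self.peer and not _from_peer:
            # Mirror the new annotation onto the other pane exactly once.
            self.peer.add_annotation(shape, _from_peer=True)

before, after = LinkedView("t1"), LinkedView("t2")
before.link(after)
after.add_annotation({"type": "polygon", "points": [(0, 0), (3, 0), (3, 2)]})
```

A real interface would also synchronize pan and zoom state between the two panes in the same way.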
9. The collaborative concurrency labeling method based on regional association of claim 1, wherein acquiring the remote sensing image to be labeled comprises:
performing image adjustment, band selection, and pyramid generation on the remote sensing image to be labeled in advance.
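Pyramid generation in claim 9 builds progressively downsampled overview levels so that large remote sensing scenes render quickly at any zoom. A pure-Python sketch using 2x2 averaging follows; production systems typically delegate this to GDAL's overview builder rather than computing levels by hand. Odd-sized edges are simply truncated in this sketch.

```python
def downsample(level):
    """Halve a 2-D image by averaging each 2x2 pixel block."""
    h, w = len(level) // 2, len(level[0]) // 2
    return [[(level[2*y][2*x] + level[2*y][2*x+1] +
              level[2*y+1][2*x] + level[2*y+1][2*x+1]) / 4
             for x in range(w)] for y in range(h)]

def build_pyramid(image, min_size=1):
    """Full-resolution image plus overview levels down to min_size pixels."""
    levels = [image]
    while len(levels[-1]) > min_size and len(levels[-1][0]) > min_size:
        levels.append(downsample(levels[-1]))
    return levels

base = [[1, 3], [5, 7]]
pyramid = build_pyramid(base)    # base level plus one 1x1 overview
```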
10. A collaborative concurrency labeling system based on regional association, comprising:
an acquisition module configured to acquire a remote sensing image to be labeled and generate a task creation instruction according to the remote sensing image to be labeled;
a division module configured to, in response to the task creation instruction, divide the remote sensing image to be labeled into a plurality of labeling areas and determine a corresponding number of labeling personnel according to the plurality of labeling areas obtained by the division; and
a sharing module configured to share labeling information to each labeling person in real time, wherein the labeling information is the content produced by the plurality of labeling personnel labeling the remote sensing image to be labeled using a labeling tool.
CN202311609019.1A 2023-11-29 2023-11-29 Collaborative concurrency labeling method and system based on regional association Pending CN117315494A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311609019.1A CN117315494A (en) 2023-11-29 2023-11-29 Collaborative concurrency labeling method and system based on regional association

Publications (1)

Publication Number Publication Date
CN117315494A true CN117315494A (en) 2023-12-29

Family

ID=89255699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311609019.1A Pending CN117315494A (en) 2023-11-29 2023-11-29 Collaborative concurrency labeling method and system based on regional association

Country Status (1)

Country Link
CN (1) CN117315494A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118428886A (en) * 2024-04-30 2024-08-02 广州水木星尘信息科技有限公司 Cooperative processing platform and cooperative processing system for tunnel data
CN118587711A (en) * 2024-08-07 2024-09-03 陕西航天技术应用研究院有限公司 AI large model distributed remote sensing sample construction and auditing method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111580947A (en) * 2020-04-29 2020-08-25 中国科学院空天信息创新研究院 Online collaborative remote sensing image annotation system based on artificial intelligence
CN113034025A (en) * 2021-04-08 2021-06-25 成都国星宇航科技有限公司 Remote sensing image annotation system and method
CN113220920A (en) * 2021-06-01 2021-08-06 中国电子科技集团公司第五十四研究所 Satellite remote sensing image sample labeling system and method based on micro-service architecture
CN114386152A (en) * 2022-01-14 2022-04-22 中国电建集团昆明勘测设计研究院有限公司 Lightweight BIM model linkage labeling system based on WebSocket
CN116524373A (en) * 2023-05-10 2023-08-01 北京山海础石信息技术有限公司 Remote sensing image sample labeling all-in-one machine and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20231229