About Me

  • I'm Zhang-Each
    • A graduate student at Zhejiang University, majoring in Computer Science
    • Studying Knowledge Graphs (KG) and Natural Language Processing (NLP) in the ZJU-KG lab
  • Learning from open courses released by Stanford/MIT/CMU
  • Blog: link here
  • Notebook: link here
  • Personal Page: link here

Publication

  • Knowledge Graph Completion with Pre-trained Multimodal Transformer and Twins Negative Sampling. (First Author, Accepted by KDD 2022 Undergraduate Consortium, ArXiv)
  • Tele-Knowledge Pre-training for Fault Analysis. (Accepted by ICDE 2023 Industry Track, ArXiv)
  • Modality-Aware Negative Sampling for Multi-modal Knowledge Graph Embedding. (Accepted by IJCNN 2023, ArXiv)
  • CausE: Towards Causal Knowledge Graph Embedding. (Accepted by CCKS 2023, ArXiv)
  • MACO: A Modality Adversarial and Contrastive Framework for Modality-missing Multi-modal Knowledge Graph Completion. (Accepted by NLPCC 2023, ArXiv)
  • Unleashing the Power of Imbalanced Modality Information for Multi-modal Knowledge Graph Completion. (Accepted by COLING 2024, ArXiv)
  • NativE: Multi-modal Knowledge Graph Completion in the Wild. (Accepted by SIGIR 2024, ArXiv)
  • Knowledgeable Preference Alignment for LLMs in Domain-specific Question Answering. (Accepted by ACL 2024 Findings, ArXiv)

Preprint

  • Making Large Language Models Perform Better in Knowledge Graph Completion. (ArXiv)
  • Knowledge Graphs Meet Multi-Modal Learning: A Comprehensive Survey. (ArXiv)
  • MyGO: Discrete Modality Information as Fine-Grained Tokens for Multi-modal Knowledge Graph Completion. (ArXiv)
  • Multi-domain Knowledge Graph Collaborative Pre-training and Prompt Tuning for Diverse Downstream Tasks. (ArXiv)
  • Mixture of Modality Knowledge Experts for Robust Multi-modal Knowledge Graph Completion. (ArXiv)

Stats

Zhang-Each's GitHub Stats (card image)

Pinned

  1. CourseNoteOfZJUSE

    Course notes, past exam papers, and course experience write-ups for ZJU-SE

    395 stars · 78 forks

  2. zjukg/KG-MM-Survey

    Knowledge Graphs Meet Multi-Modal Learning: A Comprehensive Survey

    256 stars · 13 forks

  3. zjukg/MyGO

    [Paper][Preprint 2024] MyGO: Discrete Modality Information as Fine-Grained Tokens for Multi-modal Knowledge Graph Completion

    Python · 202 stars · 4 forks

  4. zjukg/KnowPAT

    [Paper][ACL 2024 Findings] Knowledgeable Preference Alignment for LLMs in Domain-specific Question Answering

    Python · 176 stars · 16 forks

  5. zjukg/KoPA

    [Paper][Preprint 2023] Making Large Language Models Perform Better in Knowledge Graph Completion

    Python · 120 stars · 8 forks

  6. zjukg/NATIVE

    [Paper][SIGIR 2024] NativE: Multi-modal Knowledge Graph Completion in the Wild

    Python · 18 stars · 1 fork