
Panic occurs when using the skywalking-go SDK #31439

Closed
Donghui0 opened this issue Feb 27, 2024 · 5 comments

Donghui0 commented Feb 27, 2024

Component(s)

receiver/skywalking

What happened?

Description

A sporadic panic occurs with the following stack trace:
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x1d pc=0x884742]

goroutine 68 [running]:
google.golang.org/protobuf/internal/impl.sizeMessageSliceInfo({0x269?}, 0xc00e220748, {0xf0?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/codec_field.go:472 +0x42
google.golang.org/protobuf/internal/impl.(*MessageInfo).sizePointerSlow(0xc000366e58, {0x20000000000001e?}, {0x3?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/encode.go:79 +0xd9
google.golang.org/protobuf/internal/impl.(*MessageInfo).sizePointer(0xc00a76adc0?, {0xc00cb22320?}, {0x58?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/encode.go:56 +0x76
google.golang.org/protobuf/internal/impl.sizeMessageSliceInfo({0x200000000419dcb?}, 0xc00151ee60, {0x84?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/codec_field.go:473 +0x55
google.golang.org/protobuf/internal/impl.(*MessageInfo).sizePointerSlow(0xc000366d10, {0xc000046077?}, {0x76?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/encode.go:79 +0xd9
google.golang.org/protobuf/internal/impl.(*MessageInfo).sizePointer(0x284?, {0xd?}, {0x80?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/encode.go:56 +0x76
google.golang.org/protobuf/internal/impl.sizeMessageSliceInfo({0x7efedb8ab1b8?}, 0xc000295990, {0x68?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/codec_field.go:473 +0x55
google.golang.org/protobuf/internal/impl.(*MessageInfo).sizePointerSlow(0xc000366a80, {0xc009e61a68?}, {0x80?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/encode.go:79 +0xd9
google.golang.org/protobuf/internal/impl.(*MessageInfo).sizePointer(0xc009e61ad8?, {0x9?}, {0x80?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/encode.go:56 +0x76
google.golang.org/protobuf/internal/impl.(*MessageInfo).size(0x40802d?, {{}, {0x286dcb8?, 0xc00bbf7950?}, 0x5?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/internal/impl/encode.go:40 +0x52
google.golang.org/protobuf/proto.MarshalOptions.marshal({{}, 0x50?, 0x0, 0x0}, {0x0, 0x0, 0x0}, {0x286dcb8, 0xc00bbf7950})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/proto/encode.go:156 +0x142
google.golang.org/protobuf/proto.MarshalOptions.MarshalAppend({{}, 0x60?, 0x63?, 0x1e?}, {0x0, 0x0, 0x0}, {0x285a880?, 0xc00bbf7950?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/proto/encode.go:125 +0x79
github.com/golang/protobuf/proto.marshalAppend({0x0, 0x0, 0x0}, {0x7efeb03f7900?, 0xc00bbf7950?}, 0x80?)
/data/repo_cache/golang/mod/github.com/golang/[email protected]/proto/wire.go:40 +0xa5
github.com/golang/protobuf/proto.Marshal(...)
/data/repo_cache/golang/mod/github.com/golang/[email protected]/proto/wire.go:23
google.golang.org/grpc/encoding/proto.codec.Marshal({}, {0x21e6360, 0xc00bbf7950})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/encoding/proto/proto.go:45 +0x4e
google.golang.org/grpc.encode({0x7efedb8b28f8?, 0x3952e88?}, {0x21e6360?, 0xc00bbf7950?})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/rpc_util.go:632 +0x44
google.golang.org/grpc.prepareMsg({0x21e6360?, 0xc00bbf7950?}, {0x7efedb8b28f8?, 0x3952e88?}, {0x0, 0x0}, {0x0, 0x0})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/stream.go:1754 +0xd2
google.golang.org/grpc.(*clientStream).SendMsg(0xc00ca3e900, {0x21e6360, 0xc00bbf7950})
/data/repo_cache/golang/mod/google.golang.org/[email protected]/stream.go:875 +0x1b5
skywalking.apache.org/repo/goapi/collect/language/agent/v3.(*traceSegmentReportServiceCollectClient).Send(0x2863028?, 0x2863098?)
/data/repo_cache/golang/mod/skywalking.apache.org/repo/[email protected]/collect/language/agent/v3/Tracing_grpc.pb.go:61 +0x2b
github.com/apache/skywalking-go/agent/reporter.(*gRPCReporter).initSendPipeline.func1()
grpc.go:323 +0x1eb
created by github.com/apache/skywalking-go/agent/reporter.(*gRPCReporter).initSendPipeline
grpc.go:305 +0x6
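
For background, a nil pointer dereference inside protobuf's size/encode functions during a gRPC stream Send is the kind of crash that can appear when the message being marshaled is still mutated by another goroutine. The Go sketch below only illustrates that failure class with well-known protobuf types; it is an assumption about the crash pattern, not the confirmed root cause of this report.

// Illustration only (assumed failure pattern, not the skywalking-go code):
// marshaling a protobuf message while another goroutine keeps appending to a
// repeated field is a data race and can fail deep inside the encode/size path.
package main

import (
	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/types/known/structpb"
)

func main() {
	lst, err := structpb.NewList([]interface{}{"seed"})
	if err != nil {
		panic(err)
	}

	// Writer goroutine keeps growing the repeated Values field.
	go func() {
		for {
			lst.Values = append(lst.Values, structpb.NewStringValue("item"))
		}
	}()

	// Marshal the same message concurrently; under the race this may fail in
	// arbitrary ways, including crashes in the size/encode path like the
	// trace above.
	for i := 0; i < 1_000_000; i++ {
		_, _ = proto.Marshal(lst)
	}
}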

Steps to Reproduce

The panic occurs only occasionally; there are no deterministic steps to reproduce it.

Collector version

0.90.1 (09b5ec2)

Environment information

Environment

OS: CentOS 7
Compiler (if manually compiled): Go 1.20.1

OpenTelemetry Collector configuration

extensions:
  health_check:
    endpoint: 0.0.0.0:13133
    path: "/healthz"
  zpages:
    endpoint: 0.0.0.0:55679
receivers:
  skywalking:
    protocols:
      grpc:
        endpoint: "0.0.0.0:11800"
      http:
        endpoint: "0.0.0.0:12800"
exporters:
  logging/detail:
    loglevel: info
  loadbalancing:
    protocol:
      otlp:
        timeout: 4s
        sending_queue: 
          enabled: true
          num_consumers: 10 
          queue_size: 1000000 
        retry_on_failure:
          enabled: false 
        tls:
          insecure: true
    resolver:
      dns:
        hostname: otel-process-headless-svc
        port: 55680
processors:
  batch:
    timeout: 3s
    send_batch_size: 4096

service:
  pipelines:
    traces:
      receivers: [skywalking]
      processors: [batch]
      exporters: [loadbalancing]
  extensions: [health_check]
  telemetry:
    metrics:
      address: 0.0.0.0:2888

Log output

No log output.

Additional context

No response

Donghui0 added the bug (Something isn't working) and needs triage (New item requiring triage) labels on Feb 27, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

crobert-1 added the priority:p1 (High) label on Feb 27, 2024
JaredTan95 (Member) commented:

@Donghui0 Which version of the skywalking-go SDK are you using?

Donghui0 (Author) commented Mar 4, 2024

> @Donghui0 Which version of the skywalking-go SDK are you using?

We use v0.3.0.

JaredTan95 (Member) commented:

Hi @Donghui0, from the logs you pasted, the skywalking receiver in otel col contrib 0.90.1 uses skywalking.apache.org/repo/goapi v0.0.0-20231026090926-09378dd56587, while your skywalking-go SDK uses the 20230314034821 revision. I'm not sure whether this is a version-mismatch problem; I haven't had a chance to investigate it yet.

The otel col contrib main branch has upgraded goapi to v0.0.0-20240104145220-ba7202308dd4; I will try to reproduce and fix this based on the latest code.
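
To compare the revisions that actually end up in each binary, a minimal Go sketch using the standard runtime/debug package follows (assumption: you can add a small debug log or endpoint to the collector or the instrumented service; the module paths are taken from the trace and the comment above).

// Minimal sketch: print the resolved versions of the modules discussed above
// from inside a running Go binary. Standard library only.
package main

import (
	"fmt"
	"runtime/debug"
	"strings"
)

func main() {
	info, ok := debug.ReadBuildInfo()
	if !ok {
		fmt.Println("binary was built without module information")
		return
	}
	for _, dep := range info.Deps {
		if strings.HasPrefix(dep.Path, "skywalking.apache.org/repo/goapi") ||
			strings.HasPrefix(dep.Path, "google.golang.org/protobuf") {
			fmt.Printf("%s %s\n", dep.Path, dep.Version)
		}
	}
}

Alternatively, running go version -m on the compiled binary prints the same embedded module list without any code changes.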


github-actions bot commented Jun 7, 2024

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label on Jun 7, 2024
Donghui0 closed this as completed on Jul 8, 2024