This repository was archived by the owner on Feb 15, 2022. It is now read-only.
otel-v1-apm-service-map is always empty #720
Open
Labels
bug (Something isn't working)
Description
Describe the bug
The otel-v1-apm-service-map ES index is always empty, even though span data is being received and data-prepper reports that it is processing incoming records: "INFO com.amazon.dataprepper.pipeline.ProcessWorker - service-map-pipeline Worker: Processing 1 records from buffer"
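For reference, the emptiness can be confirmed with a document count against both indices (a minimal sketch, assuming the default Data Prepper index names — otel-v1-apm-span for raw spans, otel-v1-apm-service-map for the service map — and hypothetical placeholder host/credentials):

import requests

# Hypothetical placeholders; substitute the real ${ELASTICSEARCH_HOST} and credentials.
ES_HOST = "https://my-es-domain.example.com"
AUTH = ("es-user", "es-password")

# The raw span index shows a growing count while otel-v1-apm-service-map stays at 0.
for index in ("otel-v1-apm-span", "otel-v1-apm-service-map"):
    resp = requests.get(f"{ES_HOST}/{index}/_count", auth=AUTH)
    print(index, resp.json().get("count"))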
Expected behavior
I'd expect the otel-v1-apm-service-map index to contain data, or at least to see an error in either the data-prepper logs or the ES logs.
Environment (please complete the following information):
AES (Amazon Elasticsearch Service) 7.10
data-prepper 1.0 (docker)
collector v0.29.0 (docker)
Additional context
Test script used to generate spans:

#!/usr/bin/env python3
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace.export import (
ConsoleSpanExporter,
BatchSpanProcessor,
)
from opentelemetry.propagate import set_global_textmap
from opentelemetry.propagators.b3 import B3Format
from opentelemetry.sdk.resources import Resource
from opentelemetry.instrumentation.django import DjangoInstrumentor
from opentelemetry.instrumentation.psycopg2 import Psycopg2Instrumentor
TRACER_NAME = "open-telemetry"
if __name__ == "__main__":
    # enable b3 propagation
    # set_global_textmap(B3Format())

    # build resource/tags
    resource = Resource(attributes={
        "service.name": "my-local-service",
        "space": "platform",
        "org": "local",
    })

    # initialize the tracer provider
    # TODO: add logic to switch between console or otlp
    trace.set_tracer_provider(
        TracerProvider(resource=resource)
    )

    # processor = BatchSpanProcessor(ConsoleSpanExporter())
    processor = BatchSpanProcessor(
        OTLPSpanExporter(endpoint="[redacted url]", insecure=False)
    )
    trace.get_tracer_provider().add_span_processor(processor)

    # this inserts the OpenTelemetry middleware into settings.MIDDLEWARE
    # DjangoInstrumentor().instrument()

    # psycopg2 instrumentation
    # Psycopg2Instrumentor().instrument()

    # redis instrumentation
    # elastic instrumentation
    # celery instrumentation

    # set the service name, space and org from VCAP
    tracer = trace.get_tracer(__name__)
    with tracer.start_as_current_span("foo"):
        with tracer.start_as_current_span("bar"):
            with tracer.start_as_current_span("baz"):
                print("Hello world from OpenTelemetry Python!")
data-prepper pipeline config:
entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      ssl: false
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:
  sink:
    - elasticsearch:
        hosts: ["${ELASTICSEARCH_HOST}"]
        username: "${ELASTICSEARCH_USERNAME}"
        password: "${ELASTICSEARCH_PASSWORD}"
        trace_analytics_raw: true
service-map-pipeline:
  delay: "100"
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - service_map_stateful:
  sink:
    - elasticsearch:
        hosts: ["${ELASTICSEARCH_HOST}"]
        username: "${ELASTICSEARCH_USERNAME}"
        password: "${ELASTICSEARCH_PASSWORD}"
        trace_analytics_service_map: true
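Data Prepper's own counters can help confirm whether the service-map sink ever writes anything. A minimal sketch, assuming the data-prepper 1.0 container exposes its core API on the default port 4900 (the Prometheus metrics path and metric naming are assumptions; adjust if the port is remapped in docker):

import requests

# Assumed default core-API port for data-prepper.
metrics = requests.get("http://localhost:4900/metrics/prometheus").text

# Metric naming is an assumption; match loosely on the pipeline name
# to compare records coming in against records written by the sink.
for line in metrics.splitlines():
    if "service" in line and "map" in line:
        print(line)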
collector config:
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: "0.0.0.0:4317"
      http:
processors:
  batch:
extensions:
  health_check:
exporters:
  logging:
    logLevel: debug
  otlp:
    endpoint: localhost:21890
    insecure: true
service:
  extensions: [health_check]
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, otlp]
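To rule out the collector hop, the same exporter can be pointed straight at data-prepper's otel_trace_source port (21890 above). A minimal sketch, assuming the script can reach the data-prepper container directly:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Export directly to data-prepper (ssl: false in the pipeline config), skipping the collector.
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="localhost:21890", insecure=True))
)

tracer = trace.get_tracer("direct-smoke-test")
with tracer.start_as_current_span("smoke-test"):
    pass

# Flush buffered spans before the process exits.
trace.get_tracer_provider().shutdown()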