spark-dependencies can not run normally in docker #90
Comments
The address of the ES in the curl command is different from the address in the docker run for spark-dependencies. Could you run the curl with the same address,
or run the docker with the address used in the curl?
yes,
The Spark job can connect to the ES, otherwise it would fail. Maybe the data in the index does not contain enough information to create the links. Could you please share some of the data?
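One way to check this directly (a sketch, assuming Elasticsearch is published on host port 9600 as in the setup below, and that the spans use the standard Jaeger index mapping) is to query the span index for CHILD_OF references, since the job derives dependency links from parent/child span pairs:

```shell
# Hypothetical check: count spans that carry a CHILD_OF reference.
# If this returns 0 hits, the job has no parent/child pairs to link.
curl "http://localhost:9600/jaeger-span-2020-06-04/_search?pretty=true&q=references.refType:CHILD_OF&size=0"
```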
The data is here, please check.
The data looks good, although it would be better to verify it in a different way. I would suggest running the Jaeger HotROD example https://github.com/jaegertracing/jaeger/tree/master/examples/hotrod and then running the span dependencies job. The job should create dependency links. I am adding an example of how to run all the bits as Docker containers, but I would suggest you just run the HotROD example and specify the URL of your deployed collector. The logs from HotROD should not contain messages like
Don't forget to open the UI on port 8080 and create a couple of transactions that will create spans.
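A minimal sketch of running those bits (the collector and ES hosts are placeholders, and it assumes the HotROD image reads the standard JAEGER_ENDPOINT environment variable to report spans directly to a collector):

```shell
# Run HotROD, reporting spans straight to your collector (placeholder host).
docker run --rm -p 8080:8080 \
  -e JAEGER_ENDPOINT=http://<your-collector-host>:14268/api/traces \
  jaegertracing/example-hotrod:latest all

# After clicking around the UI on http://localhost:8080 to generate traces,
# run the dependencies job against the same Elasticsearch:
docker run --rm \
  -e STORAGE=elasticsearch \
  -e ES_NODES=http://<your-es-host>:9200 \
  jaegertracing/spark-dependencies:latest
```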
@pavolloffay thanks a lot, I will try this.
I'm running spark-dependencies in a Docker container, but it seems that it can't run normally: it exits after about 5 seconds. My command is:
docker run -d --env STORAGE=elasticsearch --env ES_NODES=http://192.167.0.x: 9600 jaegertracing/spark-dependencies:latest
and the log is:
20/06/04 06:59:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/06/04 06:59:03 INFO ElasticsearchDependenciesJob: Running Dependencies job for 2020-06-04T00:00Z, reading from jaeger-span-2020-06-04 index, result storing to jaeger-dependencies-2020-06-04
20/06/04 06:59:04 INFO ElasticsearchDependenciesJob: Done, 0 dependency objects created
I use this command to verify the data:
curl http://localhost:9600/jaeger-span-2020-06-04/_search?pretty=true&q=*:*
and it gives me this:
"took" : 19,
"timed_out" : false,
"_shards" : {
"total" : 1,
"successful" : 1,
"skipped" : 0,
"failed" : 0
},
"hits" : {
"total" : 46,
"max_score" : 1.0,
"hits" : [
{
"_index" : "jaeger-span-2020-06-04",
"_type" : "span",
"_id" : "U0SefnIBHXsa-__vuvXR",
"_score" : 1.0,
"_source" : {
"traceID" : "c75708f073c4caf0",
"spanID" : "c75708f073c4caf0",
"flags" : 1,
"operationName" : "grpc-request",
My Jaeger config is:
cat docker-compose.yaml
version: '2'
services:
jaeger-collector:
image: jaegertracing/jaeger-collector
command:
- '--es.num-shards=1'
- '--es.num-replicas=0'
- '--es.server-urls=http://elasticsearch:9200'
- '--collector.zipkin.http-port=9411'
ports:
- '14269'
- '14268:14268'
- '14250'
- '9411:9411'
environment:
- SPAN_STORAGE_TYPE=elasticsearch
- LOG_LEVEL=debug
restart: on-failure
depends_on:
- elasticsearch
jaeger-query:
image: jaegertracing/jaeger-query
command:
- '--es.num-shards=1'
- '--es.num-replicas=0'
- '--es.server-urls=http://elasticsearch:9200'
ports:
- '16686:16686'
- '16687'
environment:
- SPAN_STORAGE_TYPE=elasticsearch
- LOG_LEVEL=debug
restart: on-failure
depends_on:
- elasticsearch
jaeger-agent:
image: jaegertracing/jaeger-agent
command:
- '--reporter.grpc.host-port=jaeger-collector:14250'
- '--reporter.grpc.retry.max=1000'
ports:
- '5775:5775/udp'
- '6831:6831/udp'
- '6832:6832/udp'
- '5778:5778'
environment:
- LOG_LEVEL=debug
restart: on-failure
depends_on:
- jaeger-collector
elasticsearch:
image: 'docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.3'
environment:
- discovery.type=single-node
ports:
- '9600:9200/tcp'
volumes:
- './esdata:/usr/share/elasticsearch/data'
- './eslog:/usr/share/elasticsearch/logs'
Please help me with this, thank you~
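For reference, with the compose file above, Elasticsearch is published on host port 9600 but is reachable as elasticsearch:9200 inside the compose network. A sketch of running the job on that network (the network name is an assumption; Compose usually names it <project-dir>_default, check with `docker network ls`):

```shell
# Attach the job to the compose network and use the in-network ES address.
docker run --rm --network=<project-dir>_default \
  -e STORAGE=elasticsearch \
  -e ES_NODES=http://elasticsearch:9200 \
  jaegertracing/spark-dependencies:latest
```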