
Tuning cores and memory needed by Spark executors #22

Open
ppatierno opened this issue Apr 23, 2018 · 0 comments


@ppatierno (Member)

In order to share the Spark cluster between multiple spark-driver applications, we need to tune the spark-submit parameters related to cores per executor (--executor-cores, --total-executor-cores, --executor-memory, ...). The example starts a one-node Spark cluster with 8 cores.
The same should be considered for memory.

As it stands, the first spark-driver grabs all 8 available cores, so a second one cannot run.
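As a sketch (the flag values, master host, and application path below are illustrative assumptions, not tuned settings), capping each submission at half of the node's cores plus a fixed per-executor memory budget would let two drivers run side by side:

    # Hypothetical spark-submit invocation: limit this application to
    # 4 of the node's 8 cores (2 cores per executor) and 1 GiB of
    # memory per executor, leaving room for a second driver.
    spark-submit \
      --master spark://<spark-master-host>:7077 \
      --total-executor-cores 4 \
      --executor-cores 2 \
      --executor-memory 1g \
      path/to/app.py   # placeholder application

With --total-executor-cores 4 on a standalone cluster, the first application stops claiming cores at 4, and a second spark-submit with the same settings can acquire the remaining 4.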

@ppatierno ppatierno changed the title Tuning cores and memory needed by executors Tuning cores and memory needed by Spark executors Apr 23, 2018