add menu entry items for how do i articles
elmiko committed Jul 27, 2018
1 parent 7690337 · commit 7488266
Showing 6 changed files with 6 additions and 0 deletions.
1 change: 1 addition & 0 deletions _howdoi/how-do-i-launch-a-jupyter-notebook.adoc
@@ -1,5 +1,6 @@
 = launch a Jupyter notebook on OpenShift
 :page-layout: howdoi
+:page-menu_entry: How do I?
 
 There are multiple ways to launch a Jupyter notebook on OpenShift with the
 radanalytics.io tooling. You can use the OpenShift console or the `oc` command
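
For context on the `oc` route this article describes, a minimal sketch; the image name, service name, and password value below are illustrative assumptions, not quoted from the commit:

[source,bash]
----
# Launch a notebook server from the radanalytics.io base notebook image
# (image name and env var are assumptions for illustration)
oc new-app radanalyticsio/base-notebook \
  -e JUPYTER_NOTEBOOK_PASSWORD=supersecret

# Expose the resulting service so the notebook is reachable from a browser
oc expose svc/base-notebook
----
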
1 change: 1 addition & 0 deletions _howdoi/how-do-i-recognize-version-clash.adoc
@@ -1,5 +1,6 @@
 = recognize Spark version mismatch between driver, master and/or workers?
 :page-layout: howdoi
+:page-menu_entry: How do I?
 
 It's important that the Spark version running on your driver, master, and
 worker pods all match. Although some versions _might actually_ interoperate,
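
A quick way to spot such a mismatch is to compare the version each pod reports; the pod names below are illustrative placeholders, not taken from the article:

[source,bash]
----
# Print the Spark version baked into the master and worker pods;
# pod names here are placeholders for your actual cluster's pods
oc exec mycluster-m-1-abcde -- spark-submit --version
oc exec mycluster-w-1-fghij -- spark-submit --version
----
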
1 change: 1 addition & 0 deletions _howdoi/how-to-connect-to-cluster.adoc
@@ -1,5 +1,6 @@
 = connect to a cluster to debug / develop?
 :page-layout: howdoi
+:page-menu_entry: How do I?
 
 [source,bash]
 oc run -it --rm dev-shell --image=radanalyticsio/openshift-spark -- spark-shell
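
If the shell should attach to an existing Spark cluster rather than run standalone inside the throwaway pod, the master URL can be appended; the cluster name below is an assumption:

[source,bash]
----
# Same throwaway pod, but pointed at a running cluster's master service;
# "mycluster" is an illustrative cluster name, 7077 is Spark's standalone
# master port
oc run -it --rm dev-shell --image=radanalyticsio/openshift-spark \
  -- spark-shell --master spark://mycluster:7077
----
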
1 change: 1 addition & 0 deletions _howdoi/how-to-connect-to-kafka.adoc
@@ -1,5 +1,6 @@
 = connect to Apache Kafka?
 :page-layout: howdoi
+:page-menu_entry: How do I?
 
 You need to add `--packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.1.0`
 when running `spark-shell`, `spark-submit` or to `SPARK_OPTIONS` for S2I. For
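
As a concrete sketch of the two options the article names (the deployment config name below is an illustrative assumption):

[source,bash]
----
# Option 1: pass the Kafka connector package directly to the shell
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.1.0

# Option 2: for an S2I-built application, set it via SPARK_OPTIONS;
# "myapp" is an illustrative name
oc set env dc/myapp \
  SPARK_OPTIONS='--packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.1.0'
----
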
1 change: 1 addition & 0 deletions _howdoi/how-to-use-spark-configs.adoc
@@ -1,5 +1,6 @@
 = use custom Spark configuration files with my cluster?
 :page-layout: howdoi
+:page-menu_entry: How do I?
 
 Create custom versions of standard Spark configuration files such as `spark-defaults.conf`
 or `spark-env.sh` and put them together in a subdirectory, then create a configmap
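
The configmap step might look like this; the subdirectory and configmap names are illustrative:

[source,bash]
----
# Collect the custom config files in one subdirectory...
mkdir -p sparkconfig
cp spark-defaults.conf spark-env.sh sparkconfig/

# ...then create a configmap from that directory
oc create configmap mysparkconfig --from-file=sparkconfig
----
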
1 change: 1 addition & 0 deletions _howdoi/use-python-packages.adoc
@@ -1,5 +1,6 @@
 = install Python packages in Jupyter notebooks on OpenShift
 :page-layout: howdoi
+:page-menu_entry: How do I?
 :source-highlighter: coderay
 :coderay-css: style
 
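
The article body is cut off in this diff; for orientation only, installing a package from inside a running notebook generally comes down to a pip call (the package name below is an illustrative assumption):

[source,bash]
----
# Run from a notebook cell with a leading "!"; --user targets a
# location the container user can write to
pip install --user pandas
----
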
