
Add support for ConfigMap files inside of source repositories #279

Open
elmiko opened this issue Oct 29, 2018 · 1 comment

elmiko commented Oct 29, 2018

As a user, I often modify the configuration settings for each application I create. This can lead to a confusing number of different oc new-app ... scripts that I need to write in order to capture each configuration. I would like to be able to capture these settings in my source repository and have the oshinko tooling automatically deploy a cluster configuration for me.

Example

I have an application that uses several external packages. When I want to run this application I might type the following:

oc new-app --template=oshinko-java-spark-build-dc \
           -p APPLICATION_NAME=myapp \
           -p GIT_URI=https://github.com/me/myapp \
           -p APP_MAIN_CLASS=com.me.myapp.Main \
           -p SPARK_OPTIONS='--packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0,com.sparkjava:spark-core:2.5.5,org.glassfish:javax.json:1.0.4  --conf spark.jars.ivy=/tmp/.ivy2'

The SPARK_OPTIONS parameter captures a specific configuration that I need for this application, and I will need to use this command line every time I launch the application. What I would like is to capture these settings in a file that I can check in to my repository and that oshinko will use to configure my cluster.

I would like to be able to check in a file, oshinko-cluster.conf for example, containing configurations that will get deployed for me. The options from the previous command line could then be captured in a file like this:

oshinko-cluster.conf

spark.jars.packages    org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0,com.sparkjava:spark-core:2.5.5,org.glassfish:javax.json:1.0.4
spark.jars.ivy         /tmp/.ivy2
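As a rough illustration of how tooling could read such a file, here is a minimal sketch (not oshinko code) that parses the spark-defaults-style format shown above: one key/value pair per line, separated by whitespace. The function name and handling of comments are assumptions for the sake of the example.

```python
def parse_spark_conf(text):
    """Parse spark-defaults-style config text into a dict.

    Each non-blank, non-comment line is a key followed by whitespace
    and a value (which may itself contain commas or colons).
    """
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition(" ")
        settings[key] = value.strip()
    return settings


conf = """\
spark.jars.packages    org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0,com.sparkjava:spark-core:2.5.5,org.glassfish:javax.json:1.0.4
spark.jars.ivy         /tmp/.ivy2
"""
settings = parse_spark_conf(conf)
```

Each key here corresponds directly to a --conf or --packages flag in the SPARK_OPTIONS value from the earlier command line.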

Implementation

Although the previous example is over-simplified, it could form the basis for thinking about an implementation. It would be nice to create some sort of linkage between files in the source repository and the ConfigMap options currently in use by oshinko.

There could be a mechanism to convert config files directly into ConfigMaps, or perhaps a way to include ConfigMap YAML files inside the repo.
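The first mechanism (file to ConfigMap conversion) can be sketched as follows. This mirrors the shape of the object that `oc create configmap --from-file=...` produces: the file name becomes a key under `data` and the file contents become the value. The names used here (oshinko-cluster-config) are hypothetical, not anything oshinko defines.

```python
import json


def configmap_from_file(name, filename, contents):
    """Build a ConfigMap manifest wrapping a single config file.

    The file lands under .data keyed by its filename, which is the
    same layout `oc create configmap <name> --from-file=<file>` emits.
    """
    return {
        "apiVersion": "v1",
        "kind": "ConfigMap",
        "metadata": {"name": name},
        "data": {filename: contents},
    }


manifest = configmap_from_file(
    "oshinko-cluster-config",          # hypothetical ConfigMap name
    "oshinko-cluster.conf",            # file checked in to the repo
    "spark.jars.ivy    /tmp/.ivy2\n",  # file contents
)
print(json.dumps(manifest, indent=2))
```

A build step could run this kind of conversion at s2i time and apply the resulting manifest, so the checked-in file becomes the cluster configuration automatically.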

Regardless of the implementation chosen, this feature would bring another layer of devops coordination to the oshinko s2i process and make the tooling even easier for users.


elmiko commented Oct 29, 2018

Just to clarify: what I mainly want is to automate the process of making ConfigMaps for my Spark clusters, and to combine that with the ability to check in said configurations.
