Pass the current config location to the forked process upon ssh tunnel connection to kube-api #808
Description
**Is this a BUG REPORT or FEATURE REQUEST?:**

/kind bug
**What happened**:
On Tarmak 0.6.4, after successfully creating a new cluster, running `tarmak -c . kubectl get pods` complains that no local configuration was found. The following output repeats in a loop:
```
tarmak -c . kubectl get pods
INFO[0000] generating terraform code app=tarmak module=terraform
DEBU[0000] created temporary directory: /var/folders/rw/6c489chs0n9gk_lz5fjdtr2m0000gn/T/tarmak-assets998847944
DEBU[0000] restored assets into directory: /var/folders/rw/6c489chs0n9gk_lz5fjdtr2m0000gn/T/tarmak-assets998847944
INFO[0003] initialising terraform app=tarmak module=terraform
DEBU[0004] Initializing modules... app=tarmak module=terraform std=out
DEBU[0004] - module.state app=tarmak module=terraform std=out
DEBU[0004] - module.network app=tarmak module=terraform std=out
DEBU[0004] - module.tagging_control app=tarmak module=terraform std=out
DEBU[0004] - module.bastion app=tarmak module=terraform std=out
DEBU[0004] - module.vault app=tarmak module=terraform std=out
DEBU[0004] app=tarmak module=terraform std=out
DEBU[0004] Initializing the backend... app=tarmak module=terraform std=out
DEBU[0006] app=tarmak module=terraform std=out
DEBU[0006] Initializing provider plugins... app=tarmak module=terraform std=out
DEBU[0007] app=tarmak module=terraform std=out
DEBU[0007] Terraform has been successfully initialized! app=tarmak module=terraform std=out
INFO[0007] validating terraform code app=tarmak module=terraform
INFO[0012] request new certificate from vault (plenv-plenvcluster/pki/k8s/sign/admin) app=tarmak
INFO[0015] new connection to bastion host successful app=tarmak
DEBU[0015] active channel position recieved app=tarmak cluster=hub environment=plenv module=vault
DEBU[0016] time="2019-05-16T11:19:21+01:00" level=fatal msg="unable to find an existing config, run 'tarmak init'" app=tarmak destination=api.plenv-plenvcluster.tarmak.local tunnel=api.plenv-plenvcluster.tarmak.local
WARN[0019] ssh tunnel connecting to Kubernetes API server will close after 10 minutes of inactivity: https://127.0.0.1:54266 app=tarmak
DEBU[0019] trying to connect to https://127.0.0.1:54266 app=tarmak
WARN[0019] error connecting to cluster: Get https://127.0.0.1:54266/version?timeout=32s: dial tcp 127.0.0.1:54266: connect: connection refused app=tarmak
INFO[0019] generating terraform code app=tarmak module=terraform
INFO[0020] initialising terraform app=tarmak module=terraform
DEBU[0021] Initializing modules... app=tarmak module=terraform std=out
DEBU[0021] - module.state app=tarmak module=terraform std=out
DEBU[0021] - module.network app=tarmak module=terraform std=out
DEBU[0021] - module.tagging_control app=tarmak module=terraform std=out
DEBU[0021] - module.bastion app=tarmak module=terraform std=out
DEBU[0021] - module.vault app=tarmak module=terraform std=out
DEBU[0021] app=tarmak module=terraform std=out
DEBU[0021] Initializing the backend... app=tarmak module=terraform std=out
DEBU[0022] app=tarmak module=terraform std=out
DEBU[0022] Initializing provider plugins... app=tarmak module=terraform std=out
DEBU[0023] app=tarmak module=terraform std=out
DEBU[0023] Terraform has been successfully initialized! app=tarmak module=terraform std=out
INFO[0023] validating terraform code app=tarmak module=terraform
INFO[0029] request new certificate from vault (plenv-plenvcluster/pki/k8s/sign/admin) app=tarmak
DEBU[0032] active channel position recieved app=tarmak cluster=hub environment=plenv module=vault
DEBU[0033] time="2019-05-16T11:19:37+01:00" level=fatal msg="unable to find an existing config, run 'tarmak init'" app=tarmak destination=api.plenv-plenvcluster.tarmak.local tunnel=api.plenv-plenvcluster.tarmak.local
WARN[0035] ssh tunnel connecting to Kubernetes API server will close after 10 minutes of inactivity: https://127.0.0.1:54313 app=tarmak
DEBU[0035] trying to connect to https://127.0.0.1:54313 app=tarmak
WARN[0035] error connecting to cluster: Get https://127.0.0.1:54313/version?timeout=32s: dial tcp 127.0.0.1:54313: connect: connection refused app=tarmak
INFO[0035] generating terraform code app=tarmak module=terraform
INFO[0036] initialising terraform app=tarmak module=terraform
DEBU[0037] Initializing modules... app=tarmak module=terraform std=out
DEBU[0037] - module.state app=tarmak module=terraform std=out
DEBU[0037] - module.network app=tarmak module=terraform std=out
DEBU[0037] - module.tagging_control app=tarmak module=terraform std=out
DEBU[0037] - module.bastion app=tarmak module=terraform std=out
[..]
```
**What you expected to happen**:

I would expect Tarmak to use the local configuration from the environment folder (passed via `-c`) rather than search the default `~/.tarmak` directory.
See:
- `tarmak/cmd/tarmak/cmd/tunnel.go`, line 24 in 7b521f8
- `tarmak/pkg/tarmak/ssh/tunnel.go`, line 235 in 7b521f8
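A minimal sketch of the fix direction, with hypothetical names (`buildTunnelCommand` and the exact flag spelling are assumptions for illustration, not Tarmak's actual API): when the tunnel re-execs the tarmak binary, the child's argv should carry the parent's config location instead of relying on the default `~/.tarmak` lookup.

```go
package main

import (
	"fmt"
	"os/exec"
)

// buildTunnelCommand sketches how the forked tunnel process could inherit
// the parent's config location. The function name and flag spelling are
// illustrative assumptions, not Tarmak's real code: the point is that the
// child argv includes the --config-directory value, so the child does not
// fall back to ~/.tarmak and fail with "unable to find an existing config".
func buildTunnelCommand(binary, configDir, dest, destPort, localPort string) *exec.Cmd {
	args := []string{
		"--config-directory", configDir, // propagate the parent's -c value
		"tunnel", dest, destPort, localPort,
	}
	return exec.Command(binary, args...)
}

func main() {
	cmd := buildTunnelCommand(
		"tarmak", ".",
		"api.plenv-plenvcluster.tarmak.local", "6443", "54266",
	)
	fmt.Println(cmd.Args)
}
```

With that, running from an environment folder with `-c .` would keep working in the child process, since the forked command line preserves the config directory instead of implicitly defaulting.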
**How to reproduce it (as minimally and precisely as possible)**:
- Create a new directory containing your `tarmak.yaml` file. Leave `~/.tarmak` empty.
- Run plan and apply on the hub and the cluster, then run `tarmak -c . kubectl get pods`.
**Anything else we need to know?**:
**Environment**:
- Kubernetes version (use `kubectl version`): irrelevant
- Cloud provider or hardware configuration: AWS
- Install tools: Tarmak
- Others: