Performance issue (out of memory) when parsing plan file that makes use of multiple modules #627
Comments
Hi Ryan, sorry about your experience. What was the feature file you were using? Is it possible to share that as well?
Yep, of course.
Yep, I can confirm it consumes around 7.96 GB of memory without […]. I must say I am super surprised by this :D Looking into it.
Expectedly, the same happens without […].
Hey @eerkunt, do you know when there might be a fix for this issue? I have now completed a big refactoring to split our Terraform scripts into smaller chunks; however, we are still experiencing high memory usage in a couple of areas.
Hi @ryanbratten, we had the same issue. Using only one feature file with all scenarios solved our memory problem.
I am sorry you are experiencing this problem, guys. Life has been crazy for the last couple of months for me, unfortunately. I hope I will find some time and look into this issue ASAP.
Description
It looks like terraform-compliance[faster-parsing] uses a large amount of memory to handle a codebase that uses lots of Terraform modules. I've slimmed the feature down to one simple rule, and it is still unable to parse the plan file and run the test after 53 minutes of trying.
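For reference, a minimal sketch of the installation assumed here, with the faster-parsing extra spelled exactly as it appears in this report:

```bash
# Install terraform-compliance with the faster-parsing extra referenced above.
pip install "terraform-compliance[faster-parsing]"
```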
terraform-compliance eventually exits with an out-of-memory error.
To Reproduce
Large plan file using multiple modules (attached below; a typical command sequence is sketched after the attachment).
Running on GitHub free hosted agents, which currently provide 7 GB of RAM.
Plan file:
plan.out.json.txt
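A minimal sketch of the reproduction steps, assuming a root module that pulls in the many child modules and a features/ directory containing the single rule; the file and directory names here are illustrative, not taken from the report:

```bash
# Produce the JSON plan that terraform-compliance consumes.
terraform init
terraform plan -out=plan.out
terraform show -json plan.out > plan.out.json

# Run the feature set against the plan file; on a 7 GB GitHub-hosted
# runner this is the step that reportedly exhausts memory.
terraform-compliance -f features/ -p plan.out.json
```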
Used terraform-compliance parameters: none
Running via Docker: No
Error Output:
Lots of warnings about ambiguous modules, followed by the out-of-memory failure.
Expected Behavior:
Features to be executed
Tested Versions: