Hi! I have a question regarding the implementation of build automation with Conan 2.

I have an application that contains multiple libraries. The build automation I am working on should find all merge requests belonging to one issue and perform the merges in every affected library. Afterwards it should compute the build order and, following that order, create the packages for every library that received new merges. In the end, the automation should rebuild the whole application with all the packages created in the previous step.

My guess is that the automation should create a lockfile when it creates the package for the first library that had merges. For every subsequent library package that gets created locally, this lockfile should be updated so that it also contains the new package of that library (a rough sketch of what I have in mind follows below).

Would lockfiles be the right approach for this, or is there an even simpler way with the new Conan 2 API? For example, is there a way to achieve the same behaviour using the graph API?
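For illustration, this is roughly the flow I have in mind, driving the Conan 2 CLI from a small Python script; the folder layout, lockfile name and library list are only placeholders:

```python
# Rough sketch only: folder layout and lockfile name are placeholders.
import subprocess

def conan(*args):
    """Run a Conan 2 CLI command and fail if it returns an error."""
    subprocess.run(["conan", *args], check=True)

# Capture an initial lockfile from the application (the downstream consumer).
conan("lock", "create", "app/conanfile.py", "--lockfile-out=app.lock")

# Create each library that had merges, accumulating its new revision/version
# into the same lockfile so later libraries resolve against it.
for lib in ["libs/liba", "libs/libb"]:
    conan("create", lib,
          "--lockfile=app.lock",
          "--lockfile-out=app.lock",
          "--lockfile-partial",   # tolerate requirements not locked yet
          "--build=missing")

# Finally, rebuild the whole application against the accumulated lockfile.
conan("create", "app/conanfile.py", "--lockfile=app.lock", "--build=missing")
```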
The right approach might depend on the level of consistency and concurrency you need to support.
If concurrency is high, you might want to first capture a lockfile for the application/product, that is, for the most downstream consumer. You then pass this lockfile to every merge, do the merge (or create a merge commit on the fly, i.e. you can build things before the actual merge happens), and then do the `conan export`, updating the lockfile with the new revision or new version depending on the changes. It can also be convenient, for example, to upload the newly exported revisions/versions to a secondary "build" repository (a sketch of this step follows below).
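A minimal sketch of that export-and-upload step, assuming a secondary remote named `build-repo` has already been configured, and with placeholder library folders, reference patterns and lockfile name:

```python
# Sketch only: "product.lock", the library folders, the reference patterns
# and the remote name "build-repo" are placeholders.
import subprocess

def conan(*args):
    subprocess.run(["conan", *args], check=True)

merged_libs = {"libs/liba": "liba/*", "libs/libb": "libb/*"}

for folder, ref_pattern in merged_libs.items():
    # Export the merged recipe, resolving against the product lockfile and
    # writing the new recipe revision/version back into that same lockfile.
    conan("export", folder,
          "--lockfile=product.lock",
          "--lockfile-out=product.lock",
          "--lockfile-partial")

    # Push the newly exported recipe revisions to the secondary "build" repo
    # so that later CI stages can resolve them from there.
    conan("upload", ref_pattern, "--only-recipe", "-r=build-repo", "--confirm")
```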
Once you have the lockfile containing all the new revisions/versions, you use that lockfile with your application/product to compute the build order, and then execute that build order against the "build" repo so that the new revisions are found there too (see the sketch below).
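A sketch of that last stage; note that the exact JSON layout produced by `conan graph build-order --format=json` depends on the Conan 2.x version in use, so the parsing below is only indicative:

```python
# Sketch only: file names, remote names and the JSON keys are placeholders or
# version-dependent details.
import json
import subprocess

def conan(*args, capture=False):
    return subprocess.run(["conan", *args], check=True,
                          capture_output=capture, text=True)

# Compute the build order for the product, honoring the aggregated lockfile.
result = conan("graph", "build-order", "app/conanfile.py",
               "--lockfile=product.lock",
               "--build=missing",
               "--format=json",
               capture=True)
build_order = json.loads(result.stdout)

# Walk the build order level by level; entries within one level could even be
# built in parallel. Here each reference that needs a build is installed,
# resolving from the secondary "build" remote as well as the regular one.
for level in build_order["order"]:          # key name varies across versions
    for recipe in level:
        conan("install",
              f"--requires={recipe['ref']}",
              "--lockfile=product.lock",
              "--build=missing",
              "-r=build-repo", "-r=conancenter")
```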
This orchestration is not a trivial problem; it is intrinsically complex, especially when other CI jobs and other pull requests are building and uploading packages concurrently. It is similarly complex in other languages.
The main problem is that we haven't had the time to document this properly, but yes: overall, using lockfiles, aggregating the changes into a lockfile, and then computing the build order with that lockfile is the way to go.