diff --git a/ci_tutorial/packages_pipeline.rst b/ci_tutorial/packages_pipeline.rst
new file mode 100644
index 00000000000..5c002afc0dc
--- /dev/null
+++ b/ci_tutorial/packages_pipeline.rst
@@ -0,0 +1,41 @@
+Packages pipeline
+==================
+
+
+The **packages pipeline** builds, creates and uploads the package binaries for the different configurations and platforms whenever a
+developer submits changes to the source code of one of the organization's repositories. For example, a developer might make some changes
+to the ``ai`` package, improving some of the library functionality, and bump the version to ``ai/1.1.0``. If the organization needs to
+support both Windows and Linux platforms, then the packages pipeline will build the new ``ai/1.1.0`` for both Windows and Linux before
+considering the changes valid. If any of the configurations fails to build on a specific platform, it is common to consider the
+changes invalid and stop processing them until the code is fixed.
+
+
+For the ``packages pipeline`` we will start with a simple source code change in the ``ai`` recipe, simulating some improvements
+in the ``ai`` package, providing some better algorithms for our game.
+
+✍️ **Let's do the following changes in the ai package**:
+
+- Change the implementation of the ``ai/src/ai.cpp`` function, replacing the message ``Some Artificial`` with ``SUPER BETTER Artificial``.
+- Change the default ``intelligence=0`` value in ``ai/include/ai.h`` to a new ``intelligence=50`` default.
+- Finally, let's bump the version. As we made some changes to the package's public headers, it is advisable to bump the ``minor`` version,
+  so let's edit the ``ai/conanfile.py`` file and define ``version = "1.1.0"`` there (instead of the previous ``1.0``). Note that if we
+  had made breaking changes to the ``ai`` public API, the recommendation would be to bump the major version instead and create a new ``2.0`` version. 
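The reasoning behind choosing ``1.1.0`` can be sketched as a tiny semantic-versioning helper. This is a hypothetical illustration (the function and its inputs are not part of the tutorial's code), but it captures the rule stated above: breaking API changes bump the major version, compatible header/feature changes bump the minor one, and internal fixes bump the patch:

```python
def next_version(current: str, change: str) -> str:
    """Pick the next version for a package, given the kind of change.

    Hypothetical helper, not part of the tutorial's code.
    """
    # pad short versions like "1.0" to three components
    major, minor, patch = (list(map(int, current.split("."))) + [0, 0])[:3]
    if change == "breaking":   # public API broke -> bump major
        return f"{major + 1}.0.0"
    if change == "feature":    # public headers/behavior changed compatibly -> bump minor
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"  # internal fix -> bump patch

print(next_version("1.0", "feature"))  # -> 1.1.0, the bump used in this tutorial
```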
+
+
+The **packages pipeline** will take care of building the different package binaries for the new ``ai/1.1.0`` and upload them to the ``packages``
+binary repository, to avoid disrupting or causing potential issues to other developers and CI jobs.
+If the pipeline succeeds, it will promote (copy) them to the ``products`` binary repository; otherwise, it will stop.
+
+There are different aspects that need to be taken into account when building these binary packages for ``ai/1.1.0``. The following tutorial sections do the same
+job, but under different assumptions. They are explained in increasing order of complexity.
+
+Note that all of the commands can be found in the repository ``run_example.py`` file. This file is mostly intended for maintainers and testing,
+but it might be useful as a reference in case of issues.
+
+
+.. toctree::
+   :maxdepth: 1
+
+   packages_pipeline/single_configuration
+   packages_pipeline/multi_configuration
+   packages_pipeline/multi_configuration_lockfile
diff --git a/ci_tutorial/packages_pipeline/multi_configuration.rst b/ci_tutorial/packages_pipeline/multi_configuration.rst
new file mode 100644
index 00000000000..c6f203ce3c6
--- /dev/null
+++ b/ci_tutorial/packages_pipeline/multi_configuration.rst
@@ -0,0 +1,195 @@
+Package pipeline: multi configuration
+=====================================
+
+In the previous section we were building just 1 configuration. This section will cover the case in which we need to build more
+than 1 configuration. We will use the ``Release`` and ``Debug`` configurations here for convenience, as they are easier to
+follow, but in real cases these configurations will be more like Windows, Linux, OSX, building for different architectures,
+cross-building, etc.
+
+Let's begin by cleaning our cache and initializing only the ``develop`` repo:
+
+
+.. code-block:: bash
+
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*"  # Make sure no other remotes defined
+    # Add develop repo, you might need to adjust this URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
+
+
+We will create the packages for the 2 configurations sequentially on our computer, but note that these builds would usually run
+on different machines, and CI systems typically launch the builds of the different configurations in parallel.
+
+.. code-block:: bash
+   :caption: Release build
+
+    $ cd ai
+    $ conan create . --build="missing:ai/*" -s build_type=Release --format=json > graph.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
+    # Add packages repo, you might need to adjust this URL
+    $ conan remote add packages http://localhost:8081/artifactory/api/conan/packages
+    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_release.json
+
+We have introduced a few changes and extra steps:
+
+- The first step is similar to the one in the previous section, a ``conan create``, just making our configuration
+  ``-s build_type=Release`` explicit for clarity, and capturing the output of the ``conan create`` in a ``graph.json`` file.
+- The second step creates, from the ``graph.json``, a ``built.json`` **package list** file with the packages that need to be uploaded.
+  In this case, only the packages that have been built from source (``--graph-binaries=build``) will be uploaded. This is
+  done for efficiency and faster uploads.
+- The third step defines the ``packages`` repository.
+- Finally, we upload the ``built.json`` package list to the ``packages`` repository, creating the ``uploaded_release.json``
+  package list with the new location of the packages (the server repository).
+
+Likewise, the Debug build will follow the same steps:
+
+
+.. code-block:: bash
+   :caption: Debug build
+
+    $ conan create . --build="missing:ai/*" -s build_type=Debug --format=json > graph.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
+    # Remote definition can be omitted in the tutorial, it was defined above (-f == force)
+    $ conan remote add packages http://localhost:8081/artifactory/api/conan/packages -f
+    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_debug.json
+
+
+When both the Release and Debug configurations finish successfully, we would have these packages in the repositories:
+
+.. graphviz::
+   :align: center
+
+   digraph repositories {
+       node [fillcolor="lightskyblue", style=filled, shape=box]
+       rankdir="LR";
+       subgraph cluster_0 {
+           label="Packages server";
+           style=filled;
+           color=lightgrey;
+           subgraph cluster_1 {
+               label = "packages\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "packages" [style=invis];
+               "ai/1.1.0\n (Release)";
+               "ai/1.1.0\n (Debug)";
+           }
+           subgraph cluster_2 {
+               label = "products\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "products" [style=invis];
+           }
+           subgraph cluster_3 {
+               rankdir="BT";
+               shape = "box";
+               label = "develop repository";
+               color=lightblue;
+               rankdir="BT";
+
+               node [fillcolor="lightskyblue", style=filled, shape=box]
+               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
+               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
+               "mapviewer/1.0" -> "graphics/1.0";
+               "game/1.0" [fillcolor="lightgreen"];
+               "mapviewer/1.0" [fillcolor="lightgreen"];
+           }
+           {
+               edge[style=invis];
+               "packages" -> "products" -> "game/1.0" ;
+               rankdir="BT";
+           }
+       }
+   }
+
+
+When all the different binaries for ``ai/1.1.0`` have been built correctly, the ``package pipeline`` can consider its job successful and decide
+to promote those binaries. But further package builds and checks are necessary, so instead of promoting them to the ``develop`` repository,
+the ``package pipeline`` can promote them to the ``products`` binary repository. 
As all other developers and CI use the ``develop`` repository,
+no one will be broken at this stage either:
+
+.. code-block:: bash
+   :caption: Promoting from packages->products
+
+    # aggregate the package list
+    $ conan pkglist merge -l uploaded_release.json -l uploaded_debug.json --format=json > uploaded.json
+
+    # Promotion using Conan download/upload commands
+    # (slow, can be improved with art:promote custom command)
+    $ conan download --list=uploaded.json -r=packages --format=json > promote.json
+    $ conan upload --list=promote.json -r=products -c
+
+
+The first step uses the ``conan pkglist merge`` command to merge the package lists from the "Release" and "Debug" configurations
+into a single ``uploaded.json`` package list.
+This list is the one that will be used to run the promotion.
+
+In this example we are using a slow ``conan download`` + ``conan upload`` promotion. This can be made much more efficient with
+the ``conan art:promote`` extension command.
+
+After running the promotion we will have the following packages in the server:
+
+.. 
graphviz:: + :align: center + + digraph repositories { + node [fillcolor="lightskyblue", style=filled, shape=box] + rankdir="LR"; + subgraph cluster_0 { + label="Packages server"; + style=filled; + color=lightgrey; + subgraph cluster_1 { + label = "packages\n repository" + shape = "box"; + style=filled; + color=lightblue; + "packages" [style=invis]; + "ai/1.1.0\n (Release)"; + "ai/1.1.0\n (Debug)"; + } + subgraph cluster_2 { + label = "products\n repository" + shape = "box"; + style=filled; + color=lightblue; + "products" [style=invis]; + "ai/promoted release" [label="ai/1.1.0\n (Release)"]; + "ai/promoted debug" [label="ai/1.1.0\n (Debug)"]; + } + subgraph cluster_3 { + rankdir="BT"; + shape = "box"; + label = "develop repository"; + color=lightblue; + rankdir="BT"; + + node [fillcolor="lightskyblue", style=filled, shape=box] + "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0"; + "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0"; + "mapviewer/1.0" -> "graphics/1.0"; + "game/1.0" [fillcolor="lightgreen"]; + "mapviewer/1.0" [fillcolor="lightgreen"]; + } + { + edge[style=invis]; + "packages" -> "products" -> "game/1.0" ; + rankdir="BT"; + } + } + } + + +To summarize: + +- We built 2 different configurations, ``Release`` and ``Debug`` (could have been Windows/Linux or others), and uploaded them + to the ``packages`` repository. +- When all package binaries for all configurations were successfully built, we promoted them from the ``packages`` to the + ``products`` repository, to make them available for the ``products pipeline``. +- **Package lists** were captured in the package creation process and merged into a single one to run the promotion. + + +There is still an aspect that we haven't considered yet, the possibility that the dependencies of ``ai/1.1.0`` change +during the build. Move to the next section to see how to use lockfiles to achieve more consistent multi-configuration builds. 
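As a side note, the semantics of the ``conan pkglist merge`` step above can be illustrated with a toy sketch. The structure below is a deliberately simplified, hypothetical stand-in for Conan's real package-list JSON (which is more nested), but the merge logic is the same idea: union the recipe references and, per reference, union the binary package ids coming from each configuration:

```python
def merge_pkglists(*pkg_lists):
    """Union several simplified package lists shaped {ref: [package_ids]}."""
    merged = {}
    for pkg_list in pkg_lists:
        for ref, pkg_ids in pkg_list.items():
            merged.setdefault(ref, set()).update(pkg_ids)
    return {ref: sorted(ids) for ref, ids in merged.items()}

# One list per configuration, as captured by the Release and Debug uploads
release = {"ai/1.1.0": ["pid-release"]}
debug = {"ai/1.1.0": ["pid-debug"]}
print(merge_pkglists(release, debug))  # one ref, both binaries
```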
diff --git a/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
new file mode 100644
index 00000000000..ff6032e8897
--- /dev/null
+++ b/ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst
@@ -0,0 +1,147 @@
+Package pipeline: multi configuration using lockfiles
+=====================================================
+
+In the previous example, we built both ``Debug`` and ``Release`` package binaries for ``ai/1.1.0``. In real-world scenarios the binaries to build would target different platforms (Windows, Linux, embedded) and different architectures, and very often it will not be possible to build them on the same machine, so different computers will be required.
+
+The previous example had an important assumption: the dependencies of ``ai/1.1.0`` do not change at all during the building process. In many scenarios this assumption will not hold, for example if there are other concurrent CI jobs and one successful job publishes a new ``mathlib/1.1`` version in the ``develop`` repo.
+
+Then it is possible that one build of ``ai/1.1.0``, for example the one running on the Linux servers, starts earlier and uses the previous ``mathlib/1.0`` version as a dependency, while the Windows servers start a bit later and their build uses the more recent ``mathlib/1.1`` version. This is a very undesirable situation: binaries for the same ``ai/1.1.0`` version using different dependency versions. It can lead to graph resolution problems later or, even worse, to a release with different behavior on different platforms.
+
+The way to avoid this discrepancy in dependencies is to force the usage of the same dependency versions and revisions, something that can be done with :ref:`lockfiles`.
+
+Creating and applying lockfiles is relatively straightforward. 
The process of creating and promoting the configurations will be identical to the previous section, just additionally applying the lockfiles.
+
+Creating the lockfile
+---------------------
+
+Let's make sure as usual that we start from a clean state:
+
+.. code-block:: bash
+
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*"  # Make sure no other remotes defined
+    # Add develop repo, you might need to adjust this URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
+
+
+Then we can create the ``conan.lock`` lockfile:
+
+.. code-block:: bash
+
+    # Capture a lockfile for the Release configuration
+    $ conan lock create . -s build_type=Release --lockfile-out=conan.lock
+    # extend the lockfile so it also covers the Debug configuration
+    # in case there are Debug-specific dependencies
+    $ conan lock create . -s build_type=Debug --lockfile=conan.lock --lockfile-out=conan.lock
+
+Note that different configurations, using different profiles or settings, could result in different dependency graphs. A single lockfile can be used to lock the different configurations, but it is important to iterate over the different configurations/profiles and capture their information in the lockfile.
+
+.. note::
+
+   ``conan.lock`` is the default lockfile argument, so if a ``conan.lock`` file exists, it might be automatically used by ``conan install/create`` and other graph commands. This can simplify many of the commands, but this tutorial shows the fully explicit commands for clarity and didactic reasons.
+
+The ``conan.lock`` file can be inspected; it will look something like:
+
+.. code-block:: json
+
+    {
+        "version": "0.5",
+        "requires": [
+            "mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea%1724319985.398"
+        ],
+        "build_requires": [],
+        "python_requires": [],
+        "config_requires": []
+    }
+
+As we can see, it is locking the ``mathlib/1.0`` dependency version and revision. 
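For illustration, the locked references can also be extracted programmatically. The sketch below is a hypothetical helper, not a Conan API; it simply parses the ``name/version#revision%timestamp`` strings in the lockfile format shown above:

```python
import json

# A trimmed-down version of the conan.lock shown above
lock_text = """
{
    "version": "0.5",
    "requires": [
        "mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea%1724319985.398"
    ]
}
"""

def locked_refs(text):
    """Return (ref, revision) pairs for every locked requirement."""
    lock = json.loads(text)
    result = []
    for entry in lock.get("requires", []):
        ref_rev = entry.split("%", 1)[0]      # drop the trailing timestamp
        ref, _, rev = ref_rev.partition("#")  # split name/version from revision
        result.append((ref, rev))
    return result

print(locked_refs(lock_text))  # [('mathlib/1.0', 'f2b05681ed843bf50d8b7b7bdb5163ea')]
```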
+
+
+With the lockfile, creating the different configurations is exactly the same, but providing the ``--lockfile=conan.lock`` argument to the ``conan create`` step guarantees that ``mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea`` will always be the exact dependency used, regardless of whether new ``mathlib/1.1`` versions or new revisions become available. The following builds could be launched in parallel and executed at different times, and they will still always use the same ``mathlib/1.0`` dependency:
+
+
+.. code-block:: bash
+   :caption: Release build
+
+    $ cd ai
+    $ conan create . --build="missing:ai/*" --lockfile=conan.lock -s build_type=Release --format=json > graph.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
+    # Add packages repo, you might need to adjust this URL
+    $ conan remote add packages http://localhost:8081/artifactory/api/conan/packages
+    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_release.json
+
+.. code-block:: bash
+   :caption: Debug build
+
+    $ conan create . --build="missing:ai/*" --lockfile=conan.lock -s build_type=Debug --format=json > graph.json
+    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
+    # Remote definition can be omitted in the tutorial, it was defined above (-f == force)
+    $ conan remote add packages http://localhost:8081/artifactory/api/conan/packages -f
+    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_debug.json
+
+Note that the only modification to the previous example is the addition of ``--lockfile=conan.lock``. The promotion will also be identical to the previous one:
+
+.. code-block:: bash
+   :caption: Promoting from packages->products
+
+    # aggregate the package list
+    $ conan pkglist merge -l uploaded_release.json -l uploaded_debug.json --format=json > uploaded.json
+
+    # Promotion using Conan download/upload commands
+    # (slow, can be improved with art:promote custom command)
+    $ conan download --list=uploaded.json -r=packages --format=json > promote.json
+    $ conan upload --list=promote.json -r=products -c
+
+And the final result will be the same as in the previous section, but this time with the guarantee that both ``Debug`` and ``Release`` binaries were built using exactly the same ``mathlib`` version:
+
+.. graphviz::
+   :align: center
+
+   digraph repositories {
+       node [fillcolor="lightskyblue", style=filled, shape=box]
+       rankdir="LR";
+       subgraph cluster_0 {
+           label="Packages server";
+           style=filled;
+           color=lightgrey;
+           subgraph cluster_1 {
+               label = "packages\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "packages" [style=invis];
+               "ai/1.1.0\n (Release)";
+               "ai/1.1.0\n (Debug)";
+           }
+           subgraph cluster_2 {
+               label = "products\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "products" [style=invis];
+               "ai/promoted release" [label="ai/1.1.0\n (Release)"];
+               "ai/promoted debug" [label="ai/1.1.0\n (Debug)"];
+           }
+           subgraph cluster_3 {
+               rankdir="BT";
+               shape = "box";
+               label = "develop repository";
+               color=lightblue;
+               rankdir="BT";
+
+               node [fillcolor="lightskyblue", style=filled, shape=box]
+               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
+               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
+               "mapviewer/1.0" -> "graphics/1.0";
+               "game/1.0" [fillcolor="lightgreen"];
+               "mapviewer/1.0" [fillcolor="lightgreen"];
+           }
+           {
+               edge[style=invis];
+               "packages" -> "products" -> "game/1.0" ;
+               rankdir="BT";
+           }
+       }
+   }
+
+Now that we have the new ``ai/1.1.0`` binaries in the ``products`` repo, we can consider the ``packages pipeline`` finished and move to the next section, and build and 
check our products to see if this new ``ai/1.1.0`` version integrates correctly.
diff --git a/ci_tutorial/packages_pipeline/single_configuration.rst b/ci_tutorial/packages_pipeline/single_configuration.rst
new file mode 100644
index 00000000000..250c9ca265e
--- /dev/null
+++ b/ci_tutorial/packages_pipeline/single_configuration.rst
@@ -0,0 +1,102 @@
+Package pipeline: single configuration
+======================================
+
+We will start with the simplest case, in which we only have to build 1 configuration, and that configuration
+can be built on the current CI machine.
+
+As we described before while presenting the different server binary repositories, the idea is that package builds
+will by default use only the ``develop`` repo, which is considered the stable one for developers and CI jobs.
+
+Let's make sure we start from a clean state:
+
+.. code-block:: bash
+
+    $ conan remove "*" -c  # Make sure no packages from last run
+    $ conan remote remove "*"  # Make sure no other remotes defined
+    # Add only the develop repo, you might need to adjust this for your URL
+    $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop
+
+
+The removal and addition of repos across this tutorial can be a bit tedious, but it is important for the correct
+behavior. Also, there might be other setups that are even more efficient for some cases, like re-triggering
+a job broken by a CI malfunction, but we will keep it simple for the moment and focus on the main concepts.
+
+With this configuration the CI job could just do:
+
+.. code-block:: bash
+
+    $ cd ai
+    $ conan create . --build="missing:ai/*"
+    ...
+    ai/1.1.0: SUPER BETTER Artificial Intelligence for aliens (Release)!
+    ai/1.1.0: Intelligence level=50
+
+
+Note that the ``--build="missing:ai/*"`` might not be fully necessary in some cases, but it can save time in other situations. 
+For example, if the developer made some changes only to the repo README and didn't bump the version at all, Conan will not
+generate a new ``recipe revision``, detecting this as a no-op and avoiding an unnecessary rebuild of the binaries from source.
+
+If we are in a single-configuration scenario and it built correctly, for this simple case we don't need a promotion;
+uploading the built packages directly to the ``products`` repository is enough, and the ``products pipeline``
+will pick them up later.
+
+
+.. code-block:: bash
+
+    # We don't want to disrupt developers or CI, upload to products
+    # Add products repo, you might need to adjust this URL
+    $ conan remote add products http://localhost:8081/artifactory/api/conan/products
+    $ conan upload "ai*" -r=products -c
+
+As the cache was initially clean, all the ``ai`` packages are the ones that were built in this pipeline.
+
+
+.. graphviz::
+   :align: center
+
+   digraph repositories {
+       node [fillcolor="lightskyblue", style=filled, shape=box]
+       rankdir="LR";
+       subgraph cluster_0 {
+           label="Packages server";
+           style=filled;
+           color=lightgrey;
+           subgraph cluster_1 {
+               label = "packages\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "packages" [style=invis];
+           }
+           subgraph cluster_2 {
+               label = "products\n repository"
+               shape = "box";
+               style=filled;
+               color=lightblue;
+               "products" [style=invis];
+               "ai/1.1.0\n (single config)";
+           }
+           subgraph cluster_3 {
+               rankdir="BT";
+               shape = "box";
+               label = "develop repository";
+               color=lightblue;
+               rankdir="BT";
+
+               node [fillcolor="lightskyblue", style=filled, shape=box]
+               "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
+               "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
+               "mapviewer/1.0" -> "graphics/1.0";
+               "game/1.0" [fillcolor="lightgreen"];
+               "mapviewer/1.0" [fillcolor="lightgreen"];
+           }
+           {
+               edge[style=invis];
+               "packages" -> "products" -> "game/1.0" ;
+               rankdir="BT";
+           }
+       }
+   }
+
+
+This was a very simple scenario, let's 
move to a more realistic one: having to build more than one configuration.
diff --git a/ci_tutorial/products_pipeline.rst b/ci_tutorial/products_pipeline.rst
new file mode 100644
index 00000000000..1c5a7287cbf
--- /dev/null
+++ b/ci_tutorial/products_pipeline.rst
@@ -0,0 +1,103 @@
+Products pipeline
+==================
+
+The **products pipeline** responds to a more challenging question: do my "products" build correctly with the new versions of
+the packages and their dependencies? This is the real "Continuous Integration" part, in which changes in different packages are really tested against the organization's
+important products, to check whether things integrate cleanly or break.
+
+Let's continue with the example above: if we now have a new ``ai/1.1.0`` package,
+is it going to break the existing ``game/1.0`` and/or ``mapviewer/1.0`` applications? Is it necessary to re-build from source some of the existing
+packages that depend directly or indirectly on the ``ai`` package? In this tutorial we use ``game/1.0`` and ``mapviewer/1.0`` as our "products",
+but this concept will be further explained later, and especially why it is important to think in terms of "products" instead of trying to explicitly
+model the dependencies top-down in the CI.
+
+The essence of this **products pipeline** in our example is that the new ``ai/1.1.0`` version that was uploaded to the ``products`` repository
+automatically falls into the valid version ranges, and our versioning approach means that such a minor version increase will require building
+its consumers from source, in this case ``engine/1.0`` and ``game/1.0``, in that specific sequential order, while all the other packages will remain the same.
+Knowing which packages need to be built from source and in which order, and executing that build to check if the main organization products keep
+working correctly with the new dependency versions, is the responsibility of the products pipeline. 
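The "which consumers need rebuilding, and in what order" computation described above can be sketched with a toy reverse-dependency walk. This is a hypothetical illustration using this tutorial's package names; a real pipeline would rely on ``conan graph build-order`` instead:

```python
# Direct dependencies of each package, as in this tutorial's graph
deps = {
    "game/1.0": ["engine/1.0"],
    "engine/1.0": ["ai/1.1.0", "graphics/1.0"],
    "ai/1.1.0": ["mathlib/1.0"],
    "graphics/1.0": ["mathlib/1.0"],
    "mapviewer/1.0": ["graphics/1.0"],
}

def rebuild_order(changed):
    """Return the consumers of `changed` that must be rebuilt, in build order."""
    affected = {changed}
    while True:  # transitively collect everything that depends on `changed`
        new = {p for p, ds in deps.items() if any(d in affected for d in ds)} - affected
        if not new:
            break
        affected |= new
    affected.discard(changed)
    order = []
    while affected:  # a package can build once its affected dependencies are done
        ready = {p for p in affected if not any(d in affected for d in deps[p])}
        order.extend(sorted(ready))
        affected -= ready
    return order

print(rebuild_order("ai/1.1.0"))  # -> ['engine/1.0', 'game/1.0']
```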
+
+
+What are the **products**
+-------------------------
+
+The **products** are the main software artifacts that an organization (a company, a team, a project) delivers as its final result, providing
+value for the users of those artifacts. In this example we will consider ``game/1.0`` and ``mapviewer/1.0`` the "products". Note that it is
+possible to define different versions of the same package as products; for example, if we had to maintain different versions of the ``game`` for
+different customers, we could have ``game/1.0`` and ``game/2.3``, as well as different versions of ``mapviewer``, as products.
+
+The "products" approach, besides the advantage of focusing on the business value, has another very important advantage: it avoids having to model
+the dependency graph at the CI layer. A frequent temptation is to model the inverse dependency graph, that is, to represent at the CI level
+the dependants or consumers of a given package. In our example, if we had configured a job for building the ``ai`` package, we could have another
+job for the ``engine`` package that is triggered after the ``ai`` one, configuring such a topology somehow in the CI system.
+
+But this approach does not scale at all and has very important limitations:
+
+- The example above is relatively simple, but in practice dependency graphs can have many more packages, even several hundred, making it very tedious and error-prone to define all the dependencies among packages in the CI.
+- Dependencies evolve over time: new versions are used, some dependencies are removed and newer dependencies are added. The simple relationships between repositories modeled at the CI level can result in a very inefficient, slow and time-consuming CI, if not a fragile one that continuously breaks because some dependencies change. 
+- The combinatorial explosion that happens downstream in a dependency graph, where a relatively stable top dependency, let's say ``mathlib/1.0``, might be used by multiple consumers such as ``ai/1.0``, ``ai/1.1``, ``ai/1.2``, each of which might in turn be used by multiple different ``engine`` versions, and so on. Building only the latest version of the consumers would be insufficient in many cases, and building all of them would be extremely costly.
+- The "inverse" dependency model, that is, asking what are the "dependants" of a given package, is extremely challenging in practice, especially in a decentralized
+  approach like Conan's, in which packages can be stored in different repositories, including different servers, and there isn't a central database of all packages and their relations.
+  Also, the "inverse" dependency model is, similar to the direct one, conditional. As a dependency can be conditional on any configuration (settings, options), the inverse is
+  also conditioned to the same logic, and such logic also evolves and changes with every new revision and version.
+
+In C and C++ projects the "products" pipeline becomes more necessary and critical than in other languages, due to the compilation model, in which the textual inclusion of headers becomes part of the consumers' binary artifacts, and due to the native linkage models.
+
+
+Building new binaries for intermediate packages
+-----------------------------------------------
+
+A frequently asked question is what the version of a consumer package should be when it builds against a new dependency version.
+To put it explicitly for our example, where we have defined that the ``engine/1.0`` package needs to be built again because it now
+depends on the new ``ai/1.1.0`` version:
+
+- Should we create a new ``engine/1.1`` version to build against the new ``ai/1.1.0``?
+- Or should we keep the ``engine/1.0`` version?
+
+The answer lies in the :ref:`binary model and how dependencies affect the package_id`. 
+Conan has a binary model that takes into account the versions, revisions and ``package_id`` of the dependencies, as well
+as the different package types (``package_type`` attribute).
+
+The recommendation is to keep the package versions aligned with the source code. If ``engine/1.0`` is built from a specific
+commit/tag of its source repository, and the source of that repository doesn't change at all, then it becomes very confusing to
+have a changing package version that deviates from the source version. With the Conan binary model, what we will have is 2
+different binaries for ``engine/1.0``, with 2 different ``package_id``. One binary will be built against the ``ai/1.0`` version
+and the other binary will be built against ``ai/1.1.0``, something like:
+
+.. code-block::
+   :emphasize-lines: 6, 12, 14, 20
+
+    $ conan list engine:* -r=develop
+    engine/1.0
+      revisions
+        fba6659c9dd04a4bbdc7a375f22143cb (2024-08-22 09:46:24 UTC)
+          packages
+            2c5842e5aa3ed21b74ed7d8a0a637eb89068916e
+              info
+                settings
+                  ...
+                requires
+                  ai/1.0.Z
+                  graphics/1.0.Z
+                  mathlib/1.0.Z
+            de738ff5d09f0359b81da17c58256c619814a765
+              info
+                settings
+                  ...
+                requires
+                  ai/1.1.Z
+                  graphics/1.0.Z
+                  mathlib/1.0.Z
+
+
+Let's see how a products pipeline can build such new ``engine/1.0`` and ``game/1.0`` binaries using the new dependency versions.
+In the following sections we will present the products pipeline in an incremental way, the same as the packages pipeline.
+
+
+.. toctree::
+   :maxdepth: 1
+
+   products_pipeline/single_configuration
+   products_pipeline/distributed_build
+   products_pipeline/multi_product
+   products_pipeline/full_pipeline
diff --git a/ci_tutorial/products_pipeline/build_order_simple.png b/ci_tutorial/products_pipeline/build_order_simple.png
new file mode 100644
index 00000000000..22db6879599
Binary files /dev/null and b/ci_tutorial/products_pipeline/build_order_simple.png differ
diff --git a/ci_tutorial/products_pipeline/distributed_build.rst b/ci_tutorial/products_pipeline/distributed_build.rst
new file mode 100644
index 00000000000..f7de64f9959
--- /dev/null
+++ b/ci_tutorial/products_pipeline/distributed_build.rst
@@ -0,0 +1,113 @@
+Products pipeline: distributed build
+====================================
+
+
+The previous section used ``--build=missing`` to build all the necessary packages on the same CI machine.
+This is not always desired, or even possible, and in many situations it is preferable to do a distributed build, to achieve faster builds and better usage of the CI resources. The most natural way to distribute the build load is to build different packages on different machines. Let's see how this is possible with the ``conan graph build-order`` command.
+
+Let's start as usual by making sure we have a clean environment with the right repositories defined:
+
+.. code-block:: bash
+
+    # First clean the local "build" folder
+    $ pwd  # should be /examples2/ci/game
+    $ rm -rf build  # clean the temporary build folder
+    $ mkdir build && cd build  # To put temporary files
+
+    # Now clean packages and define remotes
+    $ conan remove "*" -c  # Make sure no packages from last run
+
+    # If you did this in previous sections, there is NO need to repeat it
+    $ conan remote remove "*"  # Make sure no other remotes defined
+    # Add products repo, you might need to adjust this URL
+    # NOTE: The products repo is added first, it will have higher priority. 
+ $ conan remote add products http://localhost:8081/artifactory/api/conan/products + # Add develop repo, you might need to adjust this URL + $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop + + +We will obviate by now the ``mapviewer/1.0`` product and focus this section in the ``game/1.0`` product. +The first step is to compute the "build-order", that is, the list of packages that need to be built, and in what order. +This is done with the following ``conan graph build-order`` command: + +.. code-block:: bash + + $ conan graph build-order --requires=game/1.0 --build=missing + --order-by=recipe --reduce --format=json > game_build_order.json + +Note a few important points: + +- It is necessary to use the ``--build=missing``, in exactly the same way than in the previous section. Failing to provide the intended ``--build`` policy and argument will result in incomplete or erroneous build-orders. +- The ``--reduce`` argument eliminates all elements in the resulting order that don't have the ``binary: Build`` policy. This means that the resulting "build-order" cannot be merged with other build order files for aggregating them into a single one, which is important when there are multiple configurations and products. +- The ``--order-by`` argument allows to define different orders, by "recipe" or by "configuration". In this case, we are using ``--order-by=recipe`` which is intended to parallelize builds per recipe, that means, that all possible different binaries for a given package like ``engine/1.0`` should be built first before any consumer of ``engine/1.0`` can be built. + +The resulting ``game_build_order.json`` looks like: + +.. 
code-block:: json + :caption: game_build_order.json + + { + "order_by": "recipe", + "reduced": true, + "order": [ + [ + { + "ref": "engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb", + "packages": [ + [ + { + "package_id": "de738ff5d09f0359b81da17c58256c619814a765", + "binary": "Build", + "build_args": "--requires=engine/1.0 --build=engine/1.0" + } + ] + ] + } + ], + [ + { + "ref": "game/1.0#1715574045610faa2705017c71d0000e", + "depends": [ + "engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb" + ], + "packages": [ + [ + { + "package_id": "bac7cd2fe1592075ddc715563984bbe000059d4c", + "binary": "Build", + "build_args": "--requires=game/1.0 --build=game/1.0" + } + ] + ] + } + ] + ] + } + + +For convenience, in the same way that ``conan graph info ... --format=html > graph.html`` can generate a file with an HTML interactive dependency graph, the ``conan graph build-order ... --format=html > build_order.html`` can generate an HTML visual representation of the above json file: + + +.. image:: ./build_order_simple.png + :width: 500 px + :align: center + + +The resulting json contains an ``order`` element which is a list of lists. This arrangement is important: every element in the top list is a set of packages that can be built in parallel, because they do not have any relationship among them. You can view this list as a list of "levels": in level 0 there are packages that have no dependencies on any other package being built, in level 1 there are packages that depend only on elements in level 0, and so on. + +Then, the order of the elements in the outermost list is important and must be respected. Until the build of all the packages in one list item has finished, it is not possible to start the build of the next "level". + +Using the information in the ``game_build_order.json`` file, it is possible to execute the build of the necessary packages, in the same way that the previous section's ``--build=missing`` did, but now directly managed by us.
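To make the iteration concrete, here is a minimal Python sketch (not part of the tutorial scripts) that walks the ``order`` levels of a build-order like the one above and collects the install commands to run. The embedded json is the simplified example shown, assumed valid:

```python
import json

# Simplified build-order, as produced by "conan graph build-order --order-by=recipe"
game_build_order = json.loads("""
{
  "order_by": "recipe",
  "order": [
    [{"ref": "engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb",
      "packages": [[{"package_id": "de738ff5d09f0359b81da17c58256c619814a765",
                     "binary": "Build",
                     "build_args": "--requires=engine/1.0 --build=engine/1.0"}]]}],
    [{"ref": "game/1.0#1715574045610faa2705017c71d0000e",
      "packages": [[{"package_id": "bac7cd2fe1592075ddc715563984bbe000059d4c",
                     "binary": "Build",
                     "build_args": "--requires=game/1.0 --build=game/1.0"}]]}]
  ]
}
""")

commands = []
for level in game_build_order["order"]:       # levels must run one after another
    for recipe in level:                      # recipes in a level can run in parallel
        for packages_level in recipe["packages"]:
            for package in packages_level:    # binaries here can run in parallel too
                commands.append(f"conan install {package['build_args']}")

print(commands)
```

This produces exactly the two install commands shown next, in the order the levels dictate.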
+ +Taking the arguments from the json, the commands to execute would be: + +.. code-block:: bash + + $ conan install --requires=engine/1.0 --build=engine/1.0 + $ conan install --requires=game/1.0 --build=game/1.0 + +We are executing these commands manually here, but in practice it would be a ``for`` loop in CI iterating over the json output. We will see some Python code for this later; at this point we wanted to focus on the ``conan graph build-order`` command, even though we haven't really explained yet how the build is distributed. + +Also note that inside every element there is an inner list of lists, the ``"packages"`` section, with all the binaries that must be built for a specific recipe for the different configurations. + +Let's move now to see how a multi-product, multi-configuration build order can be computed. diff --git a/ci_tutorial/products_pipeline/full_pipeline.rst b/ci_tutorial/products_pipeline/full_pipeline.rst new file mode 100644 index 00000000000..6dcb5a7bde9 --- /dev/null +++ b/ci_tutorial/products_pipeline/full_pipeline.rst @@ -0,0 +1,303 @@ +Products pipeline: distributed full pipeline with lockfiles +=========================================================== + +This section will present the full and complete implementation of a multi-product, multi-configuration +distributed CI pipeline. It will cover important implementation details: + +- Using lockfiles to guarantee a consistent and fixed set of dependencies for all configurations. +- Uploading built packages to the ``products`` repository. +- Capturing "package lists" and using them to run the final promotion. +- Iterating the "build-order" programmatically. + + +Let's start as usual, cleaning the local cache and defining the correct repos: + +.. 
code-block:: bash + + # First clean the local "build" folder + $ pwd # should be /examples2/ci/game + $ rm -rf build # clean the temporary build folder + $ mkdir build && cd build # To put temporary files + + # Now clean packages and define remotes + $ conan remove "*" -c # Make sure no packages from last run + + # If you did this in previous sections, NO need to repeat it + $ conan remote remove "*" # Make sure no other remotes defined + # Add products repo, you might need to adjust this URL + # NOTE: The products repo is added first, it will have higher priority. + $ conan remote add products http://localhost:8081/artifactory/api/conan/products + # Add develop repo, you might need to adjust this URL + $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop + + +Similarly to what we did in the ``packages pipeline``, when we wanted to ensure that the dependencies are exactly the same while building the different configurations and products, the first necessary step is to compute a ``conan.lock`` lockfile that we can pass to the different CI build agents to enforce the same set of dependencies everywhere. This can be done incrementally for the different ``products`` and configurations, aggregating the result into a final single ``conan.lock`` lockfile. This approach assumes that both ``game/1.0`` and ``mapviewer/1.0`` will be using the same versions and revisions of the common dependencies. + +.. code-block:: bash + + $ conan lock create --requires=game/1.0 --lockfile-out=conan.lock + $ conan lock create --requires=game/1.0 -s build_type=Debug + --lockfile=conan.lock --lockfile-out=conan.lock + $ conan lock create --requires=mapviewer/1.0 --lockfile=conan.lock + --lockfile-out=conan.lock + $ conan lock create --requires=mapviewer/1.0 -s build_type=Debug + --lockfile=conan.lock --lockfile-out=conan.lock + + +.. note:: + + Recall that the ``conan.lock`` arguments are mostly optional, as that is the default lockfile name.
+ The first command can be typed as just ``conan lock create --requires=game/1.0``. Also, all commands, including + ``conan install``, will automatically use an existing ``conan.lock`` file if they find one, without needing an + explicit ``--lockfile=conan.lock`` argument. The commands in this tutorial are written in their full, explicit form for + completeness and didactic reasons. + + +Then, we can compute the build order for each product and configuration. These commands are identical to the ones in the +previous section, with the only difference of adding a ``--lockfile=conan.lock`` argument: + + +.. code-block:: bash + + $ conan graph build-order --requires=game/1.0 --lockfile=conan.lock + --build=missing --order-by=recipe --format=json > game_release.json + $ conan graph build-order --requires=game/1.0 --lockfile=conan.lock + --build=missing -s build_type=Debug --order-by=recipe --format=json > game_debug.json + $ conan graph build-order --requires=mapviewer/1.0 --lockfile=conan.lock + --build=missing --order-by=recipe --format=json > mapviewer_release.json + $ conan graph build-order --requires=mapviewer/1.0 --lockfile=conan.lock + --build=missing -s build_type=Debug --order-by=recipe --format=json > mapviewer_debug.json + +Likewise, the ``build-order-merge`` command will be identical to the previous one. +In this case a ``conan.lock`` argument is not necessary, because this command doesn't really compute a dependency graph, +so no dependencies are being resolved: + + +.. code-block:: bash + + $ conan graph build-order-merge + --file=game_release.json --file=game_debug.json + --file=mapviewer_release.json --file=mapviewer_debug.json + --reduce --format=json > build_order.json + + + +So far, this process has been almost identical to the one in the previous section, just with the difference of capturing and using a lockfile. +Now, we will explain the "core" of the ``products`` pipeline: iterating the build-order, distributing the build, and gathering the +resulting built packages.
+ +This would be an example of some Python code that performs the iteration sequentially (a real CI system would distribute the builds to different agents in parallel): + + +.. code-block:: python + + import json + + # "run" is assumed to be a small helper (as in the tutorial scripts) that + # executes a command, redirecting its stdout to the "file_stdout" file + + with open("build_order.json", "r") as f: + build_order = json.load(f) + to_build = build_order["order"] + + pkg_lists = [] # to aggregate the uploaded package-lists + for level in to_build: + for recipe in level: # This could be executed in parallel + ref = recipe["ref"] + # For every ref, multiple binary packages are built. + # This can be done in parallel too. Often they are for different platforms, + # so they will need to be distributed to different build agents + for packages_level in recipe["packages"]: + # This could be executed in parallel too + for package in packages_level: + build_args = package["build_args"] + filenames = package["filenames"] + build_type = "-s build_type=Debug" if any("debug" in f for f in filenames) else "" + run(f"conan install {build_args} {build_type} --lockfile=conan.lock --format=json", file_stdout="graph.json") + run("conan list --graph=graph.json --format=json", file_stdout="built.json") + filename = f"uploaded{len(pkg_lists) + 1}.json" + run("conan upload -l=built.json -r=products -c --format=json", file_stdout=filename) + pkg_lists.append(filename) + + +.. note:: + + - This code is specific to the ``--order-by=recipe`` build-order. If choosing ``--order-by=configuration``, the json + is different and it would require a different iteration. + + +These are the tasks that the above Python code performs: + +- For every ``package`` in the build-order, a ``conan install --requires= --build=`` is issued, and the result of this command is stored in a ``graph.json`` file. +- The ``conan list`` command transforms this ``graph.json`` into a package list called ``built.json``. Note that this package list actually stores both the built packages and the necessary transitive dependencies.
This is done for simplicity, as these package lists will later be used to run a promotion, and we also want to promote dependencies such as ``ai/1.1.0`` that were built in the ``packages pipeline`` and not by this job. +- The ``conan upload`` command uploads the package list to the ``products`` repo. Note that the ``upload`` first checks which packages already exist in the repo, avoiding costly transfers when they already exist. +- The result of the ``conan upload`` command is captured in a new package list file (``uploaded1.json``, ``uploaded2.json``, etc.), which we will accumulate later and which will serve for the final promotion. + + +In practice this translates to the following commands (that you can execute to continue the tutorial): + +.. code-block:: bash + + # engine/1.0 release + $ conan install --requires=engine/1.0 --build=engine/1.0 --lockfile=conan.lock + --format=json > graph.json + $ conan list --graph=graph.json --format=json > built.json + $ conan upload -l=built.json -r=products -c --format=json > uploaded1.json + + # engine/1.0 debug + $ conan install --requires=engine/1.0 --build=engine/1.0 --lockfile=conan.lock + -s build_type=Debug --format=json > graph.json + $ conan list --graph=graph.json --format=json > built.json + $ conan upload -l=built.json -r=products -c --format=json > uploaded2.json + + # game/1.0 release + $ conan install --requires=game/1.0 --build=game/1.0 --lockfile=conan.lock + --format=json > graph.json + $ conan list --graph=graph.json --format=json > built.json + $ conan upload -l=built.json -r=products -c --format=json > uploaded3.json + + # game/1.0 debug + $ conan install --requires=game/1.0 --build=game/1.0 --lockfile=conan.lock + -s build_type=Debug --format=json > graph.json + $ conan list --graph=graph.json --format=json > built.json + $ conan upload -l=built.json -r=products -c --format=json > uploaded4.json + + +After this step the newly built packages will be in the ``products`` repo and we will have 4 ``uploaded1.json`` - 
``uploaded4.json`` files. + +Leaving aside the different release and debug configurations for simplicity, the state of our repositories would be something like: + + +.. graphviz:: + :align: center + + digraph repositories { + node [fillcolor="lightskyblue", style=filled, shape=box] + rankdir="LR"; + subgraph cluster_0 { + label="Packages server"; + style=filled; + color=lightgrey; + subgraph cluster_1 { + label = "packages\n repository" + shape = "box"; + style=filled; + color=lightblue; + "packages" [style=invis]; + "ai/1.1.0\n (Release)"; + "ai/1.1.0\n (Debug)"; + } + subgraph cluster_2 { + label = "products\n repository" + shape = "box"; + style=filled; + color=lightblue; + "products" [style=invis]; + "ai/promoted" [label="ai/1.1.0\n(new version)"]; + "engine/promoted" [label="engine/1.0\n(new binary)"]; + "game/promoted" [label="game/1.0\n(new binary)", fillcolor="lightgreen"]; + + + node [fillcolor="lightskyblue", style=filled, shape=box] + "game/promoted" -> "engine/promoted" -> "ai/promoted"; + } + subgraph cluster_3 { + rankdir="BT"; + shape = "box"; + label = "develop repository"; + color=lightblue; + rankdir="BT"; + + node [fillcolor="lightskyblue", style=filled, shape=box] + "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0"; + "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0"; + "mapviewer/1.0" -> "graphics/1.0"; + "game/1.0" [fillcolor="lightgreen"]; + "mapviewer/1.0" [fillcolor="lightgreen"]; + } + { + edge[style=invis]; + "packages" -> "products" -> "game/1.0" ; + rankdir="BT"; + } + } + } + + +We can now accumulate the different ``uploadedX.json`` files into a single package list ``uploaded.json`` that contains everything: + +.. 
code-block:: bash + + $ conan pkglist merge -l uploaded1.json -l uploaded2.json + -l uploaded3.json -l uploaded4.json + --format=json > uploaded.json + + +And finally, if everything worked well, and we consider this new set of versions and new package binaries ready to be used by developers and other CI jobs, we can run the final promotion from the ``products`` to the ``develop`` repository: + +.. code-block:: bash + :caption: Promoting from products->develop + + # Promotion using Conan download/upload commands + # (slow, can be improved with art:promote custom command) + $ conan download --list=uploaded.json -r=products --format=json > promote.json + $ conan upload --list=promote.json -r=develop -c + + +And our final ``develop`` repository state will be: + + +.. graphviz:: + :align: center + + digraph repositories { + node [fillcolor="lightskyblue", style=filled, shape=box] + rankdir="LR"; + subgraph cluster_0 { + label="Packages server"; + style=filled; + color=lightgrey; + subgraph cluster_1 { + label = "packages\n repository" + shape = "box"; + style=filled; + color=lightblue; + "packages" [style=invis]; + "ai/1.1.0\n (Release)"; + "ai/1.1.0\n (Debug)"; + } + subgraph cluster_2 { + label = "products\n repository" + shape = "box"; + style=filled; + color=lightblue; + "products" [style=invis]; + } + subgraph cluster_3 { + rankdir="BT"; + shape = "box"; + label = "develop repository"; + color=lightblue; + rankdir="BT"; + + node [fillcolor="lightskyblue", style=filled, shape=box] + "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0"; + "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0"; + "mapviewer/1.0" -> "graphics/1.0"; + "game/1.0" [fillcolor="lightgreen"]; + "mapviewer/1.0" [fillcolor="lightgreen"]; + "ai/promoted" [label="ai/1.1.0\n(new version)"]; + "engine/promoted" [label="engine/1.0\n(new binary)"]; + "game/promoted" [label="game/1.0\n(new binary)", fillcolor="lightgreen"]; + "game/promoted" -> "engine/promoted" -> "ai/promoted" -> "mathlib/1.0"; 
+ "engine/promoted" -> "graphics/1.0"; + } + { + edge[style=invis]; + "packages" -> "products" -> "game/1.0" ; + rankdir="BT"; + } + } + } + + +This state of the ``develop`` repository will have the following behavior: + +- Developers installing ``game/1.0`` or ``engine/1.0`` will by default resolve to the latest ``ai/1.1.0`` and use it. They will also find pre-compiled binaries for the dependencies, and they can continue developing using the latest set of dependencies. +- Developers and CI jobs that were using a lockfile locking the ``ai/1.0`` version will still be able to keep working with that dependency without anything breaking, as the new versions and package binaries do not break or invalidate the previously existing binaries. + diff --git a/ci_tutorial/products_pipeline/multi_product.rst b/ci_tutorial/products_pipeline/multi_product.rst new file mode 100644 index 00000000000..54f82aa85d5 --- /dev/null +++ b/ci_tutorial/products_pipeline/multi_product.rst @@ -0,0 +1,133 @@ +Products pipeline: multi-product multi-configuration builds +=========================================================== + +In the previous section we computed a ``conan graph build-order`` with several simplifications: we didn't take the ``mapviewer`` product into account, and we processed only one configuration. + +In real scenarios, it is necessary to manage more than one product, and the most common case is that there is more than one configuration for every product. If we build these different cases sequentially it will be much slower and inefficient, and if we try to build them in parallel there will easily be many duplicated and unnecessary builds of the same packages, wasting resources and even producing issues such as race conditions or traceability problems. + +To avoid this, it is possible to compute a single unified "build-order" that aggregates all the different build-orders computed for the different products and configurations.
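Conceptually, the aggregation merges, level by level, the per-recipe entries of each individual build-order, tagging every binary with the file it came from. The real logic lives in the ``conan graph build-order-merge`` command; the following Python sketch, using made-up, heavily simplified data, only illustrates the idea:

```python
# Illustrative sketch only: the real merge is done by "conan graph build-order-merge".
# Each order is a list of "levels"; each level is a list of {"ref", "packages"} dicts.

def merge_orders(orders):
    """orders: mapping of build-order filename -> list of levels."""
    merged_levels = []
    for filename, levels in orders.items():
        for depth, level in enumerate(levels):
            while len(merged_levels) <= depth:
                merged_levels.append({})          # ref -> list of package dicts
            for recipe in level:
                packages = merged_levels[depth].setdefault(recipe["ref"], [])
                for pkg in recipe["packages"]:
                    # Tag each binary with the originating build-order file,
                    # like the "filenames" field in the real merged output
                    packages.append({**pkg, "filenames": [filename]})
    return [[{"ref": ref, "packages": pkgs} for ref, pkgs in level.items()]
            for level in merged_levels]

release = [[{"ref": "engine/1.0", "packages": [{"package_id": "de738ff5"}]}]]
debug = [[{"ref": "engine/1.0", "packages": [{"package_id": "cbeb3ac7"}]}]]
merged = merge_orders({"game_release": release, "game_debug": debug})
print(merged)
```

Note how the two binaries of ``engine/1.0`` end up grouped under a single recipe entry in the same level, which is exactly the shape of the merged json shown later in this section.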
+ +Let's start as usual, cleaning the local cache and defining the correct repos: + +.. code-block:: bash + + # First clean the local "build" folder + $ pwd # should be /examples2/ci/game + $ rm -rf build # clean the temporary build folder + $ mkdir build && cd build # To put temporary files + + # Now clean packages and define remotes + $ conan remove "*" -c # Make sure no packages from last run + + # If you did this in previous sections, NO need to repeat it + $ conan remote remove "*" # Make sure no other remotes defined + # Add products repo, you might need to adjust this URL + # NOTE: The products repo is added first, it will have higher priority. + $ conan remote add products http://localhost:8081/artifactory/api/conan/products + # Add develop repo, you might need to adjust this URL + $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop + + +Now, we will start computing the build-order for ``game/1.0`` for the 2 different configurations that we are building in this tutorial, debug and release: + +.. code-block:: bash + + $ conan graph build-order --requires=game/1.0 --build=missing + --order-by=recipe --format=json > game_release.json + $ conan graph build-order --requires=game/1.0 --build=missing + --order-by=recipe -s build_type=Debug --format=json > game_debug.json + +These commands are basically the same as in the previous section, each one with a different configuration, creating the different output files ``game_release.json`` and ``game_debug.json``. These files will be similar to the previous ones, but as we haven't used the ``--reduce`` argument (this is important!), they will actually contain a "build-order" of all elements in the graph, even if only some of them contain the ``binary: Build`` definition, while others contain other values such as ``binary: Download|Cache|etc``. + +Now, let's compute the build-order for ``mapviewer/1.0``: + +.. 
code-block:: bash + + $ conan graph build-order --requires=mapviewer/1.0 --build=missing + --order-by=recipe --format=json > mapviewer_release.json + $ conan graph build-order --requires=mapviewer/1.0 --build=missing + --order-by=recipe -s build_type=Debug --format=json > mapviewer_debug.json + + +Note that in the generated ``mapviewer_xxx.json`` build-order files there will be only 1 element for ``mapviewer/1.0``, containing a ``binary: Download``, because there is really no other package to be built, and as ``mapviewer`` is an application linked statically, Conan knows that it can "skip" its dependencies' binaries. If we had used the ``--reduce`` argument we would have obtained an empty ``order``. But this is not an issue, as the final step will really compute what needs to be built. + +Let's take the 4 different "build-order" files (2 products x 2 configurations) and merge them together: + +.. code-block:: bash + + $ conan graph build-order-merge + --file=game_release.json --file=game_debug.json + --file=mapviewer_release.json --file=mapviewer_debug.json + --reduce --format=json > build_order.json + + +Now we have applied the ``--reduce`` argument to produce a final ``build_order.json`` that is ready for distribution to the build agents, and it contains only the specific packages that need to be built: + +.. 
code-block:: json + + { + "order_by": "recipe", + "reduced": true, + "order": [ + [ + { + "ref": "engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb", + "packages": [ + [ + { + "package_id": "de738ff5d09f0359b81da17c58256c619814a765", + "filenames": ["game_release"], + "build_args": "--requires=engine/1.0 --build=engine/1.0" + }, + { + "package_id": "cbeb3ac76e3d890c630dae5c068bc178e538b090", + "filenames": ["game_debug"], + "build_args": "--requires=engine/1.0 --build=engine/1.0" + } + ] + ] + } + ], + [ + { + "ref": "game/1.0#1715574045610faa2705017c71d0000e", + "packages": [ + [ + { + "package_id": "bac7cd2fe1592075ddc715563984bbe000059d4c", + "filenames": ["game_release"], + "build_args": "--requires=game/1.0 --build=game/1.0" + }, + { + "package_id": "01fbc27d2c156886244dafd0804eef1fff13440b", + "filenames": ["game_debug"], + "build_args": "--requires=game/1.0 --build=game/1.0" + } + ] + ] + } + ] + ] + } + + +This build order summarizes the necessary builds. First, it is necessary to build all the different binaries for ``engine/1.0``. This recipe contains 2 different binaries, one for Release and the other for Debug. These binaries belong to the same element in the ``packages`` list, which means they do not depend on each other and can be built in parallel. Each binary tracks its original build-order file with ``"filenames": ["game_release"]``, so it is possible to deduce the necessary profiles to apply to it. + +Then, after all binaries of ``engine/1.0`` have been built, it is possible to proceed to build the different binaries for ``game/1.0``. It also contains 2 different binaries for its debug and release configurations, which can be built in parallel. + +In practice, this would mean something like: + +.. 
code-block:: bash + + # These 2 could be executed in parallel + # (in different machines, or different Conan caches) + $ conan install --requires=engine/1.0 --build=engine/1.0 + $ conan install --requires=engine/1.0 --build=engine/1.0 -s build_type=Debug + + # Once the engine/1.0 builds finish, it is possible + # to build these 2 binaries in parallel (in different machines or caches) + $ conan install --requires=game/1.0 --build=game/1.0 + $ conan install --requires=game/1.0 --build=game/1.0 -s build_type=Debug + +In this section we have still omitted some important implementation details that will follow in the next sections. The goal was to focus on the ``conan graph build-order-merge`` command and how different products and configurations can be merged into a single "build-order". diff --git a/ci_tutorial/products_pipeline/single_configuration.rst b/ci_tutorial/products_pipeline/single_configuration.rst new file mode 100644 index 00000000000..cfd427f46f2 --- /dev/null +++ b/ci_tutorial/products_pipeline/single_configuration.rst @@ -0,0 +1,151 @@ +Products pipeline: single configuration +======================================= + +In this section we will implement a very basic products pipeline, without distributing the build, without using lockfiles, and without building multiple configurations. + +The main idea is to illustrate the need to rebuild some packages because there is a new ``ai/1.1.0`` version that can be integrated by our main products. This new ``ai`` version is in the ``products`` repository, as it was already successfully built by the "packages pipeline". +Let's start by making sure we have a clean environment with the right repositories defined: + +.. 
code-block:: bash + + # First clean the local "build" folder + $ pwd # should be /examples2/ci/game + $ rm -rf build # clean the temporary build folder + $ mkdir build && cd build # To put temporary files + + # Now clean packages and define remotes + $ conan remove "*" -c # Make sure no packages from last run + $ conan remote remove "*" # Make sure no other remotes defined + # Add products repo, you might need to adjust this URL + # NOTE: The products repo is added first, it will have higher priority. + $ conan remote add products http://localhost:8081/artifactory/api/conan/products + # Add develop repo, you might need to adjust this URL + $ conan remote add develop http://localhost:8081/artifactory/api/conan/develop + + +Note that the ``products`` repo is added first, so it will have higher priority than the ``develop`` repo. This means Conan will resolve in the ``products`` repo first; if it finds a valid version for the defined version ranges, it will stop there and return that version, without +checking the ``develop`` repo (checking all repositories can be done with ``--update``, but that would be slower, and with the right repository ordering it is not necessary). + +As we have already defined, our main products are ``game/1.0`` and ``mapviewer/1.0``. Let's start by trying to install and use ``mapviewer/1.0``: + + +.. code-block:: bash + + $ conan install --requires=mapviewer/1.0 + ... + Requirements + graphics/1.0#24b395ba17da96288766cc83accc98f5 - Downloaded (develop) + mapviewer/1.0#c4660fde083a1d581ac554e8a026d4ea - Downloaded (develop) + mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea - Downloaded (develop) + ... + Install finished successfully + + # Activate the environment and run the executable + # Use "conanrun.bat && mapviewer" in Windows + $ source conanrun.sh && mapviewer + ... + graphics/1.0: Checking if things collide (Release)! + mapviewer/1.0:serving the game (Release)!
+ +As we can see, ``mapviewer/1.0`` doesn't really depend on the ``ai`` package at all, in any version. +So when we install it, we already have a pre-compiled binary for it and everything works. + +But if we now try the same with ``game/1.0``: + +.. code-block:: bash + + $ conan install --requires=game/1.0 + ... + Requirements + ai/1.1.0#01a885b003190704f7617f8c13baa630 - Downloaded (products) + engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb - Downloaded (develop) + game/1.0#1715574045610faa2705017c71d0000e - Downloaded (develop) + graphics/1.0#24b395ba17da96288766cc83accc98f5 - Cache + mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea - Cache + ERROR: Missing binary: game/1.0:bac7cd2fe1592075ddc715563984bbe000059d4c + + game/1.0: WARN: Cant find a game/1.0 package binary bac7cd2fe1592075ddc715563984bbe000059d4c for the configuration: + ... + [requires] + ai/1.1.0#01a885b003190704f7617f8c13baa630 + +It will fail, because it will get ``ai/1.1.0`` from the ``products`` repo, and there will be no pre-compiled binary for ``game/1.0`` against this new version of ``ai``. This is expected: ``ai`` is a static library, so we need to re-build ``game/1.0`` against the new version. Let's do it using the ``--build=missing`` argument: + +.. code-block:: bash + + $ conan install --requires=game/1.0 --build=missing + ... + ======== Computing necessary packages ======== + Requirements + ai/1.1.0:8b108997a4947ec6a0487a0b6bcbc0d1072e95f3 - Download (products) + engine/1.0:de738ff5d09f0359b81da17c58256c619814a765 - Build + game/1.0:bac7cd2fe1592075ddc715563984bbe000059d4c - Build + graphics/1.0:8b108997a4947ec6a0487a0b6bcbc0d1072e95f3 - Download (develop) + mathlib/1.0:4d8ab52ebb49f51e63d5193ed580b5a7672e23d5 - Download (develop) + + -------- Installing package engine/1.0 (4 of 5) -------- + engine/1.0: Building from source + ... + engine/1.0: Package de738ff5d09f0359b81da17c58256c619814a765 created + -------- Installing package game/1.0 (5 of 5) -------- + game/1.0: Building from source + ... 
+ game/1.0: Package bac7cd2fe1592075ddc715563984bbe000059d4c created + Install finished successfully + +Note that ``--build=missing`` knows that ``engine/1.0`` also needs a new binary as a result of its dependency on the new ``ai/1.1.0`` version. Then, Conan proceeds to build the packages in the right order: first ``engine/1.0`` is built, because ``game/1.0`` depends on it. After the build we can list the newly built binaries and see how they depend on the new versions: + +.. code-block:: bash + + $ conan list engine:* + Local Cache + engine + engine/1.0 + revisions + fba6659c9dd04a4bbdc7a375f22143cb (2024-09-30 12:19:54 UTC) + packages + de738ff5d09f0359b81da17c58256c619814a765 + info + ... + requires + ai/1.1.Z + graphics/1.0.Z + mathlib/1.0.Z + + $ conan list game:* + Local Cache + game + game/1.0 + revisions + 1715574045610faa2705017c71d0000e (2024-09-30 12:19:55 UTC) + packages + bac7cd2fe1592075ddc715563984bbe000059d4c + info + ... + requires + ai/1.1.0#01a885b003190704f7617f8c13baa630:8b108997a4947ec6a0487a0b6bcbc0d1072e95f3 + engine/1.0#fba6659c9dd04a4bbdc7a375f22143cb:de738ff5d09f0359b81da17c58256c619814a765 + graphics/1.0#24b395ba17da96288766cc83accc98f5:8b108997a4947ec6a0487a0b6bcbc0d1072e95f3 + mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea:4d8ab52ebb49f51e63d5193ed580b5a7672e23d5 + +The new ``engine/1.0:de738ff5d09f0359b81da17c58256c619814a765`` binary depends on ``ai/1.1.Z``, because, as ``ai`` is a static library, it only requires re-builds for changes in the minor version, but not for patches. The new ``game/1.0`` binary instead depends on the full exact ``ai/1.1.0#revision:package_id``, and also on the new ``engine/1.0:de738ff5d09f0359b81da17c58256c619814a765`` binary that depends on ``ai/1.1.Z``. + +Now the game can be executed: + +.. code-block:: bash + + # Activate the environment and run the executable + # Use "conanrun.bat && game" in Windows + $ source conanrun.sh && game + mathlib/1.0: mathlib maths (Release)!
+ ai/1.1.0: SUPER BETTER Artificial Intelligence for aliens (Release)! + ai/1.1.0: Intelligence level=50 + graphics/1.0: Checking if things collide (Release)! + engine/1.0: Computing some game things (Release)! + game/1.0:fun game (Release)! + +We can see that the new ``game/1.0`` binary incorporates the improvements in ``ai/1.1.0`` and links correctly with the new binary for ``engine/1.0``. + +And this is a basic "products pipeline": we managed to build and test our main products when necessary (recall that ``mapviewer`` wasn't really affected, so no rebuilds were necessary at all). +In general, a production "products pipeline" will finish by uploading the built packages to the repository and running a new promotion to the ``develop`` repo. But as this was a very basic and simple pipeline, let's postpone that and continue with more advanced scenarios. diff --git a/ci_tutorial/project_setup.rst b/ci_tutorial/project_setup.rst new file mode 100644 index 00000000000..256f89b0175 --- /dev/null +++ b/ci_tutorial/project_setup.rst @@ -0,0 +1,103 @@ +Project setup +============= + +The code necessary for this tutorial is found in the ``examples2`` repo. Clone it and +move to the folder: + + +.. code-block:: bash + + $ git clone https://github.com/conan-io/examples2.git + $ cd examples2/ci/game + + +Server repositories setup +------------------------- + +We need 3 different repositories in the same server. Make sure you have an Artifactory instance running and available. You can download the free :ref:`Artifactory CE` from the `downloads page `_ and run it on your own computer, or you can use docker: + + +.. code-block:: bash + + $ docker run --name artifactory -d -p 8081:8081 -p 8082:8082 releases-docker.jfrog.io/jfrog/artifactory-cpp-ce:7.63.12 + # Can be stopped with "docker stop artifactory" + +When you launch it, you can go to http://localhost:8081/ to check it (user: "admin", password: "password").
+ If you have another Artifactory available, it can be used too, as long as you can create new repositories there. + + +✍️ As a first step, log into the web UI and **create 3 different local repositories** called ``develop``, ``packages`` and ``products``. + +✍️ Then, according to the ``project_setup.py`` file, these are the environment variables needed to configure the server. Define ``ARTIFACTORY_URL``, ``ARTIFACTORY_USER`` and/or ``ARTIFACTORY_PASSWORD`` if necessary to adapt them to your setup: + +.. code-block:: python + + # TODO: This must be configured by users + SERVER_URL = os.environ.get("ARTIFACTORY_URL", "http://localhost:8081/artifactory/api/conan") + USER = os.environ.get("ARTIFACTORY_USER", "admin") + PASSWORD = os.environ.get("ARTIFACTORY_PASSWORD", "password") + + +Initial dependency graph +------------------------ + +.. warning:: + + - The initialization of the project will remove the contents of the 3 ``develop``, ``products`` and ``packages`` repositories in the server. + - The ``examples2/ci/game`` folder contains a ``.conanrc`` file that defines a local cache, so commands executed in this tutorial do not pollute or alter your main Conan cache. + + +.. code-block:: bash + + $ python project_setup.py + +This will do several tasks: clean the server repos, create initial ``Debug`` and ``Release`` binaries for the dependency graph, upload them to the ``develop`` repo, and then clean the local cache. Note that in this example we are using ``Debug`` and ``Release`` as our different configurations for convenience, but in real cases these would be different configurations such as Windows/x86_64, Linux/x86_64, Linux/armv8, etc., running +on different computers. + +This dependency graph of packages in the ``develop`` repo is the starting point for our tutorial, assumed to be a functional and stable "develop" state of the project that developers can ``conan install`` to work on any of the different packages. + +.. 
graphviz:: + :align: center + + digraph repositories { + node [fillcolor="lightskyblue", style=filled, shape=box] + rankdir="LR"; + subgraph cluster_0 { + label="Packages server"; + style=filled; + color=lightgrey; + subgraph cluster_1 { + label = "packages\n repository" + shape = "box"; + style=filled; + color=lightblue; + "packages" [style=invis]; + } + subgraph cluster_2 { + label = "products\n repository" + shape = "box"; + style=filled; + color=lightblue; + "products" [style=invis]; + } + subgraph cluster_3 { + rankdir="BT"; + shape = "box"; + label = "develop repository"; + color=lightblue; + + node [fillcolor="lightskyblue", style=filled, shape=box] + "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0"; + "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0"; + "mapviewer/1.0" -> "graphics/1.0"; + "game/1.0" [fillcolor="lightgreen"]; + "mapviewer/1.0" [fillcolor="lightgreen"]; + } + { + edge[style=invis]; + "packages" -> "products" -> "game/1.0" ; + rankdir="BT"; + } + } + } \ No newline at end of file diff --git a/ci_tutorial/tutorial.rst b/ci_tutorial/tutorial.rst new file mode 100644 index 00000000000..5bc3a7d9d6d --- /dev/null +++ b/ci_tutorial/tutorial.rst @@ -0,0 +1,127 @@ +.. _ci_tutorial: + +Continuous Integration (CI) tutorial +==================================== + +.. note:: + + - This is an advanced topic; previous knowledge of Conan is necessary. Please :ref:`read and practice the user tutorial` first. + - This section is intended for devops and build engineers designing and implementing a CI pipeline involving Conan packages; if that is not your + case, you can skip this section. + + +Continuous Integration has different meanings for different users and organizations. 
In this tutorial we will cover the scenario in which users +make changes to the source code of their packages and want to automatically build new binaries for those packages, and also determine whether those changes integrate cleanly or break the organization's main products. + +In this tutorial we will use a small project that uses several packages (static libraries by default) to build a couple of applications: a video game and a map viewer utility. The ``game`` and ``mapviewer`` are our final "**products**", what we distribute to our users: + +.. graphviz:: + :align: center + + digraph game { + node [fillcolor="lightskyblue", style=filled, shape=box] + rankdir="BT" + "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0"; + "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0"; + "mapviewer/1.0" -> "graphics/1.0"; + "game/1.0" [fillcolor="lightgreen"]; + "mapviewer/1.0" [fillcolor="lightgreen"]; + { + rank = same; + edge[ style=invis]; + "game/1.0" -> "mapviewer/1.0" ; + rankdir = LR; + } + } + + +All of the packages in the dependency graph have a ``requires`` to their direct dependencies using version ranges; for example, ``game`` contains a ``requires("engine/[>=1.0 <2]")``, so new patch and minor versions of the dependencies will automatically be used without needing to modify the recipes. + +.. note:: + + **Important notes** + + - This section is written as a hands-on tutorial. It is intended to be reproduced by running the commands on your machine. + - The tutorial presents some of the tools, good practices and common approaches to the CI problem. But there are no silver bullets. + This tutorial does not present the only way things can be done. + Different organizations might have different needs and priorities, different build service capacity and budget, different sizes, etc. + The principles and practices presented in the tutorial might need to be adapted. + - However, some of the principles and best practices are general to all approaches. 
Things like package immutability, using promotions + between repositories, and not using the ``channel`` for that purpose are good practices that should be followed. + + +Packages and products pipelines +------------------------------- + +When a developer makes some changes to a package's source code, we will consider 2 different parts or pipelines of the overall CI system: +the **packages pipeline** and the **products pipeline**. + +- The **packages pipeline** takes care of building one single package when its code is changed. If necessary, it will build it for different configurations. +- The **products pipeline** takes care of building the organization's main "products" (the packages that implement the final applications or deliverables), + and making sure that changes and new versions in dependencies integrate correctly, rebuilding any intermediate packages in the graph if necessary. + +The idea is that if some developer makes changes to the ``ai`` package, producing a new ``ai/1.1.0`` version, the packages pipeline will first build this +new version. But this new version might accidentally break or require rebuilding some consumer packages. If our organization's main **products** are +``game/1.0`` and ``mapviewer/1.0``, then the products pipeline can be triggered; in this case it would rebuild ``engine/1.0`` and ``game/1.0``, as +they are affected by the change. + + +Repositories and promotions +--------------------------- + +The concept of multiple server-side repositories is very important for CI. In this tutorial we will use 3 repositories: + +- ``develop``: This repository is the main one that developers have configured on their machines to be able to ``conan install`` dependencies + and work. 
As such, it is expected to be quite stable, similar to a shared "develop" branch in git, and the repository should contain pre-compiled + binaries for the organization's pre-defined platforms, so developers and CI don't need to use ``--build=missing`` and build again and again from + source. +- ``packages``: This repository will be used to temporarily upload the packages built by the "packages pipeline", so they are not uploaded directly to + the ``develop`` repo, avoiding disruption until these packages are fully validated. +- ``products``: This repository will be used to temporarily upload the packages built by the "products pipeline", while building and testing that + new dependency changes do not break the main "products". + +.. graphviz:: + :align: center + + digraph repositories { + node [fillcolor="lightskyblue", style=filled, shape=box] + rankdir="LR"; + subgraph cluster_0 { + style=filled; + color=lightgrey; + rankdir="LR"; + label = "Packages server"; + "packages\n repository" -> "products\n repository" -> "develop\n repository" [ label="promotion" ]; + } + + } + +Promotions are the mechanism used to make packages available from one pipeline to the other. Connecting the above packages and products pipelines +with the repositories, there will be 2 promotions: + +- When all the different binaries for the different configurations have been built for a single package with the ``packages pipeline``, and uploaded + to the ``packages`` repository, the new version and changes to the package can be considered "correct" and promoted (copied) to the ``products`` + repository. 
+- When the ``products pipeline`` has built from source all the packages that need a rebuild because of the new package versions in + the ``products`` repository and has checked that the organization "products" (such as ``game/1.0`` and ``mapviewer/1.0``) are not broken, then + the packages can be promoted (copied) from the ``products`` repo to the ``develop`` repo, to make them available for all other developers and CI. + + +.. note:: + + - The concept of **immutability** is important in package management and devops. Modifying ``channel`` is strongly discouraged; see :ref:`Package promotions`. + - The versioning approach is important. This tutorial will be following :ref:`the default Conan versioning approach, see details here`. + +This tutorial models just the **development** flow. In production systems, there will be other repositories +and promotions, like a ``testing`` repository for the QA team, and a final ``release`` repository for final users, such that packages can +be promoted from ``develop`` to ``testing`` to ``release`` as they pass validation. Read more about promotions in :ref:`Package promotions`. + + +Let's start with the tutorial; move on to the next section to do the project setup: + +.. toctree:: + :maxdepth: 2 + + project_setup + packages_pipeline + products_pipeline diff --git a/devops.rst b/devops.rst deleted file mode 100644 index 08c2f353530..00000000000 --- a/devops.rst +++ /dev/null @@ -1,21 +0,0 @@ -.. _devops: - - -Devops guide -============ - -The previous tutorial section was aimed at users in general and developers. - -This section is intended for DevOps users, build and CI engineers, administrators, and architects adopting, designing and implementing Conan in production in their teams and organizations. -If you plan to use Conan in production in your project, team, or organization, this section contains the necessary information. - -.. 
toctree:: - :maxdepth: 1 - - devops/using_conancenter - devops/devops_local_recipes_index - devops/backup_sources/sources_backup - devops/metadata - devops/versioning - devops/save_restore - devops/vendoring diff --git a/devops/devops.rst b/devops/devops.rst new file mode 100644 index 00000000000..1f1ffb79407 --- /dev/null +++ b/devops/devops.rst @@ -0,0 +1,24 @@ +.. _devops: + + +Devops guide +============ + +The previous :ref:`tutorial` section was aimed at users in general and developers. + +The :ref:`Continuous Integration tutorial` explained the basics of how to implement Continuous Integration involving Conan packages. + +This section is intended for DevOps users, build and CI engineers, administrators, and architects adopting, designing and implementing Conan in production in their teams and organizations. +If you plan to use Conan in production in your project, team, or organization, this section contains the necessary information. + +.. toctree:: + :maxdepth: 1 + + using_conancenter + devops_local_recipes_index + backup_sources/sources_backup + metadata + versioning/versioning + save_restore + vendoring + package_promotions diff --git a/devops/package_promotions.rst b/devops/package_promotions.rst new file mode 100644 index 00000000000..400a59b5c06 --- /dev/null +++ b/devops/package_promotions.rst @@ -0,0 +1,162 @@ +.. _devops_package_promotions: + +Package promotions +================== + +Package promotions are the recommended devops practice for handling quality, maturity or stages of packages +in many different technologies, and of course, also for Conan packages. + +The principle of package promotions is that there are multiple server package repositories defined and +packages are uploaded and copied among repositories depending on the stage. For +example, we could have two different server package repositories called "testing" and "release": + +.. 
graphviz:: + :align: center + + digraph repositories { + node [fillcolor="lightblue", style=filled, shape=box] + rankdir="LR"; + subgraph cluster_0 { + style=filled; + color=lightgrey; + rankdir="LR"; + label = "Packages server"; + "testing\n repository" -> "release\n repository" [ label="promotion" ]; + } + } + + +.. note:: + + **Best practices** + + - Using different ``user/channel`` to try to denote maturity is strongly discouraged. It was described in the early + days of Conan 1, before it was possible to have multiple repositories, but it shouldn't be used anymore. + - Packages should be completely immutable across pipelines and stages; a package cannot be renamed or change its ``user/channel``, + and re-building it from source to give it a new ``user/channel`` is also a strongly discouraged devops practice. + + +Between those repositories there will be some quality gates. In our case, some packages will be +put in the "testing" repository, for the QA team to test them, for example ``zlib/1.3.1`` and ``openssl/3.2.2``: + +.. graphviz:: + :align: center + + digraph repositories { + node [fillcolor="lightskyblue", style=filled, shape=box] + rankdir="LR"; + subgraph cluster_0 { + label="Packages server"; + style=filled; + color=lightgrey; + subgraph cluster_1 { + label = "testing\n repository" + shape = "box"; + style=filled; + color=lightblue; + "zlib/1.3.1"; + "openssl/3.2.2"; + } + + subgraph cluster_2 { + label = "release\n repository" + shape = "box"; + style=filled; + color=lightblue; + "release" [style=invis]; + } + { + edge[style=invis]; + "zlib/1.3.1" -> "release" ; + rankdir="BT"; + } + } + } + + +When the QA team tests and approves these packages, they can be promoted to the "release" repository. +Basically, a promotion is a copy of the packages, including all the artifacts and metadata, from the +"testing" to the "release" repository. + + +There are different ways to implement and execute a package promotion. 
Artifactory has some APIs that can be +used to move individual files or folders. The `Conan extensions repository `_ +contains the ``conan art:promote`` command that can be used to promote Conan "package lists" from one +server repository to another. + +If we have a package list ``pkglist.json`` that contains the above ``zlib/1.3.1`` and ``openssl/3.2.2`` packages, then +the command would look like: + +.. code-block:: bash + :caption: Promoting from testing->release + + $ conan art:promote pkglist.json --from=testing --to=release --url=https:///artifactory --user= --password= + + +Note that the ``conan art:promote`` command doesn't work with Artifactory CE; Pro editions of Artifactory are needed. +The promote functionality can be implemented in these cases with a simple download+upload flow: + +.. code-block:: bash + :caption: Promoting from testing->release + + # Promotion using Conan download/upload commands + # (slow, can be improved with art:promote custom command) + $ conan download --list=pkglist.json -r=testing --format=json > downloaded.json + $ conan upload --list=downloaded.json -r=release -c + + +After the promotion from the "testing" to the "release" repository, the repositories would look like this: + +.. 
note:: + + **Best practices** + + - In modern package servers such as Artifactory, package artifacts are **deduplicated**, that is, they do not + take any extra storage when they are copied to different locations, including different repositories. + The **deduplication** is checksum based, so the system is also smart enough to avoid re-uploading existing artifacts. + This is very important for the "promotions" mechanism: it only copies some metadata, so + it can be very fast and storage efficient. Pipelines can define as many repositories and promotions + as necessary without concerns about storage costs. + - Promotions can also be done in the JFrog platform with ``Release Bundles``. The `Conan extensions repository `_ + also contains a command to generate a release bundle (that can be promoted using the Artifactory API). + + +.. seealso:: + + - :ref:`Using package lists examples ` + - :ref:`Promotions usage in CI ` diff --git a/devops/versioning/default.rst b/devops/versioning/default.rst new file mode 100644 index 00000000000..36155a3d9a0 --- /dev/null +++ b/devops/versioning/default.rst @@ -0,0 +1,70 @@ + +.. _devops_versioning_default: + + +Default versioning approach +---------------------------- + +When making changes to the source code of a package, and creating a new package from it, one good practice is to increase the version +of the package to represent the scope and impact of those changes. The "semver" standard specification defines a ``MAJOR.MINOR.PATCH`` +versioning approach with a specific meaning for changing each digit. + +Conan implements versioning based on the "semver" specification, but with some extended capabilities that were demanded by the C and C++ +ecosystems: + +- Conan versions can have any number of digits, like ``MAJOR.MINOR.PATCH.MICRO.SUBMICRO...`` +- Conan versions can also contain letters, not only digits, and these are ordered alphabetically, so ``1.a.2`` is older than ``1.b.1``, for example. 
+- Version ranges can equally be defined for any number of digits, like ``dependency/[>=1.0.0.0 <1.0.0.10]`` + +Read the :ref:`introduction to versioning` in the tutorial. + +But one aspect of the C and C++ build model that is very different from other languages is how the dependencies affect the +binaries of the consumers requiring them. This is described in the :ref:`Conan binary model` reference. + +Basically, when some package changes its version, this can have different effects on the "consumers" of this package, requiring such +"consumers" to rebuild from source (or not) in order to integrate the new dependency changes. This also depends on the package types, +as the logic changes when linking a shared library or a static library. The Conan binary model, with ``dependency traits``, ``package_type``, +and the ``package_id`` modes, is able to represent this logic and efficiently compute what needs to be rebuilt from source. + +The default Conan behavior can give some hints about which version changes are recommended when making different changes to the package's +source code: + +- Not modifying the version typically means that we want Conan's automatic + **recipe revisions** to handle it. A common use case is when the C/C++ source code is not modified at all, and only changes + to the ``conanfile.py`` recipe are made. As the source code is the same, we might want to keep the same version number, and + just have a new revision of that version. +- **Patch**: Increasing the **patch** version of a package means that only internal changes were made; in practice it means changes to files + that are not public headers of the package. 
A "patch" version bump can avoid having to re-build consumers of this package. For + example, if the package getting a new "patch" version is a static library, other packages that implement static + libraries and depend on this one do not need to be re-built from source, as depending on the same public interface headers + guarantees the same binary. +- **Minor**: If changes are made to the package's public headers, in an API source-compatible way, then the recommendation would be to increase + the **minor** version of the package. That means that other packages that depend on it will be able to compile without issues, + but as there were modifications in public headers (that could contain C++ templates or other things that could be inlined in + the consumer packages), those consumer packages need to be rebuilt from source to incorporate these changes. +- **Major**: If API-breaking changes are made to the package's public headers, then increasing the **major** version is recommended. As the + most common recommended version range is something like ``dependency/[>=1.0 <2]``, where the next major is excluded, that means + that publishing these new versions will not break existing consumers, because they will not be used at all by those consumers, + whose version ranges will exclude them. It will be necessary to modify the consumers' recipes and source code (to fix + the API-breaking changes) to be able to use the new major version. + + +Note that while this is close to the standard "semver" definition of version and version ranges, the C/C++ compilation model +needs to introduce an additional side effect, that of "needing to rebuild the consumers", following the logic explained above in the +``embed`` and ``non_embed`` cases. 
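The interplay between these bump recommendations and the common ``[>=1.0 <2]``-style ranges can be sketched in a few lines of plain Python. This is only an illustrative sketch of the ordering described above (any number of dot-separated fields, numeric fields compared as integers, letter fields alphabetically); it is not Conan's actual implementation, and the ``version_key``/``in_range`` helpers are names invented just for this example:

```python
# Illustrative sketch only -- NOT Conan's real version resolution code.
# It mimics the ordering described above: any number of dot-separated
# fields, numeric fields compared as integers, letter fields alphabetically.

def version_key(version: str):
    """Turn "1.0.3" into [1, 0, 3] so plain list comparison orders versions."""
    return [int(f) if f.isdigit() else f for f in version.split(".")]

def in_range(version: str, low: str = "1.0", high: str = "2") -> bool:
    """Model a range like dependency/[>=1.0 <2]."""
    return version_key(low) <= version_key(version) < version_key(high)

# patch and minor bumps are picked up automatically by existing consumers
assert in_range("1.0.1") and in_range("1.1.0")
# a new major version is excluded until consumers update their ranges
assert not in_range("2.0")
# letter fields order alphabetically: 1.a.2 is older than 1.b.1
assert version_key("1.a.2") < version_key("1.b.1")
```

Keep in mind that even when a new version is picked up automatically by a range, whether consumers must also rebuild is a separate question, decided by the ``package_id`` modes and package types discussed above.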
+ + +This is just the default recommended versioning approach, but Conan allows changing these defaults, as it implements an extension of the "semver" standard that allows any number of digits, +letters, etc., and it also allows changing the ``package_id`` modes to define how different versions of the dependencies affect +the consumers' binaries. See :ref:`how to customize the dependencies package_id modes`. + + +.. note:: + + **Best practices** + + - It is not recommended to use other package reference fields, such as the ``user`` and ``channel``, to represent changes in the source code, + or other information like the git branch, as this becomes "viral", requiring changes in the ``requires`` of the consumers. Furthermore, + they don't implement any logic in the build model with respect to which consumers need to be rebuilt. + - The recommended approach is to use versioning and multiple server repositories to host the different packages, so they don't interfere + with other builds; read :ref:`the Continuous Integration tutorial` for more details. diff --git a/devops/versioning.rst b/devops/versioning/versioning.rst similarity index 79% rename from devops/versioning.rst rename to devops/versioning/versioning.rst index f0fd5febe51..25980c6c107 100644 --- a/devops/versioning.rst +++ b/devops/versioning/versioning.rst @@ -8,4 +8,5 @@ This section deals with different versioning topics: .. 
toctree:: :maxdepth: 1 - versioning/resolve_prereleases + default + resolve_prereleases diff --git a/index.rst b/index.rst index 86dcb4f0da1..48ce8703eeb 100644 --- a/index.rst +++ b/index.rst @@ -16,7 +16,8 @@ Table of contents: whatsnew installation tutorial - devops + CI Tutorial + devops/devops integrations examples reference diff --git a/reference/binary_model/custom_compatibility.rst b/reference/binary_model/custom_compatibility.rst index 61917385745..56e03b4d550 100644 --- a/reference/binary_model/custom_compatibility.rst +++ b/reference/binary_model/custom_compatibility.rst @@ -85,6 +85,7 @@ Compatibility can be defined globally via the ``compatibility.py`` plugin, in th Check the binary compatibility :ref:`compatibility.py extension `. +.. _reference_binary_model_custom_compatibility_dependencies: Customizing binary compatibility of dependencies versions --------------------------------------------------------- diff --git a/tutorial/creating_packages/create_your_first_package.rst b/tutorial/creating_packages/create_your_first_package.rst index 8ae21c6cab3..28fe1b125df 100644 --- a/tutorial/creating_packages/create_your_first_package.rst +++ b/tutorial/creating_packages/create_your_first_package.rst @@ -152,7 +152,7 @@ Then, several methods are declared: * The ``generate()`` method prepares the build of the package from source. In this case, it could be simplified to an attribute ``generators = "CMakeToolchain"``, but it is left to show this important method. In this case, the execution of ``CMakeToolchain`` ``generate()`` method will create a *conan_toolchain.cmake* file that translates - the Conan ``settings`` and ``options`` to CMake syntax. The ``CMakeDeps`` generator is added for completitude, + the Conan ``settings`` and ``options`` to CMake syntax. The ``CMakeDeps`` generator is added for completeness, but it is not strictly necessary until ``requires`` are added to the recipe. 
* The ``build()`` method uses the ``CMake`` wrapper to call CMake commands, it is a thin layer that will manage diff --git a/tutorial/versioning/lockfiles.rst b/tutorial/versioning/lockfiles.rst index 1148fc1d292..d05e8b2bb13 100644 --- a/tutorial/versioning/lockfiles.rst +++ b/tutorial/versioning/lockfiles.rst @@ -318,4 +318,4 @@ scripts, and for some advanced CI flows that will be explained later. .. seealso:: - - Continuous Integrations links. + - :ref:`CI tutorial`.