Capturing Git scm information

There are 2 main strategies to handle source code in recipes:

  • Third-party code: When the conanfile.py recipe is packaging third party code, like an open source library, it is typically better to use the source() method to download or clone the sources of that library. This is the approach followed by the conan-center-index repository for ConanCenter.

  • Your own code: When the conanfile.py recipe is packaging your own code, it is typically better to have the conanfile.py in the same repository as the sources. Then, there are 2 alternatives for achieving reproducibility:

    • Using the exports_sources attribute (or the export_sources() method) to capture a copy of the sources together with the recipe in the Conan package. This is simple and pragmatic, and is recommended for the majority of cases.

    • For cases when it is not possible to store the sources beside the Conan recipe, for example when the package is to be consumed by someone who shouldn’t have access to the source code at all, the current scm capture method is the way to go.

In the scm capture method, instead of capturing a copy of the code itself, the “coordinates” of that code are captured: in the Git case, the URL of the repository and the commit. If the recipe needs to build from source, it will use that information to clone the repository; if the user attempting the build is not authorized, the process will fail. Such users will still be able to use the pre-compiled binaries that we distribute, but they will not be able to build from source or access the code.
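Conceptually, the captured coordinates are nothing more than a repository URL and a commit hash. When stored alongside the recipe in conandata.yml, they look roughly like the fragment below; note that the exact keys and the URL shown here are assumptions for illustration and may differ between Conan versions:

```yaml
# hypothetical conandata.yml contents after capturing the scm coordinates
scm:
  commit: cb7815a58529130b49da952362ce8b28117dee53
  url: https://github.com/myuser/hello.git  # assumed remote; a local path if no remote exists
```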

Let’s see how it works with an example. Please, first clone the sources to recreate this project. You can find them in the examples2 repository on GitHub:

  $ git clone https://github.com/conan-io/examples2.git
  $ cd examples2/examples/tools/scm/git/capture_scm

There we will find a small “hello” project, containing this conanfile.py:

  from conan import ConanFile
  from conan.tools.cmake import CMake, cmake_layout
  from conan.tools.scm import Git


  class helloRecipe(ConanFile):
      name = "hello"
      version = "0.1"

      # Binary configuration
      settings = "os", "compiler", "build_type", "arch"
      options = {"shared": [True, False], "fPIC": [True, False]}
      default_options = {"shared": False, "fPIC": True}
      generators = "CMakeDeps", "CMakeToolchain"

      def export(self):
          git = Git(self, self.recipe_folder)
          # save the url and commit in conandata.yml
          git.coordinates_to_conandata()

      def source(self):
          # we recover the saved url and commit from conandata.yml and use them to get sources
          git = Git(self)
          git.checkout_from_conandata_coordinates()
      ...

We need this code to be in its own Git repository, to see how it works in the real case, so please create a folder outside of the examples2 repository, and copy the contents of the current folder there, then:

  $ mkdir /home/myuser/myfolder # or equivalent in other OS
  $ cp -R . /home/myuser/myfolder # or equivalent in other OS
  $ cd /home/myuser/myfolder # or equivalent in other OS
  # Initialize the git repo
  $ git init .
  $ git add .
  $ git commit . -m wip
  # Finally create the package
  $ conan create .
  ...
  ======== Exporting recipe to the cache ========
  hello/0.1: Exporting package recipe: /myfolder/conanfile.py
  hello/0.1: Calling export()
  hello/0.1: RUN: git status . --short --no-branch --untracked-files
  hello/0.1: RUN: git rev-list HEAD -n 1 --full-history -- "."
  hello/0.1: RUN: git remote -v
  hello/0.1: RUN: git branch -r --contains cb7815a58529130b49da952362ce8b28117dee53
  hello/0.1: RUN: git fetch origin --dry-run --depth=1 cb7815a58529130b49da952362ce8b28117dee53
  hello/0.1: WARN: Current commit cb7815a58529130b49da952362ce8b28117dee53 doesn't exist in remote origin
  This revision will not be buildable in other computer
  hello/0.1: RUN: git rev-parse --show-toplevel
  hello/0.1: Copied 1 '.py' file: conanfile.py
  hello/0.1: Copied 1 '.yml' file: conandata.yml
  hello/0.1: Exported to cache folder: /.conan2/p/hello237d6f9f65bba/e
  ...
  ======== Installing packages ========
  hello/0.1: Calling source() in /.conan2/p/hello237d6f9f65bba/s
  hello/0.1: Cloning git repo
  hello/0.1: RUN: git clone "<hidden>" "."
  hello/0.1: Checkout: cb7815a58529130b49da952362ce8b28117dee53
  hello/0.1: RUN: git checkout cb7815a58529130b49da952362ce8b28117dee53

Let’s explain step by step what is happening:

  • When the recipe is exported to the Conan cache, the export() method executes git.coordinates_to_conandata(), which stores the Git URL and commit in the conandata.yml file by internally calling git.get_url_and_commit(). See the Git reference for more information about these methods.

  • This obtains the URL of the repo, pointing to the local <local-path>/capture_scm folder, and the commit cb7815a58529130b49da952362ce8b28117dee53

  • It warns that this information will not be enough to rebuild this recipe from source once the package is uploaded to the server and a build from source is attempted on another computer, which will not contain the path pointed to by <local-path>/capture_scm. This is expected, as the repository we created doesn’t have any remote defined. If our local clone had a remote defined, and that remote contained the commit we are building, the scm_url would point to the remote repository instead, making the build from source fully reproducible.

  • The export() method stores the url and commit information in the conandata.yml for future reproducibility.

  • When the package needs to be built from sources and it calls the source() method, it recovers the information from the conandata.yml file inside the git.checkout_from_conandata_coordinates() method, which internally calls git.clone() with it to retrieve the sources. In this case, it will clone from the local checkout in <local-path>/capture_scm, but if a remote were defined, it would clone from that remote instead.
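The round trip that export() and source() perform can be illustrated with plain git commands. The following standalone sketch is an illustrative emulation, not Conan's actual implementation: it creates a throwaway repository, captures its "coordinates" the way export() does (falling back to the local folder as URL, since no remote is defined, which is exactly the situation the log above warns about), then clones and checks out from them the way source() does:

```python
import os
import subprocess
import tempfile


def run(args, cwd):
    # helper: run a git command and return its trimmed stdout
    return subprocess.check_output(args, cwd=cwd, text=True).strip()


def capture_coordinates(repo_folder):
    # emulates the capture done at export time: HEAD commit plus a URL;
    # with no "origin" remote, fall back to the local folder as the URL
    commit = run(["git", "rev-parse", "HEAD"], cwd=repo_folder)
    remotes = run(["git", "remote"], cwd=repo_folder).split()
    if "origin" in remotes:
        url = run(["git", "remote", "get-url", "origin"], cwd=repo_folder)
    else:
        url = repo_folder
    return url, commit


def checkout_from_coordinates(url, commit, target_folder):
    # emulates the source() step: clone from the captured URL,
    # then check out the exact captured commit (detached HEAD)
    run(["git", "clone", url, target_folder], cwd=".")
    run(["git", "checkout", commit], cwd=target_folder)


def demo():
    # build a throwaway repo with a single commit
    src = tempfile.mkdtemp()
    with open(os.path.join(src, "hello.txt"), "w") as f:
        f.write("hello")
    run(["git", "init", "."], cwd=src)
    run(["git", "add", "."], cwd=src)
    run(["git", "-c", "user.name=demo", "-c", "user.email=demo@example.com",
         "commit", "-m", "wip"], cwd=src)
    # "export": capture coordinates; "source": clone + checkout from them
    url, commit = capture_coordinates(src)
    dst = os.path.join(tempfile.mkdtemp(), "clone")
    checkout_from_coordinates(url, commit, dst)
    return commit, run(["git", "rev-parse", "HEAD"], cwd=dst)


if __name__ == "__main__":
    captured, cloned = demo()
    print(captured == cloned)
```

Running the demo shows that the clone ends up at exactly the captured commit, which is the property that makes the captured coordinates reproducible.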

Warning

To achieve reproducibility, it is very important for this scm capture technique that the current checkout is not dirty. If it were dirty, it would be impossible to guarantee future reproducibility of the build, so git.get_url_and_commit() can raise errors and require changes to be committed. If more than one commit is necessary, it is recommended to squash those commits before pushing changes to upstream repositories.

If we now run a second conan create ., the repo is dirty (the previous run left build artifacts in it), so we would get:

  $ conan create .
  hello/0.1: Calling export()
  ERROR: hello/0.1: Error in export() method, line 19
          scm_url, scm_commit = git.get_url_and_commit()
          ConanException: Repo is dirty, cannot capture url and commit: .../capture_scm

This could be solved by cleaning the repo with git clean -xdf, or by adding a .gitignore file to the repo with the following contents (which might be a good practice anyway for source control):

.gitignore

  test_package/build
  test_package/CMakeUserPresets.json

The capture of coordinates uses the Git.get_url_and_commit() method, which by default behaves as follows:

  • If the repository is dirty, it will raise an exception

  • If the repository is not dirty, but the commit doesn’t exist in the remote, it will warn, but it will return the local folder as the repo URL. This way, local commits can be tested without needing to push them to the server. Setting core.scm:local_url=allow silences the warning, while core.scm:local_url=block makes it immediately raise an error. This last value can be useful in CI scenarios, to fail fast and save a build that would have been blocked later in the conan upload.

  • Packages built from a local commit will fail when trying to upload them to the server with conan upload, as those local commits are not in the server and the package might not be reproducible. This upload error can be avoided by setting core.scm:local_url=allow.

  • If the repository is not dirty, and the commit exists in the remote, it will return the remote URL and the commit.
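The rules above can be summarized as a small decision function. This is an illustrative sketch of the documented behavior with invented names (resolve_url_and_commit and its parameters), not Conan's real code:

```python
def resolve_url_and_commit(is_dirty, commit_in_remote, remote_url,
                           local_folder, commit, local_url=None):
    """Sketch of the documented Git.get_url_and_commit() rules.

    local_url mimics the core.scm:local_url conf: None (warn),
    "allow" (silence the warning) or "block" (fail fast).
    """
    if is_dirty:
        # dirty checkouts can never be reproduced later
        raise RuntimeError("Repo is dirty, cannot capture url and commit")
    if commit_in_remote:
        # clean repo, commit pushed: fully reproducible coordinates
        return remote_url, commit
    if local_url == "block":
        # fail fast, e.g. in CI, instead of failing later at upload time
        raise RuntimeError(f"Commit {commit} doesn't exist in remote")
    if local_url != "allow":
        print(f"WARN: commit {commit} doesn't exist in remote; "
              "this revision will not be buildable on another computer")
    # fall back to the local folder as URL, so local commits can be tested
    return local_folder, commit
```

For example, resolve_url_and_commit(False, False, "git@host:repo.git", "/tmp/repo", "abc123") warns and returns ("/tmp/repo", "abc123"), mirroring the local-folder fallback in the log shown earlier.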

Credentials management

In the example above, credentials were not necessary, because our local repo didn’t require them. But in real-world scenarios, credentials may be required.

The first important bit is that git.get_url_and_commit() will capture the URL of the origin remote. This URL must not encode tokens, users, or passwords, for several reasons. First, because that would make the process non-repeatable: different builds and different users would get different URLs, and consequently different recipe revisions. The URL should always be the same. The recommended approach is to manage the credentials in an orthogonal way, for example using ssh keys. The provided example contains a GitHub Actions workflow that does this:

.github/workflows/hello-demo.yml

  name: Build "hello" package capturing SCM in Github actions
  run-name: ${{ github.actor }} checking hello-ci Git scm capture
  on: [push]
  jobs:
    Build:
      runs-on: ubuntu-latest
      steps:
        - name: Check out repository code
          uses: actions/checkout@v3
          with:
            ssh-key: ${{ secrets.SSH_PRIVATE_KEY }}
        - uses: actions/setup-python@v4
          with:
            python-version: '3.10'
        - uses: webfactory/ssh-agent@v0.7.0
          with:
            ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}
        - run: pip install conan
        - run: conan profile detect
        - run: conan create .

This hello-demo.yml takes care of the following:

  • The actions/checkout@v3 action receives the ssh-key so it checks out over SSH (git@) instead of https

  • The webfactory/ssh-agent@v0.7.0 action ensures that the ssh key is also active during the execution of the following tasks, not only during the checkout.

  • It is necessary to set up the SSH_PRIVATE_KEY secret in the GitHub interface (holding the private part of the ssh-key), as well as a deploy key for the repo (holding the public part)

In this way, authentication and credentials can be kept completely separate from the recipe functionality, without any risk of leaking credentials.

Note

Best practices

  • Do not use an authentication mechanism that encodes credentials in the URLs. This is risky and can easily disclose credentials in logs. It is recommended to use system mechanisms like ssh keys instead.

  • Running conan create is not recommended for local development; instead, run conan install and build locally, to avoid many unnecessary commits. Only when everything works locally is it time to start checking the conan create flow.