It seems like for the short term, we are using FetchContent to get testing dependencies into Beman libraries. I personally think this is a fine solution for simple test dependencies but I wanted to throw CPM.cmake out there as it is something we use quite a bit and it works quite well for us in our use case at Xstrahl.
One thing I like about it is that you can adapt its behavior to use only local packages via CMake flags.
A simple way to include it is found here. It would add another download for building with tests enabled, but this would only apply to devs working on a project. Consumers would not be affected by this.
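For anyone who hasn't seen it, the usual way to bootstrap CPM looks roughly like the sketch below (the pinned version number is just an example, and the dependency shown is illustrative):

```cmake
# Download CPM.cmake once at configure time and cache it in the build tree.
set(CPM_DOWNLOAD_VERSION 0.38.7)  # assumed pin; use whatever release you prefer
set(CPM_DOWNLOAD_LOCATION "${CMAKE_BINARY_DIR}/cmake/CPM_${CPM_DOWNLOAD_VERSION}.cmake")
if(NOT EXISTS ${CPM_DOWNLOAD_LOCATION})
    file(DOWNLOAD
        https://github.com/cpm-cmake/CPM.cmake/releases/download/v${CPM_DOWNLOAD_VERSION}/CPM.cmake
        ${CPM_DOWNLOAD_LOCATION}
    )
endif()
include(${CPM_DOWNLOAD_LOCATION})

# After that, pulling in a test dependency is a single call:
CPMAddPackage("gh:google/googletest@1.14.0")
```

The `gh:owner/repo@version` shorthand resolves to a GitHub fetch, so the extra download only happens for developers configuring with tests enabled.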
If we start using our libraries as dependencies on our other libraries (again, probably with FetchContent or CPM), we may have some problems because of that. The most severe typical issues are:
CMake target duplicates that fail CMake builds,
CTest tests of dependencies in our CTest runs,
GTest version conflicts,
long builds and runs (all dependencies will also fetch their testing frameworks and try to run them).
I agree, this is not a way to handle a large number of dependencies. But for simple things like GTest or doctest it works well enough and is quite simple.
If we plan on adding dependencies between different beman libraries, it would probably be better to set up a custom package index if using vcpkg or use Conan.
There are CMake idioms for not building and running tests when you are not the top-level project, so we should be able to avoid the combinatorial explosion of repeated test runs.
It would be infeasible if we were moving beyond the standard library, but since most of the standard library is not compiled into libraries, we should also be able to avoid building more than we need to test the components under test.
But, without some sort of package management system that generates consistent artifacts for a link context, we’re not going to scale much beyond the standard library.
```cmake
# CMake 3.21 or later is required for PROJECT_IS_TOP_LEVEL
option(MYPROJ_ENABLE_TESTING "..." ${PROJECT_IS_TOP_LEVEL})

if(MYPROJ_ENABLE_TESTING)
    add_subdirectory(tests)
endif()

if(PROJECT_IS_TOP_LEVEL)
    add_subdirectory(packaging)
else()
    # Users are unlikely to be interested in testing this
    # project, so don't show it in the basic options
    mark_as_advanced(MYPROJ_ENABLE_TESTING)
endif()
```
This is the recommended idiom for a project meant to be consumed as a CMake project by other CMake projects.
Note - packaging is mentioned here, but we may end up doing it all somewhat differently; the packaging subdir is if you’re using the CMake packaging tools.
My understanding is that this is mitigated by our naming convention which has all target names prefixed with beman.<short_name>.
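To illustrate (the source file name here is hypothetical), the convention means every library's targets are already globally unique, so two Beman libraries fetched into the same build tree can't collide:

```cmake
# Sketch: the beman.<short_name> prefix makes the target name unique
# across the whole Beman ecosystem.
add_library(beman.exemplar)
target_sources(beman.exemplar PRIVATE src/identity.cpp)  # illustrative source

# A namespaced alias for consumers, matching the usual "modern CMake" style.
add_library(beman::exemplar ALIAS beman.exemplar)
```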
I think this is mitigated with the technique @Sdowney mentioned above, or by passing BUILD_TESTING=OFF when invoking FetchContent, which is what we do in beman.exemplar.
I think this is mitigated by FetchContent_Declare since the first call takes precedence over all subsequent calls.
I think this is also mitigated by the BUILD_TESTING=OFF FetchContent technique.
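For concreteness, a sketch of the BUILD_TESTING approach (the pinned GTest tag is just an example): the variable is set before FetchContent_MakeAvailable so that well-behaved dependencies skip configuring their own test targets.

```cmake
include(FetchContent)

FetchContent_Declare(
    googletest
    GIT_REPOSITORY https://github.com/google/googletest.git
    GIT_TAG v1.14.0  # illustrative pin
)

# BUILD_TESTING is the standard CTest switch; disabling it here keeps the
# dependency's tests out of our configure, build, and CTest runs.
set(BUILD_TESTING OFF)
FetchContent_MakeAvailable(googletest)
set(BUILD_TESTING ON)  # restore it for our own tests
```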
Can you think of any other issues that we need to think about @mpusz?
I assume you’re talking about local package override. Yeah, I can see that coming in really handy for large projects.
I wonder how you would do that with FetchContent_Declare. I guess you’d have to put a FetchContent_Declare call pointing to the local repository in the top-level CMakeLists.txt file of your project.
For reference, CPMAddPackage is used 3.6k times on GitHub and FetchContent_Declare is used 35.7k times.
I actually meant the USE_LOCAL_PACKAGES flag. You can force CPM to work only with find_package.
CPM seems to maintain a cache of the packages added to it and decides what to do for each dependency based on whether the flag above is set. If it is, CPM simply delegates to find_package; otherwise it uses the regular FetchContent machinery.
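A sketch of how that looks in practice (the variable is spelled CPM_USE_LOCAL_PACKAGES in current CPM releases, and the fmt version shown is illustrative):

```cmake
# With -DCPM_USE_LOCAL_PACKAGES=ON on the configure command line, this call
# first tries find_package(fmt 10.2.1) and only falls back to downloading
# the sources if no suitable local package is found.
CPMAddPackage(
    NAME fmt
    VERSION 10.2.1
    GITHUB_REPOSITORY fmtlib/fmt
    GIT_TAG 10.2.1
)
```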
For the record, we started using FetchContent because we started tooling in a hackathon and it was trivial to bootstrap.
To me, this discussion thread is evidence that we’re ready to revisit the use of FetchContent and move to proper packaging. I believe effort sunk into developing styles and conventions to support recursive vendoring workflows is better spent getting Conan and/or vcpkg working for us.
I do think we should shoot for FetchContent friendly projects, but there’s a general portability problem when every project needs to detect whether it’s in some environment or another before it decides to perform this or that operation. That’s a pear-shaped approach to an ecosystem consistency problem. We would be expecting every project to be manually maintained to be consistent with every other project. And all projects to get individual updates as our expectations evolve.
It looks like FetchContent_Declare() has a similar variable, FETCHCONTENT_TRY_FIND_PACKAGE_MODE. I wonder if the CMake folks are porting the popular CPM features into fetch content directly.
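For reference, the CMake 3.24+ mechanism looks roughly like this (the GTest pin is illustrative); FIND_PACKAGE_ARGS marks the declaration as redirectable, and FETCHCONTENT_TRY_FIND_PACKAGE_MODE (OPT_IN, ALWAYS, or NEVER) controls whether find_package is actually tried first:

```cmake
include(FetchContent)

FetchContent_Declare(
    googletest
    GIT_REPOSITORY https://github.com/google/googletest.git
    GIT_TAG v1.14.0  # illustrative pin
    # Allow this dependency to be satisfied by an installed package;
    # NAMES gives the package name find_package should look for.
    FIND_PACKAGE_ARGS NAMES GTest
)

FetchContent_MakeAvailable(googletest)
```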
Craig Scott has been masterminding the convergence of find_package and FetchContent.
I’m supportive, but at my day job, we need to push beyond find_package and into a more general interface for discovering transitive dependencies. We’ve been taking two approaches to facilitate that:
We’ve stopped using either [1] FetchContent or find_package directly for dependency resolution. We have a more generic API called RequiredTargets that:
Lets you declare in CMakeLists.txt that you need a dependency
Lets a toolchain (or you if you really want) define how to resolve a named target into a concrete dependency [2].
In one big pass at the end of configuration time, fulfills those requirements
This covers any targets declared before CMakeLists.txt processing completes.
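To make the shape of that API concrete, here is a purely hypothetical sketch; RequiredTargets is internal to my organization, so every name and signature below is invented for illustration:

```cmake
# Hypothetical module name and function names -- not a real CMake module.
include(RequiredTargets)

# Declarative: say *what* you need, not how to fetch it.
require_target(fmt::fmt)
require_target(GTest::gtest_main)

# At the end of configuration, a toolchain-supplied resolver maps each
# required name onto a concrete mechanism (pkg-config, find_package, or,
# eventually, CPS files) and fails configuration if anything is unmet.
resolve_required_targets()
```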
One day, I’d like all of the above to be useful and usable by everyone, including Beman libraries. Though it’s definitely a shift from existing CMake recommendations, including “modern CMake”.
[1] OK, we do use find_package in a few exotic cases, but it’s mostly for when the build system needs to enhance, override, or otherwise fill in very specific gaps that normal packaging mechanisms can’t support.
[2] Supported resolution policies include pkg-config, find_package, and (soon, hopefully) CPS files.
I wonder what functionality you find FetchContent lacking. It seems to me like it also supports defining how to resolve named targets with a toolchain.
FetchContent and find_package provide a way to create a specifically IMPORTED target.
The problem with this is that my organization has a lot of workflows in which engineers need to edit several repos simultaneously in order to implement a feature or to troubleshoot problems that span multiple repos. For most dependencies, importing them is the right way to provide them. But when it comes to developing features that span repos or troubleshooting problems that involve multiple projects, it’s extremely helpful to be able to stitch together multiple repos into one build-edit-test workflow. Effectively, we can construct monorepos on the fly when that’s useful for development workflows.
Short of a feature like that, Beman will have to invest in a package manager in order to support some version of the same development experience for use cases involving editing multiple Beman repos and/or dependencies of those projects. Even then, you’ll still need to drive build commands through more opaque packaging commands that might not know how to run a unit test or even attribute a build failure to a cmake command versus a ninja command.
Altogether, the status quo is workable. I’m just explaining why I see a better model on the horizon. But I still spend more cycles on packaging standards, so I suppose CMake dependency resolution semantics aren’t my biggest concern this week.
My understanding is that this kind of workflow is possible with FetchContent. Say you have an A → B → C dependency tree, both A and C’s code is checked out, and you’d like to make edits to C and A together as part of one build. There are two ways to do this:
Option 1: Add a FetchContent_Declare using SOURCE_DIR in your top-level CMakeLists.txt
Near the top of your top-level CMakeLists.txt for A, add a FetchContent_Declare for C pointing to your local checkout. This overrides all subsequent FetchContent_Declare calls for C (even in dependencies). For example:

```cmake
FetchContent_Declare(
    C
    SOURCE_DIR ${PROJECT_SOURCE_DIR}/../path/to/C/repo
)
FetchContent_MakeAvailable(C)
```
Option 2: Set the FETCHCONTENT_SOURCE_DIR_C CMake variable to the path of your C checkout
Your CMake invocation would include -DFETCHCONTENT_SOURCE_DIR_C=../path/to/C/repo
Except in the workflow I propose, you need no temporary edits to cloned CMakeLists.txt. You just add a top level CMakeLists.txt above all cloned repos and as many add_subdirectory calls as you need. Generally, we do this as an automated work area setup step, though it’s possible to write a CMakeLists.txt that automatically finds constituent projects given some modest layout and project structure assumptions.
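A minimal sketch of such a work-area CMakeLists.txt, assuming the A → B → C repos are cloned side by side under one directory (paths and option names are illustrative):

```cmake
cmake_minimum_required(VERSION 3.21)
project(work_area LANGUAGES CXX)

# Dependencies come before their dependents so each add_subdirectory
# finds the targets it links against already defined.
add_subdirectory(C)
add_subdirectory(B)
add_subdirectory(A)

# Since the work area is now the top-level project, each repo's
# PROJECT_IS_TOP_LEVEL is FALSE; tests for the repos being edited can be
# switched back on explicitly, e.g. -DMYPROJ_ENABLE_TESTING=ON.
```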
I guess you could still make a new CMakeLists.txt and have it set up FetchContent for everything you want to pull in? I don’t know why I would prefer that to add_subdirectory calls, I guess.
To be clear, all this is a little far afield given current Beman practices. I’m OK with whatever find_package or FetchContent hacks we want to use for now. In particular, an acceptable RequiredTargets package that we could adopt simply isn’t available yet.
I’m just pointing out there is a way forward eventually that involves declaring dependencies instead of using imperative operations like “fetch”.
Could you elaborate more on the problems you have found with CPMAddPackage()? We have used it extensively but I don’t think we’ve run into the limitations you allude to.