
CMake build system, attempt 2 #461

Merged
merged 42 commits into Z3Prover:master on Mar 7, 2016

Conversation

delcypher
Contributor

This is a rework of #459 that is no longer a "half-baked" (@nunoplopes) solution, as it can now build the necessary generated files itself and so can function without needing to invoke the Python build system first.

The last commit (98b61b8) lists some of the many advantages this build system has.

It will need some additional work to run under MSVC and to support building/installing/uninstalling the python, java, dotnet and ml language bindings, but this can be done in the future.

To give this a try, run:

# You need a clean source tree (i.e. no generated files in the source tree) otherwise CMake will
# complain. First check you are happy with what will be deleted (``-n`` is a dry run).
git clean -nx
# Now delete the untracked files for real
git clean -fx
# Make a build directory
mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=/whatever/install/prefix/you/want /path/to/z3/repository/
make -j8
make c_example
make cpp_example
make test-z3 -j8
# Try installing
make install
# Try uninstalling
make uninstall

So far I've tested with make and ninja on Arch Linux with CMake 3.4.3. I will be testing more platforms shortly.

@delcypher
Contributor Author

@jirislaby @arrowd feel free to give this a try.

@delcypher
Contributor Author

Hmm, this doesn't build with CMake 2.8.12.2 on Ubuntu 14.04 LTS; fixes will be coming soon.

@delcypher
Contributor Author

Okay, the build now works under Ubuntu 14.04 LTS with CMake 2.8.12.2.

@delcypher
Contributor Author

Okay hit a milestone :)
The CMake MSVC build can now build z3.exe and test-z3.exe.

libz3.dll is built but the symbols aren't exported properly so the c_example and cpp_example can't link against the library yet. I'm working on trying to fix this.

@NikolajBjorner
Contributor

I guess these are beginner issues and not issues with the pull request

 C:\bz3\z3\build>cmake C:\bz3\z3\src
      CMake Error at util/CMakeLists.txt:8 (z3_add_component):
      Unknown CMake command "z3_add_component".


      CMake Warning (dev) in CMakeLists.txt:
      No cmake_minimum_required command is present.  A line of code such as

        cmake_minimum_required(VERSION 3.5)

 C:\bz3\z3\build>cmake C:\bz3\z3
    CMake Error: The source "C:/bz3/z3/CMakeLists.txt" does not match the
    source "C:/bz3/z3/src/CMakeLists.txt" used to generate cache.  Re-run cmake with a different source directory.

@delcypher
Contributor Author

MSVC build now succeeds :). The libz3.dll exports are set correctly so the c_example and cpp_example can link against it now.

@delcypher
Contributor Author

@NikolajBjorner

I guess these are beginner issues and not issues with the pull request

Yes there are two issues there.

  • You are not pointing CMake at the right directory. You need to give it the path to the directory containing the top level CMakeLists.txt file. This is in the root of the repository and it is the file that contains the line project(Z3 C CXX).
  • It looks like you invoked CMake more than once and used a different path which is why you see the error about the CMake cache. Just delete everything in the build directory and start again.

It is important to understand that CMake is actually a "meta-build system" and not a build system itself. When you run the cmake command it will configure and then generate the build system of your choice (e.g. UNIX makefiles, Ninja files, Visual Studio...); these are referred to as "Generators" in CMake parlance. Once the build system is generated, you then use that to build your project. You may find it easier to use the cmake-gui tool to pick the Generator you want to use and to also set various build options.

I'm going to try to write a guide on using the build system now.

@delcypher
Contributor Author

@NikolajBjorner @wintersteiger @nunoplopes
Okay this PR is now in pretty good shape. I've spent quite a bit of time getting the build working under MSVC.

I've tested the following configurations

  • gcc 5.3 and clang 3.7.1, CMake 3.4.3, Arch Linux, Makefile and Ninja generators
  • gcc 4.8.4, CMake 2.8.12.2, Ubuntu 14.04 LTS, Makefile and Ninja generators
  • MSVC (under Windows 8) with Visual Studio 2013 (both 32-bit and 64-bit builds), NMake and Ninja generators.

There is documentation in the README-CMake.md file on using the build system. Is there anything else I can do to get this reviewed and merged?

@NikolajBjorner
Contributor

cool, bandwidth permitting...

@wintersteiger
Contributor

Looking like it's getting there, great! Is it a strict requirement that we have a CMakeLists.txt file in every single directory? I'd rather see those go into a separate cmake-only directory or some such thing.

@delcypher
Contributor Author

@NikolajBjorner Okay.
Ninja on Windows seems pretty cool by the way. Today was the first time I have used it on Windows. It was significantly faster than using NMake because it launches commands in parallel. I've never used NMake before so I'm not sure if there's a way to make it build in parallel but I couldn't find one when I looked.

@NikolajBjorner
Contributor

got it to build with yesterday's check-in using VS this time. VS uses threads.

@delcypher
Contributor Author

@wintersteiger

Looking like it's getting there, great! Is it a strict requirement that we have a CMakeLists.txt file in every single directory? I'd rather see those go into a separate cmake-only directory or some such thing.

Technically it is not a strict requirement but practically it is. Re-implementing what I've done to keep the CMakeLists.txt files out of the src directory would be quite painful. Having CMakeLists.txt in the source tree is quite standard though. I personally find it quite useful as well because I can look at the CMakeLists.txt file for a particular Z3 component and see how that component is built (e.g. I can see if it has any special rules for creating generated files).
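
For example, the CMakeLists.txt for a component is a short declarative description along these lines (a rough sketch only: z3_add_component() is the macro added in this PR, but the argument keywords and file names shown here are illustrative placeholders, not copied from it):

    # Rough sketch of a per-component CMakeLists.txt. The argument keywords
    # and file names below are illustrative placeholders.
    z3_add_component(ast
      SOURCES
        ast.cpp
        ast_pp.cpp
      COMPONENT_DEPENDENCIES
        util
    )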

@delcypher
Contributor Author

@NikolajBjorner

got it to build with yesterday's check-in using VS this time. VS uses threads.

Great. You will want to use the latest version of this PR though because I didn't have the dll exports working properly yesterday so the c_example and cpp_example didn't build properly then. I've since fixed that though.

@wintersteiger
Contributor

Re nmake in parallel: in Visual Studio it's usually the compiler itself (cl) that parallelizes (with /MP), but we also have the --parallel flag on mk_make.py that tricks nmake into parallelism.

@wintersteiger
Contributor

Re new files everywhere: This is highly confusing for anybody who wants to debug a build issue, because it is not obvious to them that all those files flying about have nothing to do with what they are trying to achieve (e.g., default build via nmake). I would much rather have the python (or other) scripts that generate code moved into those directories (having everything in /scripts is actually quite painful) which can then be used by all build systems.

Re moving files being painful: Sorry, but every other approach is also painful (except maybe for different persons).

@delcypher
Contributor Author

@wintersteiger

Re new files everywhere: This is highly confusing for anybody who wants to debug a build issue, because it is not obvious to them that all those files flying about have nothing to do with what they are trying to achieve (e.g., default build via nmake). I would much rather have the python (or other) scripts that generate code moved into those directories (having everything in /scripts is actually quite painful) which can then be used by all build systems.

I'm afraid I really don't get your logic here. Having CMakeLists.txt files next to the sources they describe how to compile is a completely standard (e.g. see the LLVM source tree) and logical way to lay out projects. Doing anything different is actually incredibly confusing because it would then become much harder to work out how CMake was being told to build Z3's components. It would also be incredibly confusing for anyone who knows how to use CMake.

Also, if there is an issue during configuration, CMake will tell you in which file and at what line the problem in a CMakeLists.txt lies.

Perhaps I'm not picturing exactly what you mean when you say "highly confusing for anybody who wants to debug a build issue". I see no confusion arising from the way the CMakeLists.txt files are organised now, but perhaps you could explain precisely a scenario that you think is confusing?

@NikolajBjorner
Contributor

I believe it is a matter of getting used to it.
Most CMakeLists.txt files look easy to maintain and extend and basically capture the dependencies that had to be stated (+ making cpp inclusions explicit, which happens by default in the python-based scripts).
I found two scary ones in shell and test, but they factor functionality that has presumably been stated deep inside one of the python scripts.

@wintersteiger
Contributor

Yes, I agree if there is only one build system and if everybody is familiar with it, but those assumptions don't hold here. Whether this is "completely standard" in other projects is of no relevance; Z3 is "not standard" in many respects, and yes, we can try to make it look "more standard", but that will be a major effort, and until we decide to actually go for something like that, I'd rather not disrupt and confuse existing operations and users. So, if it's a little bit more effort to make it self-contained and non-disruptive then I think we should go for that.

@delcypher
Contributor Author

@NikolajBjorner

Most CMakeLists.txt files look easy to maintain and extend and basically capture the dependencies that had to be stated (+ making cpp inclusions explicit, which happens by default in the python-based scripts).
I found two scary ones in shell and test, but they factor functionality that has presumably been stated deep inside one of the python scripts.

Yes, shell and test-z3 are less than ideal. I certainly could refactor them. The reason I haven't done so is that in principle you would want both of those executables to link against the built libz3. The CMake code would be considerably cleaner if this was done. Unfortunately this doesn't really seem possible right now due to:

  • The Z3 library hides most of its symbols so you get linking errors if you try to link shell or test-z3 against the library
  • I'm not sure if this was intentional but the existing build system creates separate install_tactic.cpp, mem_initializer.cpp, gparams_register_modules.cpp for the libz3, shell and test-z3 targets. This would potentially lead to linking problems because there would be multiple implementations of the same functions.

Because the existing build system has this problem I decided that fixing it right now was orthogonal to implementing a CMake build system that largely duplicates the behaviour of the python build system.
If you think there is no immediate need to fix the fact that shell and test-z3 don't link against libz3 I can clean up the CMake code there.

@delcypher
Contributor Author

Yes, I agree if there is only one build system and if everybody is familiar with it, but those assumptions don't hold here. Whether this is "completely standard" in other projects is of no relevance; Z3 is "not standard" in many respects, and yes, we can try to make it look "more standard", but that will be a major effort, and until we decide to actually go for something like that, I'd rather not disrupt and confuse existing operations and users. So, if it's a little bit more effort to make it self-contained and non-disruptive then I think we should go for that.

I don't think the idea of splitting a project into components and having components depend on each other is non-standard. But that is exactly how Z3 is organised and what the CMake build system I've created is designed around.

As for a "little bit more effort".... it's not a little bit of effort. It would be a huge amount of effort with no real benefit. CMake is designed to be used in a particular way (the way I'm using it) and what you're suggesting is not a good fit for how CMake works. Also I am not willing to waste my time compromising the readability and maintainability of the CMake build system. I would much rather use my time doing something useful like teaching the CMake build system how to build and install Z3's language bindings which would bring the CMake build system to full feature parity with the Python build system.

Also, the Python and CMake build systems can co-exist fairly painlessly. If you're modifying source files you probably won't need to modify the CMake build system. You would only need to modify it if you're adding a new .cpp file, a dependency, build flags or something like that. From what I've observed of Z3's development over the last few months that doesn't seem to happen very often. Even so, adding a new source file or component dependency requires only a tiny change to the relevant CMakeLists.txt file.

@wintersteiger
Contributor

Before anybody wastes any more time on this, can we please do a rough cost/benefit calculation?

I was under the impression that this contribution should be self-contained, non-disruptive, and basically @delcypher's pet project that will stay invisible to the end-user until they actively want to use it. It seems that the current strategy is to partially or completely replace the existing build system. And at the current time and state of affairs I'm strongly opposed to this, because it will most definitely be a huge load of time and effort for all of us, which nobody will get back in any kind of benefit.

Here are some thoughts:

If we wanted Makefiles in every directory, we would have done so a long time ago, and we would have made them nmake-compatible. In retrospect that may perhaps have been less effort overall, but at the current state I see no reason to change our mind, now that all the kinks have been worked out.

Supporting and maintaining two independent build systems is a truly terrible idea, for all people that are involved. I am open to discussion about replacing the existing system, but that will have to be much better organized and there will have to be a lot of discussion about what system we think is the best choice. I would much rather see that time and effort go into refactoring of the existing system to make maintenance less painful, which, at the same time, would be much less disruptive for users as well.

CMake is a great system if you need to build, test, and publish on multiple platforms. More than 99% of our users do not have that requirement. The number of people who do have this requirement is 2 (@NikolajBjorner and myself), and that set of people is perfectly happy with the current system. And further, they have about a 5-year investment of fixing build problems on exotic platforms in the current system. Much of this will have to be re-discovered and re-fixed. I can't speak for @NikolajBjorner, but personally I am not in any way keen on going down that route.

Prospective future users could be found in the set of people that maintain our APIs (currently the same 2), because they have to make sure their APIs work on all platforms. But, at the same time this is also the most complicated part and so far has received no attention as far as I can tell.

Adding yet another tool in the build chain is completely unnecessary complication for "newbie" users. Many of our users are not Comp Sci students and they do not run Ubuntu. If we have to ask them to compile the latest-bugfix version from the master branch, everything will have to go through without a hitch, because they won't have the knowledge or skills to fix any problems along the way. And yes, some of them will have trouble downloading CMake and installing it, and they will have no clue what's going on when they run the magic commands that we tell them to run.

If we were to decide to replace the existing system, then CMake would not be my first choice of tool to use. (E.g., why don't they publish packages for BSDs? Current platforms on my todo list are Raspbian, Solaris, OpenBSD. Will we have to give people a tutorial on how to compile CMake first?) Also, I would much rather just have Makefiles in every directory and ./configure scripts, and perhaps a static .vcxproj for Visual Studio.

So, yes, I do appreciate @delcypher's enthusiasm about this project, but at this point in time, from my point of view, it's just not cost-effective. This is OK if it stays invisible, but not otherwise. So at this point, I won't merge this PR, but I'm not the only one with write permissions, and if the others think there are benefits that I can't see, then I'm happy to be overruled.

@delcypher
Contributor Author

Building and installing of the python bindings is now implemented. This is off by default but can be enabled by setting the option BUILD_PYTHON_BINDINGS to TRUE.

I don't plan to implement support for any other bindings until this work is merged.

@delcypher
Contributor Author

@wintersteiger

Before anybody wastes any more time on this, can we please do a rough cost/benefit calculation?

Sure.

I was under the impression that this contribution should be self-contained, non-disruptive, and basically @delcypher's pet project that will stay invisible to the end-user until they actively want to use it. It seems that the current strategy is to partially or completely replace the existing build system. And at the current time and state of affairs I'm strongly opposed to this, because it will most definitely be a huge load of time and effort for all of us, which nobody will get back in any kind of benefit.

The CMake build system could replace the existing build system, but that is not my decision to make. My goal has been to implement a CMake-based build system that is readable, is as close to feature parity with the existing build system as possible, and that I and others can use.

Here are some thoughts:

If we wanted Makefiles in every directory, we would have done so a long time ago, and we would have made them nmake-compatible. In retrospect that may perhaps have been less effort overall, but at the current state I see no reason to change our mind, now that all the kinks have been worked out.

The kinks haven't really all been worked out. Incremental builds are broken with the current build system. I wouldn't have even bothered to implement a new build system if the existing build system could do that.

Supporting and maintaining two independent build systems is a truly terrible idea, for all people that are involved. I am open to discussion about replacing the existing system, but that will have to be much better organized and there will have to be a lot of discussion about what system we think is the best choice. I would much rather see that time and effort go into refactoring of the existing system to make maintenance less painful, which, at the same time, would be much less disruptive for users as well.

It is only terrible if the maintenance cost is high. Right now I don't see this as being high at all. If you want zero additional overhead then you (and all other Z3 devs) don't even need to care about the build system; develop Z3 as you have been doing so far and I will send PRs when something breaks with the CMake build. That way the CMake build system is not in your way at all. If you don't mind a little overhead then you can modify the CMake build system if you really feel like doing it, but I'm happy to take on the work alone.

You and I also clearly differ in our opinions on what is well organised. The project layout that CMake uses is very similar to that of every build system I have ever used (e.g. Automake, SCons, manual makefiles and even Visual Studio solutions with multiple C# projects) except Z3's python build system. I am also having a really hard time understanding why you think a CMakeLists.txt file inside a directory with source files is going to get in your way.

CMake is a great system if you need to build, test, and publish on multiple platforms. More than 99% of our users do not have that requirement. The number of people who do have this requirement is 2 (@NikolajBjorner and myself), and that set of people is perfectly happy with the current system. And further, they have about a 5-year investment of fixing build problems on exotic platforms in the current system. Much of this will have to be re-discovered and re-fixed. I can't speak for @NikolajBjorner, but personally I am not in any way keen on going down that route.

It is not only great for that. It is also great if you want to integrate well with IDEs (like Xcode, Eclipse CDT and Visual Studio). It is also very useful because it implements lots of things build systems need (e.g. setting SOVERSION, RPATH, etc.) which are hard to get right and which the Z3 team should not spend their time implementing in Python. With CMake you get all this stuff for free.
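
For example, library versioning and install RPATH handling come down to a few declarative lines in CMake (an illustrative sketch only; the target name and version numbers below are placeholders, not what this PR sets):

    # Illustrative sketch: shared library versioning and install RPATH.
    # The target name and version numbers are placeholders.
    add_library(libz3 SHARED ${libz3_sources})
    set_target_properties(libz3 PROPERTIES
      VERSION 4.4.2                               # full library version
      SOVERSION 4.4                               # ABI version encoded into the SONAME
      INSTALL_RPATH "${CMAKE_INSTALL_PREFIX}/lib" # RPATH used by the installed library
    )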

Adding yet another tool in the build chain is completely unnecessary complication for "newbie" users. Many of our users are not Comp Sci students and they do not run Ubuntu. If we have to ask them to compile the latest-bugfix version from the master branch, everything will have to go through without a hitch, because they won't have the knowledge or skills to fix any problems along the way. And yes, some of them will have trouble downloading CMake and installing it, and they will have no clue what's going on when they run the magic commands that we tell them to run.

I don't see this as a strong argument. The CMake build system is not intrusive and your users can continue to use the Python build system without even knowing that it exists. If they really want to use the CMake build system the documentation is there. Asking for a flawless compile from the master branch also seems like a bad idea because it is unstable and there are frequent breakages. You might be better off getting them to build from a released version or giving them pre-built binaries.

Also your students "have no idea" what is going on when you tell them to run your python scripts to build Z3. Those scripts are incredibly complicated and convoluted. I learnt way more about them than I ever intended whilst trying to replicate their functionality using CMake.

If we were to decide to replace the existing system, then CMake would not be my first choice of tool to use. (E.g., why don't they publish packages for BSDs? Current platforms on my todo list are Raspbian, Solaris, OpenBSD. Will we have to give people a tutorial on how to compile CMake first?) Also, I would much rather just have Makefiles in every directory and ./configure scripts, and perhaps a static .vcxproj for Visual Studio.

You are unlikely to need to tell people to compile CMake from source. It is a very widely used project so it is available for many platforms. CMake also knows about many more platforms and compilers than the existing build system, so using CMake will make it easier to port Z3 to other platforms.

For the BSDs, CMake is available from their package managers. Raspbian is Debian-based (not a BSD) so CMake is also available from its package manager. Downloading the CMake binary from their website is something I'd only ever do on Windows. Other platforms have package managers which make getting hold of CMake easy.

So, yes, I do appreciate @delcypher's enthusiasm about this project, but at this point in time, from my point of view, it's just not cost-effective. This is OK if it stays invisible, but not otherwise. So at this point, I won't merge this PR, but I'm not the only one with write permissions, and if the others think there are benefits that I can't see, then I'm happy to be overruled.

Right now this PR doesn't impact the existing build system in terms of functionality. In fact it improves it, because I had to do some refactoring to use some of its functionality to generate some files needed for the build. Because the existing build system is not functionally affected, and because, as I've stated, you don't need to care about the CMake build system if you don't want to, I don't see this PR as invasive.

Have you actually tried using the CMake build system I've implemented? It feels very much like you are criticising something you have not actually used. You only need to run a few commands to try it out. I'm more than happy to walk you through using it.

@jirislaby @arrowd You have both expressed interest in the past in using CMake to build Z3. Do you have anything to add?

@wintersteiger
Contributor

Sorry, but this is not about whether it's nice and easy to use at all.

Simply put, from my point of view, there is essentially zero benefit for zero people, i.e., not worth investing your time or our time on this. Worse, if it does interfere with the existing system (which it does, see changes to scripts/*.py), then we get negative benefit. But, as I said, that's just my opinion.

@NikolajBjorner
Contributor

I tried the system yesterday, and it was easy enough to build without following instructions.
But I don't have the bandwidth today (right now) to update and follow all the very detailed discussion.

From what I saw, the cmake system is perfectly reasonable, with cross-platform, incremental and other hooks that make good sense. I have yet to learn more about this environment. Maintainability and adaptability are my main question, but then cmake offers something the python scripts do not when it comes to various platforms. The python scripts for compile-time generated code do remain required, though. It adds one dependency on cmake, but cmake was easy to find and install.

@agurfinkel
Collaborator

Hey all, I've been silently watching this fascinating discussion. Let me chime in.

First, I want to cast a vote for cmake. A cmake-based build system for Z3 would make a huge difference for me.

However, I understand the pitfalls of having multiple build systems, even when one is official and the other is just there.

I want to suggest a constructive compromise:

  1. Move all cmake related files into some directory, say contrib/cmake
  2. Add a script, say bootstrap_cmake.py, that installs CMakeLists.txt files all over the source tree
  3. Users of cmake will have to know to call bootstrap_cmake.py after cloning the repo

This makes cmake invisible to users of the current build system at a slight increase in the cost of maintaining the cmake build.

This can be re-evaluated after some time, but is a way to move forward right now.

@delcypher
Contributor Author

Sorry, but this is not about whether it's nice and easy to use at all.

What? You make an argument based on ease of use, I counter this and argue that it isn't difficult to use and then you tell me that ease of use doesn't matter.

Simply put, from my point of view, there is essentially zero benefit for zero people, i.e., not worth investing your time or our time on this.

That simply isn't true, do you speak for everyone? There are plenty of benefits to offering a CMake based build of Z3.

Advantages of having a CMake build system for Z3 available:

  • Faster incremental builds when using Ninja as the generator. This is very useful for development.
  • Support for IDE integration (e.g. Visual Studio, Xcode, Eclipse CDT, CLion, QtCreator). This is very convenient for users who want to work with Z3's code because they can probably use an IDE of their choice.
  • The CMake build system I've implemented in this PR gives reliable incremental builds. The python-based build system doesn't do these properly (it isn't aware if files need to be regenerated, that object files need to be recompiled if compile flags change, that if the makefile is changed everything might need to be rebuilt, etc.), meaning you have to throw away an entire build tree every time you want to build Z3. That is a huge waste of time.
  • The build system types that CMake can generate include those (Makefile and NMake) used by the Python build system, so there's no need to learn a different dependency management tool (e.g. Ninja) if you don't want to.
  • Support for more compilers and platforms. CMake is aware of many more compilers and platforms than the Python build system is. Although what's in this PR probably won't work completely out of the box on a platform I've not tested it on, it wouldn't take a huge amount of effort to port to another platform if CMake already knows about it.
  • CMake implements many things out of the box that were very painful (e.g. my work adding DESTDIR support) or would be painful (e.g. support for SOVERSION) to implement in the current build system.
  • The existing python build system is very hard to understand once you get into the nitty gritty details of the generated files. The CMake build system is a lot easier to understand because the logic (that's currently all over the place in mk_util.py and update_api.py) that is associated with a particular Z3 component is expressed in the CMakeLists.txt file for that component. E.g. take a look at src/api/python/CMakeLists.txt or src/api/CMakeLists.txt.
  • You are a lot more likely to get contributions to the CMake build system because it is much more commonly used.
  • Using a more common build system makes life easier for those packaging Z3 (e.g. for Linux Distributions and OSX homebrew).

Disadvantages of having a CMake build system for Z3 available (note I'm saying available; I'm not saying it replaces the python build system):

  • Requires learning how to use CMake. This could be a drain on time.
  • If you want to keep the CMake build working then you will need to modify the CMakeLists.txt file when adding new .cpp files (note .h files don't matter; CMake automatically detects the dependencies there) or removing them. You would also need to add a new CMakeLists.txt file if you added a new component (but that is very straightforward to do). If you don't care about keeping the CMake build working (and just let me send PRs to fix problems, like I'm suggesting) then it adds no overhead at all as I would do the maintenance.
  • Having two build systems might be a little confusing, but if the documentation is clear then there shouldn't be a problem.
  • Currently, because the CMake build system relies on the python scripts to create the generated files, you might need to be more careful about changing update_api.py and mk_util.py. In practice however I don't think this is really a problem because the bits I refactored so I could call into the python code from CMake have not been changed in quite a long time and are unlikely to be changed any time soon.

Worse, if it does interfere with the existing system (which it does, see changes to scripts/*.py), then we get negative benefit. But, as I said, that's just my opinion.

There is a minor inconvenience due to the refactor, but as I mentioned, those parts of the Python build system hardly ever change. Also I have done you a favour with my changes to update_api.py: that script is a mess and I've made it a little less messy so that I can call into it from CMake.

Dan Liew added 2 commits March 4, 2016 15:26
is applied to targets. The ``LINK_FLAGS`` property of a target is
a string and not a list and so if ``Z3_DEPENDENT_EXTRA_CXX_LINK_FLAGS``
contained more than one flag the linker line would end up being
``-flag1;flag2;flag3;...`` which would not work. Now we use a new
function ``z3_append_linker_flag_list_to_target()`` to iterate through
the list and update the ``LINK_FLAGS`` property of the specified target
correctly.
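
A rough sketch of the approach described in this commit message (not the PR's exact code):

    # LINK_FLAGS is a single string, not a list, so append each flag
    # individually instead of assigning the whole ;-separated list.
    function(z3_append_linker_flag_list_to_target target)
      foreach(flag ${ARGN})
        set_property(TARGET "${target}" APPEND_STRING PROPERTY LINK_FLAGS " ${flag}")
      endforeach()
    endfunction()
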
@delcypher
Contributor Author

Rebased and fixed merge conflict with cf5910e

Dan Liew added 2 commits March 4, 2016 15:31
was already marked as executable but it wasn't possible to execute
from a shell due to the missing shebang.
@delcypher
Contributor Author

@wintersteiger Are there any additional changes you'd like me to make? I rebased recently to fix a merge conflict so the changes should still merge cleanly.

@wintersteiger
Contributor

Sorry, but it looks like you have completely taken apart mk_util.py and all API generation code. This will take a lot longer to review and test than I hoped it would.

@delcypher
Contributor Author

@wintersteiger
Yes, I basically needed to expose the code that generates the various .hpp, .cpp and API files as a python function (that I can then call from a utility script) that works with file paths rather than Components, so that it is decoupled from the mk_project.py file.

Most of the changes made to mk_util.py are pretty reasonable IMHO. The changes that are quite significant are those to update_api.py. As the comments in that file state, it really is a big hack. I've tried to make it less of a hack by providing a useful entry point for generating the API files that both the Python and CMake build systems can use.

@delcypher
Contributor Author

If it helps, here are the commits that make the changes to mk_util.py and update_api.py:

9558dc1
ef58179
a134388
2f7f022
2b3fe3d
db34baa
46ac368
f6e9464
8bc7d31

@wintersteiger
Contributor

This looks a lot better now; this is mergeable. I still expect build and test failures in the next few builds, mainly due to the changes in the API generation code. See also the notes below.

Can you tell me what combinations of systems, APIs, and Python versions this has already been tested on?

Some notes:

  • Additional, new .py files unnecessary, could all have stayed in mk_util.py. Perhaps move those new .py files to contrib/cmake as that's the only consumer at this time. Also, lots of unnecessary code duplication in them.
  • (some parts of) all APIs now seem to be built/included when some output-directory is set instead of is_*_enabled?! Unnecessary and unintuitive.
  • Big chunks of code in update_api.py have apparently been moved with slight or no modification, such that no diff tool will show whether there are no, minor, or major changes in behaviour. As a consequence we'll have to test all APIs on all systems to see whether something broke.
  • Contrary to file name and description, mk_consts_files.py seems to generate only consts.py instead of all consts/enums files for all APIs. Unclear whether this is an additional, Python-only "feature", or whether the Python API was the only "interesting" one so far.

wintersteiger merged commit 7fd5042 into Z3Prover:master Mar 7, 2016
@wintersteiger
Contributor

Style note: we use 4 spaces for indentation, the new python parts use 2 spaces, please update your editor to use 4.

@delcypher
Contributor Author

@wintersteiger

This looks a lot better now; this is mergeable. I still expect build and test failures in the next few builds, mainly due to the changes in the API generation code. See also the notes below.

Thanks for merging :)

Can you tell me what combinations of systems, APIs, and Python versions this has already been tested on?

The CMake build system was tested on

  • Arch Linux with python2 and 3 with clang and gcc using CMake 3.4.3
  • Ubuntu 14.04 with python2, gcc and CMake 2.8.12.2

The python build system was tested on

  • Arch Linux with python3 and gcc

Some notes:

Additional, new .py files unnecessary, could all have stayed in mk_util.py. Perhaps move those new .py files to contrib/cmake as that's the only consumer at this time. Also, lots of unnecessary code duplication in them.

Those extra scripts are necessary but that isn't obvious unless you look at how the CMake build system invokes them. The extra python scripts I've added are used as executable programs during the build (the CMake build system generates the files at build time, not configure time). To me this was the only sane way to generate the necessary files to build Z3 from CMake because the logic for generating those files is very complicated and reimplementing that logic in CMake seemed like a waste of time when instead I could refactor the existing python code so I could call it from a python script by passing the correct command line parameters.
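
As a rough illustration of what generating files at build time looks like on the CMake side, a rule of the following shape drives one of those scripts (the exact command line arguments of the PR's scripts are not shown in this conversation; the ones below are placeholders):

    # Illustrative only: a build-time generation rule. The script's real
    # arguments and the variables used here are placeholders.
    find_package(PythonInterp REQUIRED)
    add_custom_command(
      OUTPUT "${CMAKE_CURRENT_BINARY_DIR}/install_tactic.cpp"
      COMMAND "${PYTHON_EXECUTABLE}"
              "${CMAKE_SOURCE_DIR}/scripts/mk_install_tactic_cpp.py"
              "${CMAKE_CURRENT_BINARY_DIR}"
              ${tactic_header_files}
      DEPENDS "${CMAKE_SOURCE_DIR}/scripts/mk_install_tactic_cpp.py"
      COMMENT "Generating install_tactic.cpp"
    )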

By code duplication I assume you mean the def main(args): ... stuff? That really is necessary because each script is a standalone program that can be invoked on the command line. Each script is slightly different and it wasn't obvious at the time if I could refactor this because I was learning what files needed to be generated as I went along.

(some parts of) all APIs now seem to be built/included when some output-directory is set instead of is_*_enabled?! Unnecessary and unintuitive.

Which part of the code are you talking about? Do you mean the update_api.generate_files() method? If so, there is no _*_enabled() usage there because those functions are part of the python build system and I was trying to avoid as much coupling to it as possible. Unfortunately this isn't entirely consistent as there are uses of is_verbose() lying around, but I didn't want to change too much in one go. If I had used the _*_enabled() functions I would have had to start modifying global variables from my scripts, which (IMHO) is horrible and should be avoided where possible.

I would say "unintuitive" here is very subjective. What I was trying to do here was have the ability to generate whatever API files I needed. The old code didn't really give you a choice; it always writes the .cpp, .h and z3_core.py files. At the same time I also needed to be able to choose where those files are written. The existing python build system just emits those files into the source tree, but that is generally considered bad practice, so I needed to control the directory those files would be created in. Therefore it seemed logical to have the generate_files() function take arguments (rather than rely on global state) and have those arguments be the output directories for particular API bindings. If a directory is set to Python's None then that indicates that emission of those binding files is not desired.

The function could probably do with some comments though explaining this. Would you like me to add some?

Aside: The reason that writing generated files into the source tree (rather than the build directory) is considered bad practice is because it potentially prevents having multiple build directories that all share the same source directory.

Big chunks of code in update_api.py have apparently been moved with slight or no modification, such that no diff tool will show whether there are no, minor, or major changes in behaviour. As a consequence we'll have to test all APIs on all systems to see whether something broke.

This is unfortunate but necessary. I couldn't make progress with update_api.py without refactoring it to be more like a callable module rather than what it was (a massive script).

Contrary to file name and description, mk_consts_files.py seems to generate only consts.py instead of all consts/enums files for all APIs. Unclear whether this is an additional, Python-only "feature", or whether the Python API was the only "interesting" one so far.

My intention is to add support for the other bindings at a later date when I (or someone else) has time to add support for them. I did the python bindings first because

  • The python bindings are the easiest to build/install (i.e. nothing to compile)
  • The Python bindings are also very popular so supporting them first seemed like a priority.

@delcypher
Contributor Author

@wintersteiger

Style note: we use 4 spaces for indentation, the new python parts use 2 spaces, please update your editor to use 4.

Sorry about that. I'll try to fix that in the future.

@delcypher
Contributor Author

@wintersteiger

One other thought I had: the bits of the Python build system that the CMake build depends on are quite small, and I think it might be a good idea to separate those parts into their own file, mk_genfile_common.py (or some other name), to

  • Make it clear what things are actually used (and not used) by both the CMake and Python build system.
  • Further decouple the generation of generated files from the Python build system. I.e. in this proposed file mk_util would not be imported or used.

How do you feel about this idea? If you're happy with it I can go ahead and implement it and create a PR.

@wintersteiger
Contributor

@delcypher: previous comment about adding comments: Always a good idea, I'd appreciate if you could add a couple sentences explaining the strategy.

Hmm, mk_genfile_common.py sounds like a good idea to get rid of some of the code duplication, but what else would it improve? Are you proposing to remove mk_util.py but still keep the rest of the old build system alive somehow?

What I would really like is for someone to pick apart all the different APIs in update_api.py, which is a huge mess at the moment. Perhaps provide some common skeleton (like reading the annotations in Z3_api.h), but then have independent code generation for each API, and in their own directories.

delcypher pushed a commit to delcypher/z3 that referenced this pull request Mar 7, 2016
@delcypher
Contributor Author

@wintersteiger

@delcypher: previous comment about adding comments: Always a good idea, I'd appreciate if you could add a couple sentences explaining the strategy.

Done in #490

Hmm, mk_genfile_common.py sounds like a good idea to get rid of some of the code duplication, but what else would it improve? Are you proposing to remove mk_util.py but still keep the rest of the old build system alive somehow?

No, the idea isn't to remove any code duplication. Can we just clarify what code you think is duplicated here?

As explained earlier, by code duplication I'm assuming you mean the

  • mk_consts_files.py
  • mk_def_file.py
  • mk_gparams_register_modules_cpp.py
  • mk_install_tactic_cpp.py
  • mk_mem_initializer_cpp.py
  • mk_pat_db.py
  • pyg2hpp.py

scripts? These scripts just call into code inside mk_util.py; they don't duplicate any code in mk_util.py. These scripts provide a convenient command line interface for generating various files, which the CMake build system uses to generate those files when necessary. Each script has a tiny bit of boilerplate to add a command line parser and then call code inside mk_util to do the work. There is a tiny bit of code duplication here but given how simple the main() functions are I think it's justifiable.

So at least from my perspective I haven't added any unnecessary code duplication. Those scripts need to exist so CMake (and other clients) have a way of generating various files from the command line.

What I'm proposing is to move the functions

  • mk_pat_db_internal()
  • mk_install_tactic_cpp_internal()
  • mk_mem_initializer_cpp_internal()
  • mk_gparams_register_modules_internal()
  • mk_def_file_internal()
  • mk_z3const_py_internal()

and the data structures they depend on into their own file and decouple them from using anything in mk_util.py. The functions in mk_util that called the moved functions will simply call them by importing this new file as a module.

The idea is to basically keep code that is used (directly and indirectly via the previously mentioned script) by both build systems separate from code that is only used by the Python build system.

The advantage is that developers would then know when they are touching code that affects both build systems or code that just affects the python build system.

This is quite minor so I don't feel particularly strongly about doing this, but it was just an idea I had.

What I would really like is for someone to pick apart all the different APIs in update_api.py, which is a huge mess at the moment. Perhaps provide some common skeleton (like reading the annotations in Z3_api.h), but then have independent code generation for each API, and in their own directories.

I agree this is a bit of a mess right now. Perhaps you should open an issue about this so we don't forget. Is there any reason you don't use SWIG to generate some of the boilerplate necessary for calling the C APIs from various languages? Either way, if you create an issue you should probably mention whether there are any reasons not to use it.

@wintersteiger
Contributor

No, I meant the command line args validation etc in the new .py files.

Moving those functions into their own modules is probably a sensible thing to do; perhaps they could also move closer to the files they relate to (e.g., mem_init*, gparams* into the dll folder, z3consts_py* into api/python, etc.).

SWIG never came up and there was never a need for it. The overall strategy has always been to use fewer tools rather than more, because of various cross-platform issues. Be warned though, if you dig into this pit, you may find yourself being the maintainer of some of those APIs. :-)

@jirislaby

@delcypher Could you make Z3_INSTALL_*_DIR options? FWIW, we need to install to lib64 on x86_64.

@delcypher
Contributor Author

@jirislaby It looks like CMake ships GNUInstallDirs.cmake, which provides what you need and a whole lot more. I'll create a PR and ping you to test it once it's done.
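
For reference, usage is roughly as follows (a sketch; the target and header names are placeholders, not the PR's actual install rules):

    # GNUInstallDirs defines CMAKE_INSTALL_LIBDIR, CMAKE_INSTALL_BINDIR, etc.,
    # which packagers can override, e.g. with -DCMAKE_INSTALL_LIBDIR=lib64.
    include(GNUInstallDirs)
    install(TARGETS libz3
      LIBRARY DESTINATION "${CMAKE_INSTALL_LIBDIR}"
      ARCHIVE DESTINATION "${CMAKE_INSTALL_LIBDIR}"
      RUNTIME DESTINATION "${CMAKE_INSTALL_BINDIR}"
    )
    install(FILES z3.h z3++.h DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}")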

delcypher pushed a commit to delcypher/z3 that referenced this pull request Jun 12, 2017
`contrib/cmake/bootstrap.py` script no longer needs to be executed.

The previous location of the CMake files was a compromise proposed
by @agurfinkel in Z3Prover#461. While this has served us well (allowing progress
to be made) over time limitations of this approach have appeared.

The main problem is that doing many git operations (e.g. pull, rebase)
means the CMake files don't get updated unless the user remembers to
run the script.

This commit only does the file moving and necessary changes to
`.gitignore`. Other changes will be done in subsequent commits.
delcypher pushed a commit to delcypher/z3 that referenced this pull request Jun 12, 2017
`contrib/cmake/bootstrap.py` script no longer needs to be executed.

The previous location of the CMake files was a compromise proposed
by @agurfinkel in Z3Prover#461. While this has served us well (allowing progress
to be made) over time limitations of this approach have appeared.

The main problem is that doing many git operations (e.g. pull, rebase)
means the CMake files don't get updated unless the user remembers to
run the script. This can lead to broken and confusing build system
behaviour.

This commit only does the file moving and necessary changes to
`.gitignore`. Other changes will be done in subsequent commits.