
Could this project be run by a script build process such as scripty


TLDR

EF6 = No, EFCore = Yes (see warning below).

  • EF6 uses Visual Studio (via DTE) to add/remove files from the project and is completely hooked into it, so it cannot be driven by a script build process. However, new-style projects that use the SDK-style .csproj format can be, because generated files no longer have to be registered in the project file.
  • .NET Core is different: you generate the files and they are automatically picked up and included in the project (see the project file sketch below).
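
For illustration, here is a minimal SDK-style project file; the target framework, package name and version are placeholders, not a recommendation. With this format, any generated .cs files dropped into the project folder are compiled automatically, so there are no <Compile Include="..."> entries to keep in sync:

```xml
<!-- Minimal SDK-style .csproj (illustrative only). Generated .cs files in the
     project folder are included by the default **/*.cs glob, so no DTE call
     is needed to add them to the project. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- Package and version are placeholders for whatever provider you use. -->
    <PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="8.0.0" />
  </ItemGroup>
</Project>
```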

Once I've added the ability to reverse engineer Oracle, MySQL and PostgreSQL, I'll add a command-line version you can call. The only parameter would be the <database>.tt file, as all the custom settings are already included in the .tt file.

EF6

This project definitely needs access to Visual Studio's DTE so it can add/remove files from the project as things get generated and/or filtered out (a sketch of the kind of DTE call involved follows this paragraph). It sounds like scripty supports this via Roslyn, not internally via DTE. If you build with scripty/PowerShell, do you have to reload your VS project every time because it was modified outside of Visual Studio? That would become very annoying very quickly.
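
For context, this is roughly the kind of DTE call the EF6 workflow depends on. This is a minimal sketch, not the generator's actual code: it assumes a reference to EnvDTE and code running inside Visual Studio, and the class, method and file names are illustrative.

```csharp
// Illustrative only: how a VS-hosted generator typically registers or removes
// a generated file via DTE. Requires EnvDTE and must run inside Visual Studio.
using EnvDTE;

public static class DteHelper
{
    // Adds a generated .cs file under its parent .tt item so it nests in
    // Solution Explorer, e.g. Database.tt -> Database.cs.
    public static void AddGeneratedFile(ProjectItem templateItem, string generatedFilePath)
    {
        templateItem.ProjectItems.AddFromFile(generatedFilePath);
    }

    // Removes a previously generated item from the project
    // (the file itself stays on disk).
    public static void RemoveGeneratedFile(ProjectItem generatedItem)
    {
        generatedItem.Remove();
    }
}
```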

You really need to code against the generated POCOs and context at design time (see the sketch below); you don't want to wait until build time for code you need to program against to be generated. I don't want to have to wait for code generation on every build. I want to reverse engineer once, and only reverse engineer again if the database changes, because doing it on every build with scripty would slow things down.
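
As a hedged illustration of what "coding against" the generated output means: MyDbContext and Customer below are hypothetical stand-ins for whatever the reverse engineering produces, and the real generated code is much richer. The point is that the compiler and IntelliSense need these types to exist while you are writing the hand-written code, not at build time.

```csharp
// Hypothetical shapes of the reverse-engineered output (illustrative only).
using System.Data.Entity;   // EF6; for EF Core this would be Microsoft.EntityFrameworkCore
using System.Linq;

public class Customer                     // generated POCO (stand-in)
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

public class MyDbContext : DbContext      // generated context (stand-in)
{
    public DbSet<Customer> Customers { get; set; }
}

public class CustomerService              // hand-written code using the generated types
{
    public string GetCustomerName(int customerId)
    {
        using (var db = new MyDbContext())
        {
            return db.Customers
                     .Where(c => c.CustomerId == customerId)
                     .Select(c => c.Name)
                     .FirstOrDefault();
        }
    }
}
```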

If it's a case of regenerating the code at compile time on a build server against a production database, then scripty could work in that circumstance. It also sounds like the generator would still be able to iterate through all the projects in the Visual Studio solution to find the required connection string. However, that could lead to problems where your code works on your dev machine but not in QA/UAT/PROD.

The best way of working is to generate what you need up front, code against the context + POCOs you have, unit and integration test them, and then push them to the build server. The build server builds your code (with no hidden changes by scripty), and then it goes on to QA/UAT and finally PROD.

Warning

Whilst scripty sounds good in principle, I would hate the situation where something is failing QA and I can't find out why, because scripty may have done something different on the build server than on my dev box, and I can't debug the error because I can't reproduce it. Therefore, if scripty makes code changes on the build server, you will have to automate committing those changes back to source control from the build server. Your dev procedure would then be: code, unit test, run and test, push to the build server, wait for the build, and pull back any code changes. If there are code changes: unit test, run and test, push to the build server again knowing it now works, then release to QA. Hmm, a bit of a smell. I for one would not like to work like that.

So the only way to work with it is not to automate it at build time, and not to automate anything on the build server either, but to run scripty manually up front during development, reload the project, and continue from that point. That is essentially how we work with this project today, minus having to reload the project. I really like the idea behind scripty; it would be ideal for code analysis tools, etc. I just think the scripty experience would not be as slick as the current one because of having to reload the project.

What are your thoughts on generating code outside of Visual Studio, or on a build server? A build could fail for reasons such as a database patch missing from the build server database, or worse: the build works on the build server (say, new columns have been added to a table there), so it builds okay, but the QA database does not have those new columns and the application fails when SaveChanges() is called (a sketch of that runtime failure follows). In my view, builds should always work precisely on the tagged code pushed into source control.
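
To make that failure mode concrete: a schema mismatch like the missing column above compiles and builds cleanly and only surfaces at runtime, typically as a DbUpdateException wrapping the provider error. A minimal EF6 sketch, reusing the hypothetical MyDbContext and Customer types from the earlier example:

```csharp
// Illustrative only: a column that exists in the build-server database but not
// in QA builds fine, and only fails when EF tries to write it at runtime.
using System;
using System.Data.Entity.Infrastructure;

public static class SchemaMismatchExample
{
    public static void Save(MyDbContext db, Customer customer)
    {
        db.Customers.Add(customer);
        try
        {
            db.SaveChanges();
        }
        catch (DbUpdateException ex)
        {
            // e.g. the underlying SqlException: "Invalid column name 'NewColumn'."
            Console.WriteLine("Schema mismatch at runtime: " + ex.GetBaseException().Message);
            throw;
        }
    }
}
```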