Encapsulate LibGit2Sharp and expose our own abstraction layer for the bits we need to. #3
And here is a related gist on this topic, explaining some work done in GitReleaseNotes to encapsulate LibGit2Sharp and expose a similar abstraction layer.
@JakeGinnivan - I'm struggling to understand the following class - is this fairly useful helper-method functionality that could be placed in the core, do you think? Would be grateful if you could help me understand its usage a bit more?
I've gone slightly off track and started to create a fluent API for git, over here: https://github.com/dazinator/FluentGit I've got some simple tests passing. My aim is to go through the tools and look at the git functions they need, i.e. cloning / fetching / examining tags / examining remotes / examining branches / committing etc. I'll be looking at surfacing those functions in FluentGit over time.
Cool, you should share with the libgit2sharp guys. What I was thinking of is more a facade over git for our uses in GitVersion. For example:
Then we would have a switching layer which would pick the best implementation; to start there would just be one. Then when the build server checks out a single commit we can just hit the GitHub APIs to get the branch/tag/commit info we need for GitVersion to run properly, and we don't need to do the dynamic repo stuff at all. The dynamic repo stuff would then be moved behind the LibGit2Facade, because it is an implementation detail of 'the repo GitVersion is running in is not a complete repo, so we are missing information'.
We might even be able to use it. Assuming libgit2sharp supports it.
Where would you like to see the documentation?
@GeertvanHorrik for this interface? In the GitTools.Core repo. But basically the idea is to look at the git usage across GitVersion, GitReleaseNotes and GitHubReleaseManager, and create a facade which hides us from the details of what needs to happen.
I mean: @dazinator requested docs on how the dynamic repo stuff works. Is it worth creating a GitTools-docs repo, or will we also put that on readme.io?
So the facade would not have concepts of cloning, current state/working directory etc. It should just be a way to query metadata and maybe fetch the content of a blob (so we can fetch config files or something).
@GeertvanHorrik ah ok. Just throw it in the wiki for the moment.
@GeertvanHorrik - thank you for that. There are some additional things happening though that I'd be really grateful for any insight with, as my git knowledge isn't the best (I'm working on that!) After the bare clone is made, and before the fetch:
After the Fetch is done, but before switching to the correct branch:
I am wondering if this logic needs to be revisited before it's pulled into the core verbatim. Some questions are:
I had no idea GitHub offered an API for querying repo information - doh! One that gives an alternative to using libgit2sharp then - but obviously only for GitHub repos! So your facade idea now makes complete sense to me. However, what's the advantage you see in having multiple APIs to implement (libgit2sharp, the GitHub API, etc.) over cloning and always using one API (i.e. libgit2sharp)?
A 1/2) Ensuring the remote is just to get the right remote (PR / branch / whatever). The refs are required for pull requests.
B 1) A bare clone only clones the default branch; we need to ensure all the branches we need are available.
C 1/2) I don't think we check for one remote. I think we search for the right remote.
@GeertvanHorrik - this is the ensure remote code I took from GitReleaseNotes - seems to throw if not 1 remote.
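For reference, the check behaves roughly like this - a hypothetical reconstruction, not the exact GitReleaseNotes source; the class/method names and the exception message are my guesses, only the "throws unless exactly one remote" behaviour comes from the thread:

```csharp
// Hypothetical reconstruction: throws unless the repository has
// exactly one remote configured, otherwise returns that remote.
using System;
using System.Linq;
using LibGit2Sharp;

static class RemoteHelper
{
    public static Remote EnsureOnlyOneRemoteIsDefined(IRepository repo)
    {
        var remotes = repo.Network.Remotes.ToList();
        if (remotes.Count == 1)
            return remotes[0];

        var names = string.Join(", ", remotes.Select(r => r.Name));
        throw new InvalidOperationException(
            $"{remotes.Count} remote(s) found ({names}); expected exactly one.");
    }
}
```

Which, as discussed above, is arguably too strict - searching for the *right* remote rather than requiring a single one would be friendlier.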
Definitely not my code because it involves TeamCity, I don't use that ;-)
haha ok - sounds like I shouldn't use this code verbatim then :)
I had some hand in this here as well: https://github.com/ParticularLabs/GitVersion/pull/262/files This was based on conversations with @nulltoken. These issues were found when implementing support for AppVeyor, but it was decided that they could be brought into the core GitHelpers as well.
Performance: we don't need the blobs from the repository, we just need the metadata (refs etc). If we optimise for GitHub, then when using GitHub it will be super fast with no cloning/disk IO. This will be especially handy for things like https://github.com/JakeGinnivan/GitReleaseNotes/tree/master/src/GitReleaseNotes.Website which we could host on Azure or somewhere, and saving the bandwidth costs for GitHub could make a difference if they get popular.
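To make the "metadata only" idea concrete, here's a sketch of reading branch refs straight over the GitHub v3 Git Data API with plain HTTP - no clone, no disk IO. The repo URL is just an example and error handling/auth are omitted:

```csharp
// Sketch: query ref metadata from the GitHub API instead of cloning.
// The endpoint returns a JSON array of { "ref", "object": { "sha", ... } }.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class GitHubRefReader
{
    static async Task Main()
    {
        using var client = new HttpClient();
        // GitHub requires a User-Agent header on API requests.
        client.DefaultRequestHeaders.UserAgent.ParseAdd("GitTools");

        var json = await client.GetStringAsync(
            "https://api.github.com/repos/GitTools/GitVersion/git/refs/heads");
        Console.WriteLine(json);
    }
}
```

The payload is a few kilobytes of refs and SHAs, versus a fetch that has to transfer pack data for any changed blobs.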
When building pull requests it is a hidden ref; it is not under refs/heads/.
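Concretely, GitHub advertises each pull request head as `refs/pull/<number>/head`, so pulling one down takes an explicit refspec. A sketch using LibGit2Sharp's `Commands.Fetch` (the PR number 123 and the local path are placeholders):

```csharp
// Sketch: fetch a hidden GitHub pull request ref into a local
// remote-tracking branch using an explicit refspec.
using LibGit2Sharp;

class PullRequestFetch
{
    static void Main()
    {
        using var repo = new Repository("/path/to/repo");

        // "123" is a placeholder pull request number.
        Commands.Fetch(
            repo,
            "origin",
            new[] { "+refs/pull/123/head:refs/remotes/origin/pr/123" },
            new FetchOptions(),
            logMessage: null);
    }
}
```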
A dynamic repo is created per repo url, so if that changes it will be a fresh clone.
I think this is only checked if we are running on a build server. Running locally in a real repo should not do any fetching or these checks.
Nice. But just to play devil's advocate, wouldn't performing ongoing fetches on a repo (and then querying the repo directly, locally) be faster / more efficient than querying across the network, i.e. hitting the GitHub API to get stuff every time? I guess my point here is that there's some kind of trade-off between a local cache with incremental updates (i.e. pay a high price initially to populate the cache) versus continually accessing the same info across the network (info that could have been held locally, for example).
Any blobs should be accessible via the working directory rather than the git repo. And we have to fetch the metadata every time, yes, but the payloads should be small. With a local clone we still have to do a fetch, which will include the changes to blobs and will likely be much larger than any API request.
Focusing back on remote repo support in the core.
@JakeGinnivan thanks for pointing me in the right direction there, I've educated myself a bit more fully on that topic now :) I'm now thinking about how the core will fetch optimally - i.e. only the branches / PR branches that each tool needs. For example, currently we ensure the remote has refspecs for all branches - but then doing a fetch could be wasteful if the tool in question only actually requires a single branch to be fetched - so perhaps this doesn't belong in the core. I'd hazard that each tool will need total control over the refspecs used, and over fetch operations. I'm thinking it would be better to:
Something along those lines..
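One possible shape for "each tool controls its own refspecs" - a rough sketch, all names here are made up, not an agreed design:

```csharp
// Hypothetical: each tool declares the refs it actually needs,
// and the core fetches only those instead of every branch.
using System.Collections.Generic;
using LibGit2Sharp;

interface IRefSpecProvider
{
    // e.g. "+refs/heads/master:refs/remotes/origin/master"
    IEnumerable<string> RequiredRefSpecs { get; }
}

static class SelectiveFetcher
{
    public static void Fetch(Repository repo, string remoteName, IRefSpecProvider tool)
    {
        Commands.Fetch(repo, remoteName, tool.RequiredRefSpecs,
                       new FetchOptions(), logMessage: null);
    }
}
```

The core would then stay a dumb executor of fetches, with the per-tool knowledge ("GitVersion needs master and all tags", "GitReleaseNotes needs X") living in the tools themselves.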
The issue is that, say for GitVersion, if you are on develop we need to fetch all tags and master. With feature branches we do inheritance of config based on the source branch. Without all branches, I'm not sure we could do that? If we could work remotely with refspecs without cloning/fetching, that may work?
Sorry for my absence, had a few deadlines. Going to pick up GT again this week. What's the status of this feature? Is it manageable for the next release?
I've not had any further time to look at this either - apologies. I was going to go through each of the tools to list out how it uses git, and sketch something like:

```csharp
interface IGitCoreFacade
{
    IEnumerable<BranchInfo> GetBranches(GitCoreContext context);
    IEnumerable<CommitInfo> GetCommits(GitCoreContext context);
    IEnumerable<TagInfo> GetTags(GitCoreContext context);
    string GetTextFileContent(GitCoreContext context);
}
```

I don't think I'll be able to start on this for a while, so if someone wants to try to make some headway with it then feel free..
As GitVersion stopped using
I think we are all agreed this should be a long-term goal for the core (correct me if I am wrong).
I'm not sure it's better to leave this until version 2: if we do, we will potentially have to refactor all the tools multiple times (once for version 1, and again for version 2).
Perhaps it would be better to have this in the initial release, as it will no doubt form a major part of the public API for GitCore? What do you think?