Practical Git and GitHub
Dinis Cruz


Leanpub book, originally based on Blog posts

This book started with the Git/GitHub-related posts published on my blog, and eventually evolved into the book you now have in your hands (or eReader).

The idea of using my blog posts was to:

  • kickstart the creation of this book (and provide a number of chapters that could be further improved)
  • capture the multiple experiments and learning curves that I experienced (while learning how to use Git/Github)
  • provide a ‘real world’ point of view on how to learn and use Git/Github

The theme of this book is ‘Practical’ (hence the name ‘Practical Git and GitHub’), and what I am trying to do here is to show how to use these technologies in practical situations, especially how to apply them to solve real problems. This means that sometimes the end result is a bit messy, with lots of rabbit-holes/tangents followed (and un-followed). But that is how it works: when learning a new technology, what is really important is to experiment a lot and see how it works for the problem at hand.

Notes about current structure

At the moment, the chapter order is the one created by the original ‘import from blogger’ (i.e. by publish order). A better idea might be to create logical groups, so that the posts are ordered based on some user-friendly criteria (still to be defined).

At the moment I’m working on (and thinking about) the best way to structure this content, and how to present it in an easy to read/consume format.

It would be great if you (the reader) could provide some feedback on the book’s structure, for example:

  • if you think the book’s order should be different (or chapters should be renamed)
  • if there is content missing (i.e. something really important to cover in a Git/GitHub book)
  • if a particular chapter is not very clear, easy to understand or relevant
  • etc…

About the Author

Dinis Cruz is a Developer and Application Security Engineer focused on how to develop secure applications. A key drive is ‘Automating Application Security Knowledge and Workflows’, which is the main concept behind the OWASP O2 Platform and the FluentSharp APIs.

After many years (and multiple roles) Dinis is still very active at OWASP, currently leading the OWASP O2 Platform project and helping out other projects and initiatives.

After failing to scale his own security knowledge, learning Git, creating security vulnerabilities in code published to production servers, delivering training to developers, and building multiple CI (Continuous Integration) environments, Dinis had the epiphany that the key to application security is “Secure Continuous Delivery: Developer’s Immediate Connection to What They’re Creating”. This ‘Immediate Connection/Feedback’ concept is deeply rooted in the development of the O2 Platform, and is something that will keep Dinis busy for many years.

Change log

Here are the changes made (per version):

  • v0.41 (Jan 2016)
    • Fixed bug where only October-2012 files were published
  • v0.40 (Jan 2016)
  • v0.12 (April 2014)
    • renamed a number of files to have the name of the post (with spaces) and the MD extension
    • added file
  • v0.10 (March 2014)
    • First release of book with raw import from blogger posts (no formatting or editing done)
    • Created Git repo on local dropbox sync folder
    • created github repository for this book:
    • Added this change log

1. October 2012

  • Using a Git Branch to fix a Bug
  • Git and GitHub commands to create and deploy new version of TeamMentor
  • Idea: Sync Blogger Posts with a GitHub repository
  • Approving a GitHub Pull Request Workflow
  • Deploying TeamMentor to AppHarbor (.NET Cloud) using Git push
  • The need to create forks/clones for website
  • Using Git Branches to deal with the multi-config variations of TeamMentor
  • Handling content changes made on hosted site created by Git clone (with auto Git commits and pushes)
  • Going back in time using Git’s checkout
  • Adding Tags to TeamMentor Master repository
  • Creating the final TeamMentor with SI Library repository via multiple Git pulls and pushes

Using a Git Branch to fix a Bug

In Git I don’t tend to use branches for big code changes; the only time I tend to use them (during development) is when trying out new code changes or fixes.

For example, last night I was working on a CSRF fix for TeamMentor (more on that later) and I created a new branch (using $ git checkout -b crst_test) which contained a number of temp commits, like the ones that selectively disabled some Admin security demands so that I could debug the issue better :)

As you can see by the commits, I made a number of changes that were only pushed to the crst_test branch ($ git push origin crst_test:crst_test) and not propagated to the main code base.

Then, once I finally found and fixed the issue (see last commit), I went back to the main branch ($ git checkout master), added my fixes, committed them, and pushed them to the master branch ($ git push origin master).

At the moment this is what the main repo looks like:

The final step is to remove the local branch ($ git branch -D crst_test) and the branch from GitHub ($ git push tm_master --delete crst_test); after GitHub recreates the graph (sometimes it takes a couple of minutes), it looks like this:
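The branch-based bug-fix workflow above can be replayed end to end on a throw-away local repository (branch and commit messages borrowed from the post; the push/delete commands that need a remote are left as comments, since this sketch has none):

```shell
set -e
# throw-away repo standing in for the TeamMentor clone
rm -rf /tmp/branch-fix-demo && mkdir /tmp/branch-fix-demo && cd /tmp/branch-fix-demo
git init -q
git config user.email demo@example.com && git config user.name demo
echo "v1" > app.txt
git add app.txt && git commit -qm "initial commit"
main=$(git symbolic-ref --short HEAD)   # master or main, depending on git version

# 1) create and switch to a throw-away branch for the fix
git checkout -q -b crst_test
echo "debug: admin checks disabled" >> app.txt
git commit -qam "temp: disable Admin security demands for debugging"
# (with a remote you would now: git push origin crst_test:crst_test)

# 2) fix found: apply it on the main branch instead
git checkout -q "$main"
echo "CSRF token check added" >> app.txt
git commit -qam "CSRF fix"
# (with a remote: git push origin master)

# 3) clean up the experimental branch
git branch -D crst_test
# (with a remote: git push origin --delete crst_test)
git log --oneline
```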

Git and GitHub commands to create and deploy new version of TeamMentor

With the new Release of TeamMentor, I needed to upgrade the main GitHub repositories to the new version. This is the detailed version of the Git and GitHub workflow used.

My objective was to create a set-up where I could push the new code base (3.2) while keeping a clean separation from the previous one(s).

My solution was to use branches to hold each version (yes I could have used tags, but I prefer the isolation provided by branches).

The actions taken are:

  • Identify the repository with the latest version of TeamMentor (the Code)
  • Identify the repository with the TeamMentor Library to use (the Content)
  • Identify the repository to upgrade (with both the Code and Content)
  • Download the repository to upgrade and store it on a separate branch (the Backup)
  • Download the Code repository
  • Download and merge the Content repository
  • Publish changes

More specifically (using some Git commands):

  • Set up tm_master remote with the latest version of source-code (for example):
  • Set up tm_library remote with the TeamMentor library to use (for example):
  • Set up origin remote with the version to upgrade, which is the Code+Content version (for example):
  • pull origin into master
  • move the master branch to master_Old.Version (for example master_3_1)
  • create a new ‘orphan’ branch called master (and delete all files from it)
  • pull tm_master into master
  • pull (using tree merge strategy) tm_library into the Library_Data/XmlDatabase/TM_Libraries folder
  • push all branches to origin
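The heart of the list above is the subtree merge of the Content repository into a subfolder of the Code repository. Here is a minimal, self-contained sketch of that merge, with two local repositories standing in for tm_master and tm_library (repo paths and commit messages are invented for the demo; the subtree prefix is the one from the list):

```shell
set -e
base=/tmp/subtree-demo; rm -rf "$base"; mkdir -p "$base"

# stand-ins for the tm_master (Code) and tm_library (Content) repositories
for r in code library; do
  git init -q "$base/$r"
  git -C "$base/$r" config user.email demo@example.com
  git -C "$base/$r" config user.name demo
done
echo "app" > "$base/code/app.txt"
git -C "$base/code" add . && git -C "$base/code" commit -qm "code: initial"
echo "articles" > "$base/library/articles.txt"
git -C "$base/library" add . && git -C "$base/library" commit -qm "library: initial"
libbranch=$(git -C "$base/library" symbolic-ref --short HEAD)

# the combined Code+Content repository starts as a clone of the Code repo
git clone -q "$base/code" "$base/combined"
cd "$base/combined"
git config user.email demo@example.com && git config user.name demo
git remote add tm_library "$base/library"
git fetch -q tm_library

# subtree-merge the library into a subfolder, keeping its commit history
prefix=Library_Data/XmlDatabase/TM_Libraries
git merge -s ours --no-commit --allow-unrelated-histories "tm_library/$libbranch"
git read-tree --prefix="$prefix/" -u "tm_library/$libbranch"
git commit -qm "subtree merge of tm_library into $prefix"
ls "$prefix"
```

After this, `git log` shows the commits of both repositories, which is exactly what gitk visualizes in step 4 below.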

Step-by-Step workflow:

1) Create a local clone from the version to upgrade and back it up:

In this case from

Once the repository is created locally, zip it up and upload it to the GitHub repository download section. It is important that this is the version with the .git folder (and not the .zip you can download from the repository main page). The fact that we can easily create a full backup of a complete Git repository is one of my favorite features of Git, and it allows for an easy restore for the inevitable mistakes that will be made :)

Tip: make sure to test the download, since I’ve seen cases where the upload fails and, even though there is a download link on the page, there is no file to download:

Once you’re happy that you have a backup, you can delete the locally created folder (the one you zipped).
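Two ways to produce the kind of full backup described above (one that includes the .git folder, not just a snapshot of the files): archive the clone directory itself, or use git bundle, which packs the entire history into a single restorable file. A sketch on a throw-away repository:

```shell
set -e
# throw-away repo to back up
rm -rf /tmp/backup-demo && mkdir -p /tmp/backup-demo/repo && cd /tmp/backup-demo/repo
git init -q
git config user.email demo@example.com && git config user.name demo
echo "content" > file.txt && git add . && git commit -qm "initial"

cd /tmp/backup-demo

# option 1: archive the whole clone, .git folder included
tar czf repo-full-backup.tar.gz repo      # zip -r works the same way

# option 2: a git bundle holds every ref and commit in one file
git -C repo bundle create ../repo.bundle --all HEAD

# restoring from the bundle is just a clone
git clone -q repo.bundle restored
ls restored
```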

2) Create a local repository with new version:

Start by creating a folder to hold the files (in this case E:\teammentor_TM_Releases\TeamMentor_with_OWASP_Library)

Open a Git Bash and cd into that folder

Execute the following commands in sequence:

After completion the target folder should look like this:

To quickly test TeamMentor, double click on ‘start TeamMentor.bat’

3) Test rolling back to an earlier version:

What is really cool with this set-up (and the power of Git) is that it is very easy to change to an earlier version. Leaving the webserver on (which is running from the Tools\CassiniDev_4.0.ex folder), go to the Git Bash and run the command: $ git checkout -f TeamMentor_3_1

What this will do is to replace the current files in the target dir with the files from the TeamMentor_3_1 branch (which are the files from the repository we are upgrading).

This is what the file system looked like before the $ git checkout -f TeamMentor_3_1 command:

And this is after

Can you spot the differences?

I can’t overstate how powerful this is, especially given the speed at which it happens. What we are looking at here is a virtual file-system based on a Git database. It was this feature that really gave me a massive paradigm shift, and made me realize that Git gives us a Version Control File-System (maybe one day we will have an OS-level git-based file-system).

And just to make sure that we did change into the previous version of TeamMentor, refresh the browser and reopen (which will be the 3.1 version of TM)

Now go back to the Git Bash, switch to the master branch (via $ git checkout -f master) and refresh the browser:

And we’re back to version 3.2 (note the version number on the top right, under the ‘Sign Up’ link).

I have to say that I never get tired of seeing this ‘virtual file system created by branches’ in action. Try the checkouts a couple of times with a Windows Explorer window open (so that you see the files change in real time).
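The ‘files changing in real time’ effect is easy to reproduce on any repository with two branches holding different files; this toy sketch (branch name borrowed from this chapter) shows the working folder being rewritten on each checkout:

```shell
set -e
rm -rf /tmp/checkout-demo && mkdir /tmp/checkout-demo && cd /tmp/checkout-demo
git init -q
git config user.email demo@example.com && git config user.name demo

# the 'current' version lives on the default branch
echo "3.2 code" > version.txt
git add . && git commit -qm "TM 3.2"
main=$(git symbolic-ref --short HEAD)

# an older version lives on its own branch, with a file the new one lacks
git checkout -q -b TeamMentor_3_1
echo "3.1 code" > version.txt
echo "legacy" > old_file.txt
git add . && git commit -qm "TM 3.1"

# flip between versions; the working folder is rewritten each time
git checkout -q -f "$main"
cat version.txt            # prints: 3.2 code  (old_file.txt is gone)
git checkout -q -f TeamMentor_3_1
cat version.txt            # prints: 3.1 code  (old_file.txt is back)
```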

4) Confirm that both repositories have merged successfully:

A key part of the structure created is the ability to have a merge of two Git repositories (one with the Code and one with the Data) while keeping track of their commit history (and allowing pulls of new content/data).

This is achieved using the Git subtree merge strategy, which can be seen in action if you run the $ gitk command (note: I did a $ git stash first to remove a couple of temporary files created by the TeamMentor engine).

On the Git Commit tree shown above, the dots on the left column are from the tm_master repository (the Code) and the ones on the next column are from the tm_library repository (i.e. we have the Commits from both repositories :) )

5) Push changes to the main repository:

The final step is to push the changes (of both branches) to the origin repository using:

$ git push -f origin master:master and
$ git push -f origin TeamMentor_3_1:TeamMentor_3_1

After completion we can double-check that the master branch is now the 3.2 version:

… and the TeamMentor_3_1 branch is the 3.1 version:

6) Download the zip file and check that everything is okay:

As a final test, download the zip file from the repository master branch home page:

Extract the zip file somewhere on your disk, stop the running Cassini server (if it’s still running) and double-click on the ‘Start TeamMentor.bat’ file:

And you will see a clean 3.2 version of TeamMentor running locally:

Idea: Sync Blogger Posts with a GitHub repository

From the end of the ‘So if my blog account is compromised can I sue Google?’ post comes an interesting WebService idea:

Sync Blogger posts with a GitHub repository

The idea is to backup the contents of a blogger account into a Git repository hosted by GitHub, which would give it version control and reusability.

In practice this shouldn’t be that hard:

  • Subscribe to RSS feed (starting with the big XML export that Blogger already provides)
  • Create Git repository locally with ability to:
    • Push to GitHub
    • Download
    • Pull directly
  • There needs to be some thinking on the best way to organize the files on the git repository
  • It would be really cool if the files could be stored in a way that they could be consumed by other tools (like TeamMentor or )
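A minimal sketch of the backup idea, with two fake posts standing in for the Blogger export (the feed URL in the comment is a placeholder, and the posts/year-month layout is just one possible way to organize the files, as the list above notes):

```shell
set -e
rm -rf /tmp/blog-sync-demo && mkdir /tmp/blog-sync-demo && cd /tmp/blog-sync-demo
git init -q
git config user.email demo@example.com && git config user.name demo

# the real version would start from Blogger's XML export, e.g.:
#   curl -o export.xml "https://<your-blog>.blogspot.com/feeds/posts/default"
# here two posts are faked to show the layout + commit step
mkdir -p posts/2012-10
echo "<p>post one</p>" > "posts/2012-10/Using a Git Branch to fix a Bug.html"
echo "<p>post two</p>" > "posts/2012-10/Approving a Pull Request.html"

git add posts
git commit -qm "sync: 2 posts from blogger export"
# with a GitHub remote configured, a final 'git push origin master' publishes the backup
git log --oneline
```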

Approving a GitHub Pull Request Workflow

I just received a GitHub’s Pull Request from Roman for some new content that he added to TeamMentor’s Documentation Site.

Here is the workflow I used to approve this request using GitHub’s web-based workflow:

1) Receive GitHub email alert:

2) Go to GitHub and see the Pull Request there:

The main page gives us a nice overview of the Pull Request:

Here are the Commits:

On the Files Changed tab we can easily see (in colours) the proposed changes.

For example, here is a change to the main TM Library xml file with a couple new articles added:

Here are a couple lines removed and some added:

This is a new file:

These are a couple metadata changes:

3) Approving the Pull request

Going back to the first page (the Discussion tab), the most important part of this whole process is the green bar that shows that this Pull Request can be merged OK.

This basically means that there are no conflicts between the new changes and the current content. When the merge is not possible (and you get a red bar), the best thing is to do it ‘manually’ (i.e. via a Git Bash on your local box).

When you click on the ‘Merge Pull Request’ button you get this confirmation request:

And clicking on ‘Confirm Merge’ will do the commit and close this Pull Request:

Deploying TeamMentor to AppHarbor (.NET Cloud) using Git push

Now that AppHarbor supports git publishing (i.e. they create a git repository for an ‘AppHarbor Application’), it is very easy and fast to deploy a new version of TeamMentor.

The first step is to go to AppHarbor and create an application (for this example I’m calling it ‘Testing-AppHarbor’):

Once the application is created, click on the bottom-left ‘Repository URL’ button, which will copy to your clipboard the AppHarbor Git url:

In this example it was:

Next, open a Git Bash in the repository you want to push and execute:

$ git remote add appharbor_test

$ git push appharbor_test master

After a bit (depending on your upload speed) you should get a Git push message

Back in the AppHarbor website, the application page should look like this:

You can click on the icon under the ‘Status’ column (the one animating) to see the current status of the build (note that sometimes AppHarbor takes a couple minutes to trigger the compilation process)

The ‘Details’ link can be used to see the MSBuild compilation log (very useful when the compilation fails)

Once the build and deployment are done, if you go back to the Application page you should see that your build is now Active

And clicking on ‘Go to your application’ will take you to the website you just created

By default there are no libraries installed from a TeamMentor/Master clone/copy, but this is easily solved by:

  • logging in as Admin,
  • going to the Control Panel
  • choosing the **Advanced Admin Tools** option
  • using the Install/Upload Libraries tool.

For example, click on the ‘Top 20 Vulnerabilities’ link to add the Library hosted at GitHub’s TeamMentor/Library_Top_Vulnerabilities (the TeamMentor engine will go to that repository, download the ZIP file and install it)

Once that is done, click on ‘Open Main Page’ to go back to the main TM GUI, where the ‘Top 20 Vulnerabilities’ Library is now installed:

Auto deploy on GitHub Commit

So far we have talked about how to push a git repository into AppHarbor from your local disk, but that is not the only way to do it.

AppHarbor also supports GitHub Service Hooks which can be configured via the GitHub’s repository admin panel.

For example, here is how I use AppHarbor to create a new deployment every time I do a Git push into the main TeamMentor/Master repository (which is very useful for QA and testing)
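Since the AppHarbor repository URL is specific to each application, the remote-add and push sequence from this chapter can be rehearsed locally, with a bare repository standing in for AppHarbor’s Git endpoint (all paths here are invented for the demo):

```shell
set -e
base=/tmp/appharbor-demo; rm -rf "$base"; mkdir -p "$base"

# bare repo standing in for AppHarbor's Git endpoint
git init -q --bare "$base/appharbor.git"

# the local TeamMentor clone we want to deploy
git init -q "$base/site" && cd "$base/site"
git config user.email demo@example.com && git config user.name demo
echo "TeamMentor" > index.txt
git add . && git commit -qm "initial"
main=$(git symbolic-ref --short HEAD)

# the same two commands from the post, URL replaced by the stand-in path
git remote add appharbor_test "$base/appharbor.git"
git push -q appharbor_test "$main"

git ls-remote appharbor_test   # the pushed branch is now on the 'server'
```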

The need to create forks/clones for website

(Here is an email I sent earlier today at SI, that covers a number of interesting challenges that we’re now having with TeamMentor, and how Git/GitHub can help)

One of the scenarios/problems that is starting to happen is ‘how to manage the specific requirements for deployed sites like who need custom changes?’

So here is a description of ‘the problem’:

a) there is a master version of the code:

b) there is a master version of the content:

c) there is a master version of the code+content (which is a ‘virtual copy’ of the two above): (think of this as a copy of the Code and Content repositories in a way that keeps the git connections and commit history)

d) there is a website that is based on the code + content version (i.e. , BUT has a number of specific requirements.

e) how to deal with changes to , that are only relevant to that site (i.e will not be propagated to the main Code+Content repo)

  • interestingly in the case of site, (at least at the moment), the only changes will be on the Code (where the content is the same as the one in
  • but I can see how, as our content generation capabilities improve (and we start to have fresh and ‘current/recent-events’ articles), we might want to push those into (since they will be the best advertisement for TM that we could ever get)

f) since is a ‘read-only’ website, there is also an argument (from a security point of view) that that site should not have the advanced editing capabilities that TeamMentor has (this is also a feature that I can see customers wanting)

So what is the solution?

At the moment, the solution that I see is to:

Any other ideas on how to deal with this?

Note that we have the exact same issue (or a variation of this) on:

Using Git Branches to deal with the multi-config variations of TeamMentor

Here is an interesting problem that affects TeamMentor (TM) and just about every other app:

“How to deal with the specialized versions of an application that are created via Config changes”

Keeping this simple, and only dealing with the config changes that can be made by modifying the TmConfig.config file, TM already (today) has the following scenarios to support:

  • Default install (with default settings)
  • Anonymous users cannot see the content (with a variation where anonymous users cannot see the Library View (not done via config change))
  • Redirect all Http traffic to Https (i.e. SSL Redirect)
  • Windows Authentication enabled/disabled
  • SSO enabled/disabled
  • Change location of TM Libraries
  • Enforce HTML Sanitisation on Article content
  • Change default admin pwd

There are also other scenarios that I’m sure TM will need to support very soon:

  • Read-only version of TM
  • ‘Secure / lock-down’ version(s) of TM
  • OAuth integration
  • Support for 3rd party data sources (like the PoC done where wikipedia, msdn and owasp content is consumed by TM natively)

I was thinking about the ways to solve this, and here are a couple options:

  1. Rely on user documentation and pass the responsibility to ‘apply the changes correctly’ to the customers/users (I don’t like this solution, although it is what most vendors do (including SI))
  2. Create forks that apply the required changes only to those repositories (this is what we currently do in the forks we maintain)
  3. Make changes on specialized Git Branches, and use Git checkout to enable them (hum…)

As you can see by my comments:

  • I really don’t like option #1,
  • option #2 is what we currently do in some cases (and it does have the side effect of fork-explosion), and
  • option #3 is an idea I have been thinking about for a while, and the more I think about it, the more I like it.

Basically the idea would be to use Git Branches to track/apply those config changes.

Currently we use Git Branches to hold references to past TM versions (like we do at the moment in the ‘master TM repositories’). Note that the Git Branches used for special dev tests on Dev forks would not be affected by this.

What I really like about this idea is:

  • it would put the responsibility to create those ‘variations’ in the most capable hands (i.e. the ones who know the code best)
  • it would allow for strong QA cycles and much better support for those scenarios
  • it would make life easier for customers/users
  • it would provide a scalable way to support complex configuration changes (i.e. scenarios that require more than a couple ‘settings-changes’)
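A toy version of option #3 (the branch-per-configuration idea), where each variation branch only touches the config file; the TmConfig.config key used here is invented purely for illustration:

```shell
set -e
rm -rf /tmp/config-branches && mkdir /tmp/config-branches && cd /tmp/config-branches
git init -q
git config user.email demo@example.com && git config user.name demo

# default install, default settings (key name is made up for the demo)
printf 'SSLRedirect=false\n' > TmConfig.config
git add . && git commit -qm "default config"
main=$(git symbolic-ref --short HEAD)

# one branch per supported variation, each only touching the config file
git checkout -q -b config_ssl_redirect
sed -i.bak 's/SSLRedirect=false/SSLRedirect=true/' TmConfig.config
rm -f TmConfig.config.bak
git commit -qam "config: redirect all Http traffic to Https"

# enabling (or disabling) a variation on a deployment is then a single checkout
git checkout -q "$main"
cat TmConfig.config          # prints: SSLRedirect=false
git checkout -q config_ssl_redirect
cat TmConfig.config          # prints: SSLRedirect=true
```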

What do you think?

Any other ideas on how to deal with this issue?

Handling content changes made on hosted site created by Git clone (with auto Git commits and pushes)

Here is the next interesting TeamMentor (TM) and GitHub problem to solve:

“How to deal with content changes made using TM’s online editing capabilities”

Here is the ‘Problem description’:

  • TM is published from a Git Repository into a Server (let’s call it from REPOSITORY A to SITE A)
  • There are two deployment modes that already work well:
  • Publish to cloud (Azure or AppHarbor): takes about 2 to 5 minutes
  • Publish to an EC2 server: takes between 10 and 30 minutes and includes custom DNS and IIS set-up/deployment
  • In either mode the idea is that REPOSITORY A is the master version of the Code+Data
  • This means that if we needed to rebuild that site, we could (i.e. should) be able to do it in minutes
  • Upgrades and patches are made via a simple git pull (which gets the latest version from the REPOSITORY A) and no git merge activity should be needed
  • But what happens when there is a content change on SITE A’s files?
  • An automated solution is needed, since the options of ‘RDPing into the server to do the commits/pushes’ or ‘trying to do the commits/pushes via TM’s GitHub interface’ not only don’t scale but are as dangerous as relying on manual backups.

Here is what I have in mind:

  • TM detects if git support exists on the deployed server (i.e. git.exe is available) and:
  • does a Git checkout of the deployed branch into a special ‘live_server’ branch
  • does an auto git commit on every TM content save (or creation), with the commit message being a mix of: the current user, their IP, the date and time, and the file affected
  • (if configured) auto-push the change to a GitHub repository
  • This could be done by configuring an SSH key on the server, or by hardcoding the GitHub credentials into the ‘Git remote’ value
  • If git.exe is not available, then these commits and pushes will need to be done manually (by zipping the whole content folder and moving it into a location with git.exe available)
  • In terms of the repository to push, I think that we shouldn’t push directly to the original REPOSITORY A (the one used to create SITE A), but we should push it to a fork/clone of REPOSITORY A which could be used as a staging location (one where Pull Requests into REPOSITORY A would be made)
  • Of course we could push into REPOSITORY A directly, but that would expose an account with git push privileges to an important repository, and could create a scenario where unauthorized changes were made to a production repository (also note that with git push privileges, it is possible to completely remove all history and commits from a repository (in effect deleting all information)).
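The auto-commit part of the plan above can be sketched as a small shell function; the exact commit-message layout (user, IP, date, file) is my guess at the ‘mix’ described, and the push to a staging fork is left as a comment since this demo has no remote:

```shell
set -e
rm -rf /tmp/autocommit-demo && mkdir /tmp/autocommit-demo && cd /tmp/autocommit-demo
git init -q
git config user.email demo@example.com && git config user.name demo
git checkout -q -b live_server        # the special branch for server-side edits

# what TM would run after every content save
auto_commit() {                       # usage: auto_commit <file> <user> <ip>
  git add "$1"
  git commit -qm "auto-save: user=$2 ip=$3 date=$(date -u +%Y-%m-%dT%H:%M:%SZ) file=$1"
  # git push staging_fork live_server   # (if configured) push to the staging fork
}

echo "updated article" > article_42.html
auto_commit article_42.html roman 10.0.0.7
git log --oneline -1
```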

What I like about this solution (the auto-commit with option to auto-push) is that it will provide TeamMentor with a state-of-the-art version control solution (at article level).

Every single change would be logged, and although this will most likely make that branch completely unreadable by humans, we will be able to have really powerful (and cool) per-article version history (i.e. for each file, see its complete change log, including ‘who did the change’).

Note that it is possible to combine a number of commits into one (not in GitHub, but in git.exe), so I think that for cases where a number of files were changed, we might want to consolidate them into larger commits (especially when pushing those changes to ‘user consumable’ repositories)
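Consolidating a run of auto-saves into one larger commit, as suggested above, can be done with git merge --squash when moving the changes to a ‘user consumable’ branch (an interactive rebase is the other common option); a sketch:

```shell
set -e
rm -rf /tmp/squash-demo && mkdir /tmp/squash-demo && cd /tmp/squash-demo
git init -q
git config user.email demo@example.com && git config user.name demo
echo base > a.txt && git add . && git commit -qm "base"
main=$(git symbolic-ref --short HEAD)

# noisy per-save history on the live_server branch
git checkout -q -b live_server
for i in 1 2 3; do
  echo "edit $i" >> a.txt
  git commit -qam "auto-save $i"
done

# consolidate the three auto-saves into one readable commit on the main branch
git checkout -q "$main"
git merge -q --squash live_server
git commit -qm "content update (3 auto-saves consolidated)"
git log --oneline
```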

What do you think?

Any better ideas?

Going back in time using Git’s checkout

I’m still amazed at Git’s speed in moving backwards and forwards in time. For example, I was trying to find a particular GUI that we created for TeamMentor and was able to use git checkout to look at previous versions:

git checkout master (25 Oct 2012 TM 3.2)

git checkout dd867bfb4b9519c3b9c6ddfe2c0f9b1f6720f162 (4th Sep 2012: TM 3.2 RC1) :

git checkout 890caa053feee04bf0b7139787e0ee6100963771 (23rd Jan 2012: TM 3.0) :

git checkout 557177691139bf2385973b45bf39508042a11621 (18th Jan 2012: TM 3.0 RC9) :


Looking at these images, I thought of a cool script to write (here it is in pseudocode):

foreach id in available_checkouts
{
    git checkout id
    {
        start webserver
        open default page in browser (if possible 'add a library if not there')
        take screenshot (only store unique values and if possible 'add watermark with version and date')
        close web server
    }
}
create animation from screenshots taken

It’s all doable with O2’s APIs (I just don’t have the time today) :)

Adding Tags to TeamMentor Master repository

With 3.2 out, it’s time to add some Git Tags to the main TeamMentor/Master repository (which at the moment has none):

In a local Git Bash of this repository, we can create a tag using $ git tag -a v3.2 -m '3.2 Release'

Next we push that tag into GitHub using $ git push tm_master v3.2

And if we look back in GitHub’s Tag page, we will see that our v3.2 tag is in there:

At the moment we are keeping track of the previous versions using Git Branches (but I think that tags will do a better job)

For example, here is the 3.1 release (with the f71b016241 id)

We can use this ID value to create the 3.1 tag

Use gitk to find the SHA1 ID of the 3.0 release

Which we use to create the 3.0 tag:

After pushing to GitHub, the Tag page looks like this:

What is really cool about these Git Tags is that they also provide a nice location to download a particular release :)
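The tag commands from this chapter, runnable end to end against a stand-in repository (the SHA1 of the older release is captured via git rev-parse here, standing in for the f71b016241 id found with gitk above):

```shell
set -e
rm -rf /tmp/tags-demo && mkdir /tmp/tags-demo && cd /tmp/tags-demo
git init -q
git config user.email demo@example.com && git config user.name demo

echo "3.1" > v.txt && git add . && git commit -qm "3.1 Release"
old_sha=$(git rev-parse HEAD)          # stands in for the f71b016241 id
echo "3.2" > v.txt && git commit -qam "3.2 Release"

# tag the current release...
git tag -a v3.2 -m '3.2 Release'
# ...and the older one by its SHA1 id
git tag -a v3.1 "$old_sha" -m '3.1 Release'
# with a remote configured: git push tm_master v3.2 v3.1
git tag -l
```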

Creating the final TeamMentor with SI Library repository via multiple Git pulls and pushes

This is going to be a long one, so if you are interested in seeing Git and GitHub in action in a real-world application, grab a coffee/tea/beer and read on :)

This is the scenario at the start:

  • We need to create the final TeamMentor 3.2.3 package for release
  • The TeamMentor_SI_Library repository is at version 3.2
  • The TeamMentor/Master repository is at version 3.2.3
  • The Library_SI has a couple of content changes since the last pull
  • We need to do a pull of both TeamMentor/Master and Library_SI into TeamMentor_SI_Library and push the final result to GitHub

For background information on the current TeamMentor git architecture, take a look at:

It all starts with a pull request from Roman with the last content changes to be added to Library_SI

Which I opened

Quickly reviewed the changes

And Confirmed Merge

This will close the Pull Request

With changes now part of the main Library_SI repository

For reference here is the network graph of Library_SI

Next we move into the TeamMentor_SI_Library:

Opening the local copy of the TeamMentor_SI_Library, the log list shows (below) that we are 1 commit behind the version at GitHub:

Let’s create a new branch to do the updates (just in case)

And confirm that we are also not in sync with the TeamMentor/Master repository

Here are the current remotes (in TeamMentor_SI_Library):

Here is gitk visualization of the TeamMentor_SI_Library commits

Here is GitHub visualization of the TeamMentor_SI_Library commits

We’re now ready to do the pulls (fetch+merge), and let’s start with the TeamMentor_SI_Library (origin remote)

Next, let’s do the TeamMentor/Master (tm_master remote)

A quick look at gitk shows how these multiple commits are being nicely merged together

Finally, let’s do a git pull on the Library_SI (tm_library remote)

With gitk now looking like this, which is a pretty cool graph:

A look at the file system, shows that it looks as expected

So let’s quickly start cassini to take a look (via the ‘start TeamMentor.bat’)

Hummm… on load there was a problem with the right-hand side panel

The error was this one (which is the first time I saw it)

Luckily, a quick Google search revealed these articles:

Which pointed me to the fact that I had hit a weird ‘time bug’ that happens when the clock moves due to Daylight Saving Time.

The next step is to try the AppHarbor (Cloud) deployment, so I went to AppHarbor and copied the target Application Repository URL

Which I added as a remote, and used to push the content

This built OK, but when I deployed it, I noticed that the site was still on version 3.2 (instead of the new 3.2.3)

Back in git, I realized that I had pushed the wrong branch; below I’m pushing the 32_Final_Update branch into the AppHarbor master branch (which is automatically built)

With the correct version ready, I deployed it:

And here she is in action:

The reason why there is no content is because AppHarbor will only copy to the live servers the files it can find using the Visual Studio solution, which means that the Library files were not there.

To make the test realistic, I zipped the TM_Libraries folder

And used TeamMentor’s Control Panel to upload the file:

And install it:

After that, the home page looks like it should:

Testing the need to login to see the Article’s content:

The next step is to push this latest version into the TMClients/TeamMentor_SI_Library repository

But not before we update the local master branch with the 32_Final_Update branch

Next we replace the origin remote mapping with TMClients/TeamMentor_SI_Library and push into it:

After all these steps it is good to take a look at GitHub’s network graph and confirm that all looks as expected:

As a final confirmation, let’s download the zip file from the TMClients/TeamMentor_SI_Library and make sure it is all good (this is the file given to customers via a password protected zip file)

Once the file downloads, unzip it to a temp folder:

And use ‘start TeamMentor.bat’ to run it locally:

Final comment:

If you actually look at the workflow we have here, this is pretty powerful stuff!

We were able to have two completely separate activities (development and code changes), done at completely different times, combined into a single package (preserving all history) that can then be delivered to customers (who don’t care about the multiple repositories).

In a way it is just like doing a copy-and-paste of the two source repositories into a ‘release folder’, but in a way where we have the full (independent) git history (check out the graphs to see how the git commits from two separate repositories are correctly preserved) and can be easily updated/synced (it took me a LOT more time to write this post than to do the actual pushes and pulls :) )

What I also like about this workflow is that it works :). We are now doing the 2nd release using it, and it is surviving the real-world acid-test.

But aren’t there a lot of ‘manual’ steps that could be automated? Yes, yes there are!

But I’m a big fan of:
- first figure out the solution for the problem (in a “let’s make it work” kinda way)
- then automate as much as possible the workflow that ‘works’

It’s a mistake to automate too soon, especially when the understanding of ‘the problem’ and all its moving parts is still limited.

UPDATE: I just did the same process for the version (the TeamMentor Eval), and it took less than 5 minutes to do it :)

Remotes (of my local clone)

origin (fetch)
origin (push)
tm_library      git:// (fetch)
tm_library      git:// (push)
tm_master (fetch)
tm_master (push)

Commands executed:

$ git checkout -b 32_Final_Update
$ git pull origin master
$ git pull tm_library master
$ git pull tm_master master

(there was a small conflict on the About.html page, which is different in the eval version)

$ git checkout master
$ git merge 32_Final_Update
$ git push origin

Network Graph:

2. November 2012

  • Pretty cool visualization of the ‘GitHub based’ TeamMentor Development+QA+Release workflow

Pretty cool visualization of the ‘GitHub based’ TeamMentor Development+QA+Release workflow

Roman created this nice Visio diagram with our current ‘GitHub based’ workflow for TeamMentor:

Although this is still a draft, it is already a good representation of how we are using Git’s powerful forking/cloning capabilities to implement solid Development + QA + Release workflows.

Roman is also creating simpler versions of this diagram, for:

  • the cases where we push a patch on a released version (for example targeted bugs or security fixes)
  • the workflow used to manage the multiple TeamMentor’s Content repositories

Let me know if you have any ideas or suggestions on how to make this even better :)

One area where we still need to do some work (and could use help) is the automation of some of these steps/workflows. We have figured out the git commands, and now we need to automate their execution.

3. December 2012

  • Comparing two GitHub Issues List
  • Using TeamCity to build on Git Commit, deploy to AppHarbor and open browser
  • Minimum required files to run git.exe on windows (for clone, push and pull)
  • Rewriting Git History (locally and at GitHub)

Comparing two GitHub Issues List

Is there a way to compare two GitHub Issues lists?

What I need is a programmatic way to compare the items that exist in two GitHub repositories (repo A and repo B), do a diff, and list the ones that don’t exist in repo B (i.e. which ones were not copied from repo A to repo B)

Recently we moved the TeamMentor issues/bug list into the public repository:

Before (when we started to use GitHub for TeamMentor development) we used the private repository

When we did the switch, we manually moved a number of issues into the new public repository, but I want to make sure we don’t lose anything (since these issues lists are also our brain-dump of ideas for future releases)

So, any tools or services that currently do this?
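One low-tech approach (an assumption on my part, not a tool the post settled on) is to pull both issue lists from the GitHub API and diff them by title. The repo paths in the commented curl lines are placeholders, and the two JSON files below are simulated stand-ins for the API responses so the snippet is self-contained:

```shell
# Diff two GitHub issue lists by title.
# In real use, fetch the lists first (repo paths are placeholders):
#   curl -s "https://api.github.com/repos/OWNER/repoA/issues?state=all" > repo_a.json
#   curl -s "https://api.github.com/repos/OWNER/repoB/issues?state=all" > repo_b.json
# Simulated responses for this sketch:
printf '{"title": "Add search box"}\n{"title": "Fix login bug"}\n' > repo_a.json
printf '{"title": "Fix login bug"}\n' > repo_b.json

grep -o '"title": "[^"]*"' repo_a.json | sort > a_titles.txt
grep -o '"title": "[^"]*"' repo_b.json | sort > b_titles.txt
comm -23 a_titles.txt b_titles.txt   # issues in repo A that are missing from repo B
```

Matching on titles is crude (renamed issues will show as missing), but it is enough for a first sanity check that nothing was lost in the move.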

Using TeamCity to build on Git Commit, deploy to AppHarbor and open browser

I just gave TeamCity a test drive and I really LIKED it :)

After using it for a bit, I was able to create a really nice CI that:

  • Monitors the local file system for a Git Commit (of TeamMentor)
  • On Git Commit, trigger a build (of the main VisualStudio 2010 project)
  • If the build is OK, trigger a git push into AppHarbor
  • Open a WebBrowser with the AppHarbor site

All automated from the moment there is a commit :)

Let’s see this in action

It all starts with a deployed instance of TeamMentor at version Dev.2.1 (see top right)

With TeamCity running locally on my box (as a service) keeping an eye on the local git repository

In VisualStudio, let’s make a simple change (bumping the version to Dev.2.2)

Using Git Source Control Provider VisualStudio Extension, create a Commit

Which (after about 10 to 20 secs) is picked up by TeamCity

The ‘build logs’ update in real-time, and provide a lot of good info:

After the build completes, and the AppHarbor push is done, the target website will be opened: (notice that it is now on version Dev.2.2)

For reference here are the Build Steps in TeamCity configuration page:

Minimum required files to run git.exe on windows (for clone, push and pull)

I want to add native Git support to TeamMentor (and O2) and don’t want to ship the entire git folder structure that is installed with msysgit

I found the ‘minimal required files to just use git clone’ post, which implies that the only files needed for a git clone are:

  • git-clone.exe
  • git-fetch-pack.exe
  • git-index-pack.exe
  • git.exe
  • libcurl-4.dll
  • libiconv2.dll

Is this correct?

What about for git pushes, pulls and commits?

I’m sure I saw this in a tool I used the other day (which had git support), but I can’t remember which one

Rewriting Git History (locally and at GitHub)

When fixing the ASP.NET WCF REST help page ‘Memory gates checking’ error at AppHarbor,

I ended up with a number of Git Commits: locally

and at GitHub

Once we found the solution (and pushed a new version of FluentSharp.CoreLib.dll to NuGet), it was time to clean up the history (since those commits don’t really need to be in the main TM master). Yes, I could have used a branch, but since this was part of the TeamCity deployment tests, it was useful to do it on the master branch (and see how fast TeamCity can be :) )

So what we want to do is something that is not very common in Git: rewrite Git’s history (i.e. remove pushed commits). In practice this means that we want to ‘go back’ to the commit marked in blue (gitk image above), and remove the extra commits from the main Git history:

Just in case something goes wrong, let’s back up the current changes as a local branch :)

Once that is done, we do a ‘forced git reset’:

This does the trick, and now the (local) history looks good:


Next, to remove these commits from GitHub …

…we do a git push --force

And GitHub’s version has been moved back:

Finally we update (on the TeamMentor VS Projects) the FluentSharp.CoreLib.dll via NuGet:

and commit it:

Creating the desired Git Commit History:
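The whole sequence above (backup branch, forced reset, forced push) can be replayed in a throwaway repo; this is just a sketch of the commands used in this post, with the push left as a comment since it needs the real GitHub remote:

```shell
# Two commits, then rewind by one (as done above for the TM repo)
git init rewind-demo && cd rewind-demo
git config user.email demo@example.com && git config user.name Demo
git commit --allow-empty -m "good commit"
git commit --allow-empty -m "experiment to discard"
git branch backup_of_experiments   # safety net: old commits stay reachable here
git reset --hard HEAD~1            # 'forced git reset': local history ends at "good commit"
git log --oneline                  # shows only "good commit"
# git push --force origin master   # rewrites GitHub's history to match (destructive!)
```

The backup branch costs nothing and keeps the discarded commits recoverable until you are sure the rewrite was what you wanted.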

4. January 2013

  • GitHub is having some probs today
  • Dangerous bug between Git, GitHub and Windows (duplicate directories with different capitalization)
  • Can Git be used instead of Word’s ‘Track Changes’?

GitHub is having some probs today

Here is what a GitHub Commit page looks like:

Here is the status page

And the Status messages page:

Dangerous bug between Git, GitHub and Windows (duplicate directories with different capitalization)

After doing this rename, here is what GitHub looks like:

Note the two Web Applications folders!!!

I have hit this issue in the past, and I think it is something to do with Windows capitalization inconsistencies (as seen in Windows search), Git and GitHub

Note how my local folder looks good.

Even weirder is the fact that the commit message looks good (note the renames):

Let’s try a git clone to see what happens locally:

Humm, the clone looks good, so I wonder if this will self-correct after the next commits:

Question: Any ideas on how to fix this?

My only thought at the moment is to:
a) delete that folder (from Git)
b) commit and push the delete
c) add that folder
d) commit and push
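Steps a) to d) translate into the git commands below (a sketch of the idea, not a verified fix); the folder name comes from this post, and the pushes are commented out since they need the real remote:

```shell
# Local demo of the remove/re-add fix for the duplicated folder
git init case-demo && cd case-demo
git config user.email demo@example.com && git config user.name Demo
mkdir "Web Applications" && echo page > "Web Applications/Default.aspx"
git add . && git commit -m "initial state"
git rm -r --cached "Web Applications"   # a) delete the folder from Git (files stay on disk)
git commit -m "remove folder with capitalization conflict"
# git push origin master                # b) push the delete
git add "Web Applications"              # c) add the folder back
git commit -m "re-add folder"
# git push origin master                # d) push the re-add
git ls-files                            # the folder is tracked again, with one casing
```

Using --cached on the git rm is the important detail: it untracks the folder without touching the working copy.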

Can Git be used instead of Word’s ‘Track Changes’?


Text changes are just a simplified version of source code :)

Here are a number of really amazing ‘non-code’ things that are happening with Git’s content-versioning capabilities:

5. March 2013

  • Seeing an NGit Diff by using reflection to access the internal Sharpen.ByteArrayOutputStream Class
  • GitHub needs to improve their ‘Normal’ status definition and error reporting dashboard
  • Another GitHub ‘Normal’ status that doesn’t allow me to Push
  • Prob with (older version of) NGit where it was failing to create Git repositories in Azure/TeamCity
  • Creating a version of TeamMentor which uses the new GitUserData.config file
  • Changing the ‘View TM article by anonymous users’ status via GitHub
  • Creating a new TeamMentor test site using TeamCity, GitHub and Azure
  • Using Git Branches to fix Issues added to TeamMentor’s GitHub repository
  • Extracting content files from an Azure deployed version of TeamMentor (pre 3.3 git support), starting with a failed SFTP attempt and ending with a CSharp REPL script
  • Using NGit to create native Git support in Azure deployed app (with automatic pushes and pulls)

Seeing an NGit Diff by using reflection to access the internal Sharpen.ByteArrayOutputStream Class

I was trying to get the NGit diff output stream, but hit an issue: the Sharpen.ByteArrayOutputStream class is internal

Here is an example from the NGit UnitTests on how to use the NGit Diff command:

The key part is:

Note how the Sharpen.ByteArrayOutputStream was created and used on Diff.SetOutputStream, but (as we will see below) we will have a problem because this class is internal:

In an O2 Platform C# REPL script, let’s create a quick repo and a valid Diff result:

Our objective is to get the Diff formatted output shown in the NGit Unit test.

A quick look at the Diff class shows no public fields, properties or methods that expose it:

And by default the out field is null:

Basically what we need to do is this:

But as you can see, we can’t create an instance of the Sharpen.ByteArrayOutputStream

Well, we can’t create it directly, but we can easily create it using reflection :)

To do that, let’s start by getting a reference to the Sharpen.dll assembly

then add a reference to the ByteArrayOutputStream type

invoke its constructor to create a live instance of it:

since the Sharpen.OutputStream class is public, we can cast our ByteArrayOutputStream object into it:

we then assign it to the NGit command, which will give us the diff log we wanted

Note that the out field is now not null:

Here is the Source code of the C# code snippet created:

var outputStream = "Sharpen.dll".assembly()
                                .type("ByteArrayOutputStream")
                                .ctor()
                                .cast<OutputStream>();

var tempRepo = "tempRepo".tempDir();
var nGit = tempRepo.git_Init();

var file = "testFile.txt";
nGit.writeFile (file, "some content\naaaa\n");
nGit.add (".",false).commit_using_Status();
nGit.writeFile (file, "some content\naa Change\n");

var diff = nGit.Git.Diff();

diff.SetOutputStream(outputStream).Call();
return outputStream.str();

//using Sharpen
//O2Ref:FluentSharp.NGit.DLL
//O2Ref:NGit.dll
//O2Ref:Sharpen.dll

GitHub needs to improve their ‘Normal’ status definition and error reporting dashboard

At the moment (i.e. when I wrote this post), all should be ‘Normal’ with GitHub, since according to their status, their world looks like this:

But to me here, a simple/small git push took about 5 minutes of retries:

which doesn’t look ‘Normal’ to me!

I think GitHub needs to improve their ‘Normal’ status definition and error reporting dashboard

Note that yesterday was the same thing.

Something weird is going on in GitHub’s land

Another GitHub ‘Normal’ status that doesn’t allow me to Push

It took me 5 attempts over about 10 minutes (from 7:40pm on 11th Mar 2013) to get a commit pushed to GitHub’s servers:

It looks like GitHub is having DDoS problems, which is another reason why they need to improve their visibility into what is going on:

Prob with (older version of) NGit where it was failing to create Git repositories in Azure/TeamCity

Using an NGit version from a couple months ago.

var userHome = Path.Combine(Path.Combine(HostingEnvironment.ApplicationPhysicalPath, "App_Data"), "git");
//using System.IO;
//using System.Web.Hosting;

var runTime = "Sharpen.dll".assembly().type("Runtime");

var properties = (Hashtable)runTime.invokeStatic("GetProperties");

var result = "There are {0} properties <br>".format(properties.size());
properties["user.home"] = userHome;
//return properties["user.home"];
foreach(DictionaryEntry item in properties)
  result+= "{0} = {1}<br/>".format(item.Key, item.Value);

return result;

var tempRepo = "tempRepo".tempDir(false);
try
{
  var initCommand = NGit.Api.Git.Init();
  initCommand.SetDirectory(tempRepo);
  initCommand.Call();
}
catch(Exception ex)
{
  return ex.Message + ex.StackTrace;
}

//O2Ref:NGit.dll
//O2Ref:Sharpen.dll
//return new API_NGit().init(tempRepo).str();
return tempRepo.isGitRepository();

//using O2.FluentSharp;

/*var runTime = "Sharpen.dll".assembly().type("Runtime");
var properties = (Hashtable)runTime.invokeStatic("GetProperties");
var result = "There are {0} properties <br>".format(properties.size());

foreach(DictionaryEntry item in properties)
  result+= "{0} = {1}<br/>".format(item.Key, item.Value);
return result;
return "FluentSharp.NGit.dll".assembly();*/

//using System.Collections;
//O2Ref:TeamMentor.CoreLib.dll
//O2Ref:FluentSharp.NGit.dll

I would get this exception:

The error is the one documented here:

After forking the repo and building it locally in VS 2010, I ran the unit tests (note: only one of 2284 tests failed after a couple of retries; there were about 20 that failed on first execution, but passed on individual retest):

I then added the compiled assemblies to TeamMentor, and now this script:

var properties = Sharpen.Runtime.GetProperties();
var result = "There are {0} properties <br>".format(properties.size());
//properties["user.home"] = userHome;
//return properties["user.home"];
foreach(DictionaryEntry item in properties)
    result+= "{0} = {1}<br/>".format(item.Key, item.Value);

var tempRepo = "tempRepo".tempDir(false);
try
{
  var initCommand = NGit.Api.Git.Init();
  initCommand.SetDirectory(tempRepo);
  initCommand.Call();
}
catch(Exception ex)
{
  return ex.Message + ex.StackTrace;
}
return tempRepo;

return result;

//using System.Collections;
//O2Ref:Sharpen.dll
//O2Ref:NGit.dll

Executed on the O2 Platform Browser-based C# REPL:

Creates the local repository on a local folder.

NOTE: For reference, here is how to write that reflection script using the O2 APIs:

var runTime = "Sharpen.dll".assembly().type("Runtime");
var properties = (Hashtable)runTime.invokeStatic("GetProperties");
"There are {0} properties".debug(properties.size());
foreach(DictionaryEntry item in properties)
    "{0} = {1}".info(item.Key, item.Value);

//using System.Collections
return aPI_NGit;

//O2Ref:FluentSharp.NGit.DLL
//O2Ref:Sharpen.dll

Creating a version of TeamMentor which uses the new GitUserData.config file

Introduced in the 3.3 version of TM is a new feature to load the UserData repository from an external location (GitHub or local folder).

This post shows how to set it up.

First step is to get the latest version of TeamMentor from GitHub, where we can clone it locally or download the zip file

Using the Zip file as an example: unzip the 10Mb file into a local folder, and click on start_TeamMentor.bat

This will open an empty TM site, and a new Library_Data folder should have been created:

With this default structure:

Now in GitHub (or in a local folder), create a new Git Repository (which should be marked as private, since security-sensitive data will be stored here)

Once the repository is created, copy its git url (in this case )

Back in the local copy of TeamMentor, open the TBot page:

which will require an admin account:

After login, open the Edit GitUserLocation

And enter the Git url copied from GitHub:

After the data is saved, go back to the commands list:

Go to the Reload Server Objects

And click on ‘Reload UserData’:

After that step is completed, if you look at the Library Data folder, you should see a new UserData folder in there (that uses the git repository name as part of its path)

Inside it, you will see what was received from GitHub, and a new TMSecretData.config file and Users folder

Back in TBot’s page, if you click on any link you should be redirected to the login page, and you will need to login again using the default admin credentials (this happens because the current browser cookie is pointing to the admin user that is in XmlDatabase/User_Data and not in the newly created one)

After logging in, open the Edit SecretData command:

Which should look like this (with correct values for the Rijndael and SMTP fields):

The value that we want to change is the Libraries_Git_Repositories, which should point to the Git repo we want to add to this TM instance. In this case:

Add the git url as an item in the Libraries_Git_Repositories Javascript array:

After the data is saved, open the Reload Server Objects again:

And this time around click on the Reload Cache button:

Once that is completed, if you open the XmlDatabase/TM_Libraries folder, you should see a new Vulnerabilities subfolder

Which is in fact a git clone:

of the git repository configured on the Libraries_Git_Repositories value

Quickly opening the main TM page, will now show the Vulnerabilities Library:

Final step is to manually commit the changes made to the local repository (note: auto commit and push is disabled on the UserData when running TM from localhost)

Which will put those updates in GitHub

Now that we have this GitHub repository configured, we can configure the Git UserLocation of the live QA server:

And after reloading the cache:

The server will now have the Lib_Vulnerabilities library

Changing the ‘View TM article by anonymous users’ status via GitHub

From the 3.3 release of TeamMentor (TM), it is now possible to change configuration settings of live servers directly from GitHub.

For example I just published a QA version of the site on Azure’s

Here is what (on version 3.2.3) looks like:

Here is what (on version 3.3 RC4) looks like:

Can you spot the difference?

Here is the file (on GitHub) that controls whether Anonymous users should be able to see TM’s articles:

So the solution is to edit this file (in GitHub):

change that value to true:

Commit that change in GitHub’s UI:

With this commit being now part of this repository:

Next we go into the new Tbot interface ( ), which requires admin privs:

After login, open the ‘Reload Server Objects’ page

And click on the Reload UserData (and Git Pull and Push) button

which when executed:

will have updated the local TMConfig.config file:

And if we logout, we will see the expected behavior:

Finally, if we look at GitHub’s commit history, we will see the commit we did in GitHub nicely merged with the commits that happened at the live server

Here is GitHub’s Network Graph of this repository, which shows both types of commits (the ones performed at the live server vs the ones at GitHub)

Creating a new TeamMentor test site using TeamCity, GitHub and Azure

Serge just asked me to create a new TeamMentor (TM) website for him using a particular TM library, so here are the steps I took (note: some of this will be automated in the next TM release)

In Azure ….

It all started by going into Azure and creating a new website:

In this case called tm-hashes

in a couple secs it was available

next I set-up git publishing:

using ‘Local Git’ since that works well with TeamCity and doesn’t require that Azure be given pull privileges into the target repo:

Azure worked for a bit, and after a couple secs I had:

In TeamCity …

Next, in the TeamCity server, to make it easier on next deployments, I added this site to one of the builds that already pushes other sites into Azure:

Specifically I added another build-step similar to the ones already there for tm-vulnerabilities and tm-DennisGroves

The easiest way to do it, was to create a copy of one of the existing steps:

After editing the ‘copied build step’, called Publish to Azure (tm-Hashes), I quickly reordered the build steps



Next, I clicked on Run:

in order to trigger a TeamCity build:

That build will:

  • do a git pull from the latest version of the TM code (currently at 3.3 RC3.01),
  • build the VisualStudio main project,
  • and push to Azure

Here are the build logs during the ‘push to Azure’ step:

At this moment the Azure admin page for the tm-hashes site, will show a Deploying message

which becomes Active Deployment once it is completed:

One of the nice hacks the Azure team did with their git implementation is to provide good messages/info in the git data sent back on pushes (the lines in dark-orange below were created by Azure):

Once the push is complete, we can browse the Azure site:

And see an empty TeamMentor installation:

In TM Control panel using GitHub zip file …

The final step is to add the library that Serge wants (note: for this example I’m going to install the library via drag-and-drop of zip files)

To get the library files, I went to its private GitHub repository, and clicked on the zip button

which downloaded the zip file into my current vm:

back in TeamMentor,

I logged in:

as admin

went to the control panel:

clicked on Advanced Admin Tools and Install/Upload Libraries

drag-n-dropped the local zip file into the Upload a file red box

Depending on the network speed, this can take a couple seconds or minutes:

Once the upload was completed,

I clicked on the ‘’ link that appeared at the top:

waited a little bit:

and once I saw the success message:

I clicked on Open Main Page

and the main TeamMentor GUI showed the imported libraries:

The final step was to remove the original temp library (created by TeamMentor on first install)

And finally, I sent Serge an email with the link to his brand new install of TeamMentor :)

Using Git Branches to fix Issues added to TeamMentor’s GitHub repository

This is the current workflow that I’m following when coding/fixing TeamMentor Issues added to the TeamMentor/Master/Issues list.

  • Find issue to address
  • Create and checkout new branch (with the issue ID on its title)
  • Apply the fixes (on the new branch)
  • Commit the changes (on the new branch)
  • Checkout master branch
  • Merge changes (from the new branch) into master, using the --no-ff (no fast-forward) option (this is very important; see here and here for a good explanation why)
  • Push to GitHub
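The steps above can be replayed in a scratch repo; the fix itself is simulated with a one-line file change, and the final push is commented out since it needs the GitHub remote:

```shell
git init issue-demo && cd issue-demo
git config user.email demo@example.com && git config user.name Demo
base=$(git symbolic-ref --short HEAD)     # 'master' or 'main', depending on git version
echo "version 1" > page.html && git add . && git commit -m "initial"
git checkout -b Issue_389                 # branch named after the issue ID
echo "version 2" > page.html              # ...the actual fix goes here...
git commit -am "Fix Issue 389"
git checkout "$base"
git merge --no-ff Issue_389 -m "Merge branch Issue_389"  # --no-ff keeps the branch visible
git log --graph --oneline                 # shows the preserved branch shape
# git push origin "$base"                 # finally, push to GitHub
```

Without --no-ff, git would fast-forward master and the history would look as if the fix had been committed straight to master.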

Let’s look at this in action.

Here is a simple issue to fix:

In GitHub, the issue is #389, so on a local clone of the Master repository, we create a branch called Issue_389 (using the -b switch to create it)

In VisualStudio, apply the fix to the code:

Quickly look in a browser to confirm the change (this should also be reconfirmed via a UI UnitTest):

Commit the change to the Issue_389 branch:

Which means that at this moment, there is nothing else to commit on the Issue_389 branch:

which is now one commit ahead of master

Next step is to checkout master and do the git merge using the --no-ff option:

Gitk shows the effect of the --no-ff (i.e. the use of the branch was preserved)

Final step is to push the commits to GitHub:

Here is the commit at GitHub:

Here is the GitHub’s Network view:

Extracting content files from an Azure deployed version of TeamMentor (pre 3.3 git support)

I was asked by Serge to retrieve some changes he made to a test version of TM hosted in Azure.

This site was hosted at and since this was a version from before the built-in Git support (where git TM Libraries are natively supported by TM), the only way to get the files was to copy them from the live server.

So my first attempt was to use SFTP (which Azure supports) to connect directly to the web root.

To get the SFTP address, I went into Azure’s control panel for the tm-hashes site:

Copied the SFTP address and opened it in the local Windows Explorer (which requires logging into the tm-hashes\tmci account):

Navigated to the site’s webroot (which holds the deployed files)

Then into TM’s Library_Data folder, which is located inside the App_Data folder (because this is the main location where this site’s IIS account has ‘write privileges’)

In there we can find the XmlDatabase/TM_Libraries folder, which contains all libraries currently loaded on this server

Back in my local VM, I have a clone of the target git repository (

into which I’m going to copy the folders from the SFTP live site:

After some time of starting the drop ….

…. I confirm the overwrite

…. and it looks like it will take 1 hour

or maybe 6 hours :

After 38m of copying it is now down to 3h

hummm….. damm (after 1h and a bit of copying)

That kinda sucks

I think I need to try a different approach :)

Time to open up the C# REPL included in TM:

Get the path to the Xml Libraries folder:

Confirm that the folders we want are in there:

Let’s first try to zip the fourth library:

Which is the Android one:

After the script shown above executes, a quick look at the FTP site shows the expected zip file in there:

Which contains the files we want to get:

Next let’s do the zip of the entire Libraries folder:

It takes about 1m to create the 11Mb file

Which can now be copied locally

…note how this is much faster than the multiple hours wait we experienced above:

Once the zip file is downloaded:

I unzipped the files into the Lib_Hashes folder:

Choosing to overwrite the existing files:

After the copy, Git will pick up the changed files:

Which we can commit:

And push to GitHub:

Using NGit to create native Git support in Azure deployed app (with automatic pushes and pulls)

This entry will show a pretty powerful new feature in TeamMentor (TM) which I’m very proud and excited about!

This feature is so important that it literally caused a delay of about 1 month in the release of TM 3.3 (my instinct was pushing me in this direction, since I ‘knew’ that this could be done, and that it would be a killer feature). Btw, there is a lot more NGit/Git support than what is shown here, but I’m sure you will see the power in the workflow described below.

Basically, TM’s backend engine will now automatically perform:

  • a git pull when the TM server starts (or its cache is rebuilt)
  • a git commit followed by a git push on every library edit (on both content and structure changes).

And since TM uses the .NET library NGit, what we have here is a pretty powerful self-contained .NET-based ‘git for content versioning’ solution.

Practically speaking, this is a Git workflow that runs on Azure-hosted-site without requiring Git to be installed on the live servers!

This solves the problem created by the lack of git.exe (and supporting files) on an Azure-deployed web application (Azure’s git support is limited to pushing code to Azure’s servers, which triggers an MSBuild-like website publishing workflow)
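In plain git terms, what the TM engine automates via NGit amounts to the commands below; here a local bare repository stands in for GitHub so the round-trip can be shown end to end:

```shell
git init --bare hub.git                  # stand-in for the GitHub repository
git clone hub.git tm-server && cd tm-server
git config user.email demo@example.com && git config user.name Demo
b=$(git symbolic-ref --short HEAD)       # current branch name
echo "<article/>" > new-article.xml      # a library edit writes a file...
git add -A
git commit -m "TM auto-commit: library edit"   # ...then the engine commits...
git push origin "$b"                           # ...and pushes, automatically
git pull origin "$b"                           # on server start / cache rebuild: pull
```

The point of doing all this through NGit is that none of these commands require git.exe on the server; the same operations run as managed .NET calls.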

Here is an example of this managed Git workflow in action.

The site:

is currently configured to use the UserData from this repository

which contains a reference to the repository:

In practice, what this means is that the TM articles we see in are the ones hosted and managed by the repository.

But since the mapping is done via NGit, and the account used (in Azure) to connect to GitHub’s Lib_TM4TM has push privileges, it is now possible to make a change in that is auto committed locally and then pushed into

Auto Committing and pushing changes

For example, here are the last commits at

In the server, let’s add a new Guidance Item (i.e. an Article)

Since (in TM 3.3) articles are created immediately, by the time the editor is shown, the article in the popup-window will already exist on disk (i.e. there is already a 384ed731-96a1-4c00-a830-345abfc827e2.xml file on the server)

And with the new ‘auto Git commit’ feature, the git commit of the new article will be available (after a couple secs) at GitHub:

This ‘commit on new article’ is made of two file changes:

  • The new article (the 384ed731-96a1-4c00-a830-345abfc827e2.xml file)
  • The mapping of the new Article’s GUID to the chosen ‘view’ element (which is part of the Tm4TM.xml library xml file)

Next let’s make some changes to the new article:

After saving, the article is now available at or

As before, there is a new Commit at GitHub (

Which contains the ‘metadata changes’ and the new article’s html content (created by the WYSIWYG TM online editor):

And since this is all Git based, many more complex and multi-user/hosting scenarios are easily supported (for example I can have a local copy of the TM4TM server/repo which I can edit offline and push to GitHub (directly or via Pull Requests)).

The git merge strategy is the same used by GitHub:

  • If there are no conflicting changes, everything happens automatically or via GUIs pages
  • If there are merge conflicts, the Git Bash and Windows Diff tools should be used to address them

6. April 2013

  • Git pulling a TeamMentor Library and renaming it
  • Creating QA versions of TeamMentor UserData repository, and using branches to show/test the multiple config options
  • Changing a User’s ExpiryDate from GitHub hosted file
  • Linus gift to the world will be Git not Linux (and what about an OS built on top of an hash-driven file system?)
  • What the move from HTML to WikiText looks like (in GitHub)
  • Is Git a Single point of failure for TeamMentor?
  • Setting up Ian’s CI Development Environment (for TeamMentor)

Git pulling a TeamMentor Library and renaming it

Here is an example of how to use the new TM 3.3 capabilities to load libraries from GitHub and to rename them.

Let’s start with a version of TM that looks like this:

And let’s say that we wanted to add the TM4TM Library to this server

First thing to do is to copy the Git Read-Only Url

And add it to the TBot’s Secret Data file:

Before we reload the cache (which will do the git pull using NGit), let’s see what the Library’s folder looks like.

In this instance of TM, as we can see by the TMConfig.config file:

The Library files are located in the TM_Libraries_12 folder:

And if we now trigger the cache reload:

We will see that there is a new TM4TM folder:

which is a git repository

with its remote set to the Git Read-Only Url

After the cache reloads:

There are now 8 Libraries loaded in TM:

The reason for the extra 6 libraries (when we only added one new repository) is that from TM 3.3, there can be more than one library file in a library folder (note: the recommendation is to have one library xml file per folder)

Also note that the library name/caption is now independent from the xml file name:

Let’s now open TM’s Edit mode

and use it to rename the TM4TM_RTST2 Library:



After the rename, a number of things happened.

1) The TM4TM.xml library file contents changed:

2) There was a local commit with the change:

3) the auto push to GitHub failed

This is confirmed by the commit list at GitHub:

and the ‘push error’ we got on the TBot’s DebugInfo page

In this case I do want to push the changes, so back in GitHub I copied the SSH git url

And used it directly on a git push (I could also have done this by setting up a new remote)

Now the commit created by TM (on library rename) exists in GitHub:
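A detail worth noting: git accepts a URL directly in place of a remote name, which is what happened above with the SSH url. Here is a local-path demo of the same idea (the path stands in for the git@github.com:... url, and the OWNER/REPO in the comments is a placeholder):

```shell
git init --bare upstream.git             # stands in for the GitHub repository
git init work && cd work
git config user.email demo@example.com && git config user.name Demo
git commit --allow-empty -m "library rename"
git push ../upstream.git HEAD            # push straight to a URL/path, no remote needed
# equivalent longer route, via a named remote:
#   git remote add github git@github.com:OWNER/REPO.git
#   git push github master
```

The one-off URL form is handy exactly for cases like this, where the configured remote is read-only but you have push rights via another url.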

Removing the extra Library files:

Since we don’t need the extra library xml files, I just removed them (and committed the changes)

Which means that after cache reload,

there are now 3 libraries in my local TM instance:

Creating QA versions of TeamMentor UserData repository, and using branches to show/test the multiple config options

Now that a number of TeamMentor settings can be configured from the UserData repositories, we need a way to test and document what can be done.

Let’s start by creating a public GitHub repository ( to hold the multiple examples/tests:

Set the GitUser location to it:

And reload the user data:

Once the reload is complete, there will be a new folder called User_Data_Git_UserData_Customizations in the local XmlDatabase folder:

Note that if you are running TM from localhost (as in the current example) then the user data will not be auto committed (due to the dynamic nature of UserData, if GitAutoCommit was enabled it would not be possible to load userdata repositories used on live TM sites (like the multiple Site_nnn.git ones) without creating commit conflicts):

Before we move to the branches let’s commit the current TMSecretData.config and admin files:

Use Case #1: Changing version by running Customized Javascript code

The first example is going to show how to execute some Javascript in the main TM GUI from a file provided in the user data folder.

Let’s create a branch to hold the changes:

Add a folder called WebRoot_Files:

Add a folder called _Customizations (inside the WebRoot_Files)

Add a JavaScript file called TM_Custom_Settings.js inside the _Customizations folder:

Note: the reason for this file is that it is automatically included (if it exists) in the consolidated Javascript download that is done on the main TM GUI. Here is the mapping file, which also shows the execution order of this script:

Next edit the TM_Custom_Settings.js file and use it to (for example) change the TM.tmVersion value.
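As a sketch of the steps above (the local folder name is hypothetical, and the assumption that the GUI reads a global `TM.tmVersion` value comes from the post itself), the customization file could be created like this:

```shell
# Hypothetical local path for the UserData clone created above
mkdir -p UserData_repo/WebRoot_Files/_Customizations

# Assumption: the TM GUI exposes a global 'TM' object whose tmVersion
# value is rendered in the page, so overriding it changes what is shown
cat > UserData_repo/WebRoot_Files/_Customizations/TM_Custom_Settings.js <<'EOF'
TM.tmVersion = "3.3 - Customized";
EOF
```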

And in TBot, trigger a Cache Reload

If you keep an eye on the TM_WebSite folder (the root of the TM website), you will see that it looks like this before the Cache Reload

and like this after the Cache Reload:

What happened is that the contents of the UserData’s WebRoot_Files was copied into TM’s web root.

Which means that the TM_Custom_Settings.js created above is now part of TM:

A hard refresh of the browser will now show the customized TM.tmVersion value:

To wrap up this branch, let’s modify the file:

Commit the changes:

And push the branch to GitHub (note the explicit branch mapping on the git push command):
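A minimal, self-contained sketch of that push (a local bare repo stands in for the GitHub remote; names are illustrative):

```shell
# Scratch setup: a bare 'GitHub' repo plus a working clone
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare github.git
git init -q userdata && cd userdata
git config user.email you@example.com && git config user.name you
git remote add origin ../github.git
echo '{}' > TMSecretData.config && git add . && git commit -qm 'initial commit'
git checkout -qb UseCase_Javascript_Customization

# the actual push: the <local>:<remote> refspec makes the branch mapping
# explicit, creating the branch on the remote if it does not exist yet
git push -q origin UseCase_Javascript_Customization:UseCase_Javascript_Customization
```

When the local and remote branch names match, `git push origin UseCase_Javascript_Customization` is equivalent.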

A quick look at GitHub’s repo:

will show our branch and modified files:

Use Case #2: Setting Google Analytics (server-side) value by running Customized C# code

In this example we will enable and configure the server-side Google Analytics settings (used to provide metrics on TM usage)

Note: see the Running Customized C# code loaded from TeamMentor’s UserData repository post to understand the role of the FirstScriptToInvoke.h2 script

Let’s create a new branch called UseCase_CSharp_Customization using as a starting point the existing UseCase_Javascript_Customization
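The branch creation above can be sketched like this (scratch repo; the key part is passing the existing branch as the start-point of `git checkout -b`):

```shell
# Scratch repo with the Use Case #1 branch already in place
tmp=$(mktemp -d) && cd "$tmp"
git init -q userdata && cd userdata
git config user.email you@example.com && git config user.name you
echo settings > TM_Custom_Settings.js && git add . && git commit -qm 'initial commit'
git checkout -qb UseCase_Javascript_Customization

# the key command: 'git checkout -b <new-branch> <start-point>'
git checkout -b UseCase_CSharp_Customization UseCase_Javascript_Customization
```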

Modify the WebRoot_Files/_Customizations/TM_Custom_Settings.js file:

In the UserData repository (which is checked out on the UseCase_CSharp_Customization branch), add the H2Scripts folder

Inside that folder add the FirstScriptToInvoke.h2 file

Which (as explained in this post) will be executed on startup.

Before adding code to the FirstScriptToInvoke.h2 file, it’s good to test/debug that code using TM’s CSharp REPL:

and confirm on the log viewer that Google Analytics (GA) calls are now being logged:

Once we’re happy with the C# snippet to execute, we can add it to the FirstScriptToInvoke.h2 file:

To double check that the Google Analytics (GA) settings are being configured on TM setup, I restarted Cassini, and confirmed that the log viewer shows the FirstScriptToInvoke.h2 file execution:

and the successful configuration/use of Google Analytics:

The final step is to update the file

Add the files to git

Commit the changes:

And push the commit into a new branch at GitHub:

Changing a User’s ExpiryDate from GitHub hosted file

For the cases where TeamMentor UserData is loaded from a GitHub repository, it is possible to change/manage user data directly from GitHub’s web GUI (or from a local clone of that repository).

Let’s take, for example, Danny’s account, which is expired at the moment (today is 4/10/2013):

In GitHub, this is the file that contains Danny’s user data:

So we open and edit that file:

Change the ExpirationDate to a value in 2014

Commit the changes:

Reload the UserData:

And the Danny account details at the server is now set to the new date:

This is one of the nice side effects of having the ability to push TM’s user data into a Git repository (another advantage is that we now have a fully backed-up, logged and hashed change-history for our users)

Linus gift to the world will be Git not Linux (and what about an OS built on top of an hash-driven file system?)

I know it is a big claim, but I think that Linus Torvalds will be more famous for creating Git than for his work on Linux

Linux is a great example of OpenSource development and a good OS. Its impact is mainly technical and behind the scenes.

Git is a hash-based file system with built-in version control. Its impact is not only technical but social.

The more I use Git, the more I appreciate its beauty, simplicity and ability to scale while handling complex workflows.

See A must watch TED talk about GIT and democracy for an example of how Git can/will be used to change how information is managed in our society.

Also think about the power of having a ‘Git Powered’ OS (where all files and actions are Git controlled/tracked). We could finally get a lot of security, resilience, quality assurance and traceability from the multiple software/APIs/Apps that we use/consume.

Git also allows its technical users (like me) to be creative in finding ways to improve their productivity and workflows. See Changing a User’s ExpiryDate from GitHub hosted file or these Git, GitHub and NGit posts, for examples of the wide range of areas that I have been using git for (in TeamMentor and O2Platform development)

What the move from HTML to WikiText looks like (in GitHub)

Copy and paste of HTML is such a mess (even today in 2013).

I just converted a couple of TM articles (originally converted from a Word doc) from HTML into WikiText, and the difference in the amount of code (and complexity) is shocking

Example #1: Bug Database Definitions

Here is what the article looks like:

And here is the ‘Html to Wiki’ Commit

Example #2: Where to Post TeamMentor Issues/Comments

And here is the ‘Html to Wiki’ Commit

In this 2nd case, ironically, the WikiText version looks better (compare the screenshot below with the one above) because of the lack of HTML formatting mess:

Is Git a Single point of failure for TeamMentor?

Danny is getting into Git and just asked me this:

“is it possible for Git to be a single point of failure for TM? If Git went down or offline, wouldn’t that be a problem?”

The short answer is “NO, in fact Git is a distributed point of success for TM”

Let’s start with the differences between Git and GitHub.

Think of Git as a ‘file-based database of multiple versions of a particular file, with one version shown in the file system’, i.e. ‘Git’ is the .git folder and a checkout version of the files (in the file system)

Think of GitHub as a ‘web based location to store and share the .git folder’

This means that a Git repository doesn’t go ‘down or offline’. A Git repository is just a .git folder, and if you wanted to remove Git from a particular folder/repository, you could just delete that .git folder (and you would be left with the latest ‘checked-out version of the files’)

In terms of GitHub going down, there are two main scenarios:

1) GitHub loses the .git folders that it hosts (i.e. it loses the git repositories) - this would be a pain, but as long as there is one clone of those repositories, there shouldn’t be much/anything lost. This is why I say that ‘Git has distributed points of success’: basically, every clone or fork that exists is in effect a backup of the code (there are a couple of things, like remote information, that are not cloned, but those are minor)

2) GitHub is down - the impact will depend on how long this lasts. At the moment we use GitHub as a central way to distribute and publish source code between TM devs and servers, so if GitHub is down (more specifically, GitHub’s Git hosting service, not the website), then we can’t push the commits made (note that the devs can still work on their local git clones, and TeamMentor users can still edit Git based Libraries). Even in the case where GitHub is down for a significant time, there are easy solutions to implement (especially when you compare with the fact that if GMail, Google or Twitter goes down, there is nothing we can do). Part of the power of Git is that every commit is tied to its hash, which means that while GitHub is down, we could host our own git server (see here how I did it using apache) or push commits directly to Azure (which can also act as a Git server). Then once GitHub is back online all we need to do is to push the commits to it.

The other scenario that could happen on GitHub is that those repositories are maliciously manipulated (let’s say by an internal attack that is able to add extra commits). But since one of the key advantages of Git is that it is a DRCS (Distributed Revision Control System) and the entire version history is present on all clones/forks, in practice the next ‘real’ commit would fail, which should raise the alarm (i.e. that developer would need to do a git pull before being able to push the code)

The way I look at Git is that it creates a ‘virtual file system, fully hashed and with version control’, which you can read more about in the post: Linus gift to the world will be Git not Linux (and what about an OS built on top of an hash-driven file system?)

Setting up Ian’s CI Development Environment (for TeamMentor)

Now that Ian (and Kofi) have pushed a couple of commits (to Ian’s fork of TeamMentor), it’s time to set up Ian’s CI dev environment, so that his commits can be automatically tested and viewed on a live instance of TeamMentor.

The first thing to do is to go to Azure and create a website to hold Ian’s fork

Which is going to be

See the post Creating a new TeamMentor test site using TeamCity, GitHub and Azure for more details about how TeamCity is usually configured. The only major change for Ian’s version, is that TeamCity is going to track the ‘azure’ branch (vs the master branch)

(See my next post for more details on how this azure branch was created)

Once TM is set up and this TeamCity build is ‘Run’, the Azure site goes from:



And when completed we will have a clean TM site based on Ian’s repository (azure branch):

Finally, so that Ian has some data to play with, I logged in as admin and quickly added a couple libraries:

Running NUnit tests in TeamCity:

I also configured TeamCity to run the NUnit UnitTests from Ian’s solution.

And in this case there is 1 test that fails:

which is also failing locally (i.e. in VisualStudio):

Which means that there is a side effect of one of Ian’s code changes (which he will need to fix on his repo :) )

Confirming Git to TeamCity to Azure

To double check that the workflow is working ok, let’s make a file change:

In the TM website’s Settings.js file:

let’s append Ian’s name to the build version:

This change is picked up by Git:

Where we can commit it locally (note the ‘azure’ branch)

and Push it to GitHub:

Making sure both local and remote branches are azure:
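One way to double-check that pairing (sketched in a scratch repo; pushing with `-u` records `origin/azure` as the upstream of the local branch):

```shell
# Scratch setup: push the azure branch with -u so it tracks origin/azure
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare github.git
git init -q tm && cd tm
git config user.email you@example.com && git config user.name you
git remote add origin ../github.git
echo v1 > Settings.js && git add . && git commit -qm 'initial commit'
git checkout -qb azure
git push -qu origin azure

# '@{u}' resolves to the upstream of the current branch
git rev-parse --abbrev-ref --symbolic-full-name @{u}    # prints: origin/azure
```

`git branch -vv` shows the same tracking information for every local branch at once.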

After the push is done:

TeamCity will trigger the build (TeamCity checks for new GitHub commits every 60 seconds):

The new version will be pushed to Azure (which is fast):

A refresh of the site confirms the change (notice the ‘Ian’ next to the version):

Re-deploying a previous ‘Azure deployed’ version.

Note that it is also possible to go back a couple deployed versions in Azure. Just select the deployment and click on ‘REDEPLOY’

Which will change the active deployment:

7. May 2013

  • Example of two TeamMentor sites using the same GitHub Content Library
  • AzureGate - how Azure’s ‘subscription upgrade’ crazy mode caused us to stop using Azure for VM Hosting (and Git+GitHub saved the day)
  • Great presentation on Git Branching (very similar to the model we are using in TeamMentor)
  • Fixing bug in TBot user editor via Git merge of fix developed on another repo’s branch
  • Releasing HotFix 1 for TeamMentor 3.3 (using Git to deploy updates to live servers)
  • Great post - Git: Who cares about branches? It’s all about collaboration and code reviews
  • Creating website using GitHub Pages (with screenshots of all design options)

Example of two TeamMentor sites using the same GitHub Content Library

Now that TeamMentor 3.3 is able to automatically commit, pull and push from live GitHub repositories, we are able to support quite interesting set-ups and workflows.

For example at the moment there are two live TM4TM sites:

Both are configured to consume data from the Lib_TM4TM repository:

Yesterday, Serge was making some changes on the server, which were automatically committed (locally) and pushed into the Lib_TM4TM GitHub Repo:

For example, here is one of the articles that Serge changed:

which looks like this on the server

and like this on the server

So at the moment the same page has different content on these servers.

There are two ways to fix this:

  1. make a change on the server (which will trigger a git pull and push), and do a cache reset
  2. do a cache reset (which will also do a git pull)

Since a cache reset will be needed in both cases, the second option is our best choice here.

So, I opened the TBot page for the server:

And triggered a cache reload:

A quick look at the server logs confirms that a git pull took place:

And the site is now updated with the latest content:

Note that the current plan is to run TBot as a constant server thread, which will then be able to monitor the GitHub’s content repository and automatically do git pulls (when needed).

AzureGate - how Azure’s ‘subscription upgrade’ crazy mode caused us to stop using Azure for VM Hosting (and Git+GitHub saved the day)

Late last night all the main TM hosted sites went down!

The reason is Azure’s crazy ‘subscription expired’ workflow; you can read what other Azure users had to say when it happened to them, and Microsoft’s view on it (note how the crowd in the comments is not happy with it)

Below is the email I sent internally at SI, with my debrief on what happened:


Ok so we are back online with

Here are some notes:

  • The outage lasted about 14h
  • This is an example of a ‘worst case scenario’ where our ‘data center’ effectively went down
  • The good parts are that:
  • The prob was picked up quite quickly
  • We had a ‘fully working’ site (i.e. with the content and users configured) up in about 15m (namely the one I created, which was available on a direct IP)
  • Roman was able to create a replacement server (for all 4 sites) in about 2 hours (and that could have been 30m if Roman had not hit a permission issue (and had done this type of deployment before))
  • The fact that TM’s team is spread across multiple time-zones allowed us to react quickly
  • Due to TM’s current architecture, if we did have customers who NEEDED to have access to TM guidance ASAP (see comment below) we would have had several solutions for them (including giving them a full download to run locally)
  • This was the first time we really put the new 3.3 TeamMentor ‘auto commit and push to GitHub’ architecture in action, and I’m happy that it worked quite well (we can still fine-tune it a bit, BUT this could have been MUCH worse (note that even now we still don’t have access to the old TM VMs since the VM login accounts are not working as expected, so if the userdata was not on GitHub, we would not have been able to restore the users))
  • The bad parts are that:
  • Nobody really cared :(
  • Where were the ‘urgent calls for TM to be up’, the twitter hashtag of TM failure!!!, the ‘vulnerabilities not being fixed because TM is down’?
  • The sales guys were not ‘up in arms’, ‘potential sales were not affected’
  • Ed or Jason were not woken up in the middle of the night with a “WTF, TM is down!!!” (note: I was about to go to bed when I noticed that something was wrong in TM’s world)
  • As Roman mentioned: “We drove for 14h on the wrong side of the road, and nothing happened”
  • We found the hard way that we can’t trust Azure for live sites
  • Azure really fucked us up. Roman just confirmed that a couple of days ago the Azure GUI was still saying something like ‘your subscription is going to end in 20ish days’ (and destroying VMs and their configuration is a very crazy way to handle ‘account suspension for lack of payment or wrong subscription mode’)
  • We also found a ‘single point’ of failure where MK was the only one that could change the DNS (this is the main reason that it took 14h vs 2h/30m)
  • and since ‘nobody was really complaining’ about TM’s MIA, there was no urgency to contact MK (who was dealing with a number of personal probs related to his “car losing a wheel on the motorway!”)

Since we now have a fully working TM environment, we can spend a couple days thinking about the best place to host the TM VMS.

I like EC2, but we can go anywhere that gives us good VM management


Great presentation on Git Branching (very similar to the model we are using in TeamMentor)

Just saw this presentation on Git Branching (embedded below) which is really close to the model we are currently using to manage TeamMentor’s app development.

I really agree with just about everything Lemi Ergin says and this is a great description of the power of Git for branching

Fixing bug in TBot user editor via Git merge of fix developed on another repo’s branch

Here is an example of how I just created a HotFix branch to address an issue we want to push to our live servers asap, and how the fix was developed by Ian in one of his dev branches.

First I created the HotFix branch at a (freshly baked) local clone of the TeamMentor/Dev repository:

Then I reviewed the code from Ian’s branch that I wanted to merge:

When happy with the changes, I used a git fetch to get the latest version of Ian’s fork of TeamMentor/Dev

Followed by a $ git merge Dev_Ian/#437-Password-Expiry HotFix_3_3_1

Which did the merge with the TeamMentor/Dev master branch
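The fetch-and-merge workflow above can be sketched like this (a local clone stands in for Ian’s fork, registered as the Dev_Ian remote; branch and commit names follow the post):

```shell
# Scratch setup: 'dev' is the main clone, 'fork' stands in for Ian's fork
tmp=$(mktemp -d) && cd "$tmp"
git init -q dev && cd dev
git config user.email you@example.com && git config user.name you
echo v1 > TM_User.cs && git add . && git commit -qm 'initial commit'
git checkout -qb HotFix_3_3_1
cd .. && git clone -q dev fork && cd fork
git config user.email ian@example.com && git config user.name ian
git checkout -qb '#437-Password-Expiry'
echo fix > TM_User_Fix.cs && git add . && git commit -qm 'Date value being saved to database'

# back in the main clone: fetch Ian's fork and merge his branch into the HotFix
cd ../dev
git remote add Dev_Ian ../fork
git fetch -q Dev_Ian
git checkout -q HotFix_3_3_1
git merge -q 'Dev_Ian/#437-Password-Expiry'
```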

A look at GitK confirms that there was only one commit added (Ian’s ‘Date value being saved to database’ commit)

At this stage, if we look at Ian’s network map, we will see that this commit is not linked to another commit (i.e. it is the last one of the #437-Password-Expiry branch)

Next step is to quickly test if the feature is working ok.

This fix is for the Password expiry cannot be set from the main TM GUI issue (i.e. make the ‘Account Expiration’ field editable).

So I opened a user’s edit page

changed the expiration date:

Saved it

And confirmed that the user’s xml data was changed on the in-memory version of the user xml files:

and on the file system:

My final step was to push the HotFix branch into the live server:

Here are the commits in the new HotFix_3_3_1 branch (note that Ian’s ‘Date value being saved to database’ commit is now there)

And now Ian’s branch is connected with the new HotFix_3_3_1 branch:

same graph without the branch labels:

Releasing HotFix 1 for TeamMentor 3.3 (using Git to deploy updates to live servers)

This is how I updated the 3.3 version of TeamMentor to 3.3.1, which contained a fix for the Password expiry cannot be set from the main TM GUI issue

Since we are now using the Vincent Driessen GitFlow branching model (see also these Git-Flow scripts and this great presentation), after the issue 437 was reported+prioritised, all development happened on a Feature Branch called HotFix_3_3_1 (which was created from a Pull Request from Ian’s own 437-Password-Expiry dev branch).

Once TM’s QA (ie. Roman) was happy with the patch, it was time to push it to the first batch of TM production servers (my responsibility was to update the server, while Roman updated the , and the rest will be updated by Michael K + Michael H). Technically I did a pull from those servers :)

Here are my steps:

1) I RDP’d into the production server and opened up the respective TM folder (which is a clone of

2) opened up the Git Sync gui

3) and did a Pull

And that’s it, update done :)

I would say that this took me about 1m and Roman says he did his in 58 secs :)

Just to make sure all is good:

4) opened up (note the new version number on top right)

5) and confirmed that all was good (first load took a couple secs since there was a server side IIS W3WP process refresh, caused by the updated bin folder)

6) I then opened TBot’s user management page (for my user)

7) changed the expiration date (this was issue 437, since it was not possible to change this value from the web, only via GitHub (see Changing a User’s ExpiryDate from GitHub hosted file) or a REPL script (see Using CSharpRepl to batch change TeamMentor’s users email and settings))

8) Saved the changes

9) and confirmed that they were committed ok locally (and into GitHub)

GitHub’s Graphs and Git helper pages

One of the great things about using GitHub as part of our workflow is that we can use its Graphing capabilities to visualize what is happening.

For example

Here is the last commit:

Which just changes the version number

Note that this commit happened on the HotFix branch. So the next step is to push it into TeamMentor/Master (the production repo)

This is what TeamMentor/Master looks like for 3.3 (before hotfix commits)

This is what TeamMentor/Master looks like after the 3.3.1 commit

And this is what the TeamMentor/Dev Repo looks like after the commit (this is the development clone/fork, which has all the other feature branches)

I also updated the tags (using the commands described in Adding Tags to TeamMentor Master repository)

Here are the git tags before the push:

Here are the Git tags after the tag push:
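The tagging commands boil down to something like this, sketched against a local bare repo standing in for GitHub (the tag name follows the v-prefixed convention used for TM releases):

```shell
# Scratch setup: a release commit in a clone with a bare 'GitHub' remote
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare github.git
git init -q master-repo && cd master-repo
git config user.email you@example.com && git config user.name you
git remote add origin ../github.git
echo 3.3.1 > version.txt && git add . && git commit -qm 'Release 3.3.1'

git tag v3.3.1              # lightweight tag on the release commit
git push -q origin v3.3.1   # push just this tag
git push -q origin --tags   # or push every local tag in one go
```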

Great post - Git: Who cares about branches? It’s all about collaboration and code reviews

This is a great explanation of the power of Git : Who cares about branches? It’s all about collaboration and code reviews

here is the author’s TL;DR:

TL;DR: Using Git has made our team much better by removing barriers to collaboration and code reviews. Those are the real Git benefits, not specific features like fast branches.

I completely agree, and it is the ability to easily review code (and send code back for rewrite during pull requests) that really makes Git powerful :)

On this topic also read Great presentation on Git Branching (very similar to the model we are using in TeamMentor)

Creating website using GitHub Pages (with screenshots of all design options)

In order to set up a site for the domain, I just used GitHub Pages to create and publish a brand new site.

This is what it looks like:

What do you think?

Please see below the other design options and let me know if I made a mistake. Also if you want help in editing this site, ping me with your GitHub account, and I’ll give you push access (or fork the repo and send me pull requests)

Here is how I created it:

I went to the repo, and on the settings page, I clicked on the Automatic Page Generator

which opened up this page, where I added the content that is current on this blog’s O2 Platform page:

When finished I clicked on Continue to Layouts

And I checked out the multiple design options:



Time Machine:


Leap Day:


Hack:




And finally Dinky

which was my preferred choice, so I clicked on Publish, and here it is in action (using the default domain provided by GitHub)

The next step was to set up a custom domain, which is explained here:

and is basically a case of creating a new file:

Called CNAME, with the domain as its contents

After saving it (and with the local DNS cache flushed), the GitHub Pages site now points to
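The CNAME step can be sketched like this (example.com is a placeholder for the real domain, which is not shown here):

```shell
# Scratch repo standing in for the GitHub Pages repository
tmp=$(mktemp -d) && cd "$tmp"
git init -q pages-site && cd pages-site
git config user.email you@example.com && git config user.name you

# a file called CNAME at the repo root, containing just the bare domain
echo "example.com" > CNAME
git add CNAME && git commit -qm 'Point GitHub Pages at the custom domain'
```

On the DNS side, the domain itself still needs a record pointing at GitHub Pages; the CNAME file only tells GitHub which domain to serve.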

8. June 2013

  • Creating TeamMentor release 3.3.2 (3.2 version with HotFix 2)
  • Fixing a couple bugs and pushing new TeamMentor 3.4 Dev Version (from 4 to 5)
  • Gource Visualisation of “TeamMentor Git Development - 18 Months in 180 Seconds”

Creating TeamMentor release 3.3.2 (3.2 version with HotFix 2)

Now that the two P0 issues are marked as fixed (after a round of QA):

It’s time to publish the 3.3.2 TeamMentor release.

At the moment the code changes are in the 3_3_2_HotFix branch

Which contains the commits that made up the 3.3.2 – RC2 version (with the last commit being the 852d877290)

As set in our release process, to make this the official version, I will remove the RC2 bit from the version number and make it the final commit for this release.

So I opened a Git Bash on a local copy of that repo TeamMentor/Master (same thing as doing a git clone and pull of the 3_3_2_HotFix branch)

A quick look using git log --decorate --graph --oneline --date-order shows that the latest commit is 852d877 (which matches the version at GitHub that was QAed and checked for this release)

My next step was to change the version number,

… commit that small change:

… add the v3.3.2 tag (see Adding Tags to TeamMentor Master repository for more details on tagging)

… and pushed into TeamMentor/Master the commit and tag:

Just to confirm, let’s take a look at GitHub:

Main page shows the 3.3.2 commit

… so does the commit page:

… and the Tags view:

…and the Network graph:

The deployment of this version now passes into the hands of the Infrastructure team, which will update all TM sites currently managed by SI.

On the development side, the last thing to do, is to add this version to the TeamMentor/Dev fork so that it is part of the next release

At the moment TeamMentor/Dev is on this commit

In a local copy of this repo, I did a pull from TeamMentor/Master

… which failed (on master) because there have been updates done on this repo (since the last merge).

So I created a new branch called 3_3_2_merge

Force-pulled the 3.3.2 code into it (the code from TeamMentor/Master)

switched back to the master branch and merged with the 3_3_2_merge branch

…which had (as expected) a couple of conflicts, which I’m going to resolve using

… which lists the conflicts

… and in this case was mainly the version number:

… easily fixed by making the local version the one to use:

… next I resolved the dlls by selecting one of them (it doesn’t really matter since they will be recompiled soon)

… and committed the merge:

… which we can now see in Gitk:

The final step is to push these commits into TeamMentor/Dev

… which can be seen in the following couple of graphs:

The image above shows the TeamMentor/Dev commit done before the 3.3.2 merge, and below is the last commit made

It might be easier to read without the labels (in blue is the TeamMentor/Dev code, in black the TeamMentor/Master code):
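The whole merge-with-conflict workflow above can be sketched as follows (scratch repos; origin stands in for TeamMentor/Master, and version.txt is a hypothetical stand-in for the conflicting version-number file):

```shell
# Scratch setup: 'upstream' plays TeamMentor/Master, 'dev' plays TeamMentor/Dev
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b master upstream && cd upstream
git config user.email you@example.com && git config user.name you
echo 3.3.1 > version.txt && git add . && git commit -qm 'base'
cd .. && git clone -q upstream dev && cd dev
git config user.email you@example.com && git config user.name you
echo '3.4 - Dev' > version.txt && git commit -qam 'Dev work'            # Dev diverges
cd ../upstream && echo 3.3.2 > version.txt && git commit -qam '3.3.2 release'
cd ../dev

git fetch -q origin
git checkout -qb 3_3_2_merge origin/master   # scratch branch holding the 3.3.2 code
git checkout -q master
git merge 3_3_2_merge || true                # the merge reports a conflict here
git checkout --ours version.txt              # keep the local (Dev) version number
git add -A && git commit -qm 'Merged 3.3.2 into Dev master'
```

Using `git checkout --theirs` instead would keep the incoming 3.3.2 value; here the local version wins, matching the resolution described above.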

Fixing a couple bugs and pushing new TeamMentor 3.4 Dev Version (from 4 to 5)

This post shows one way to use GitHub to update the main development branch of TeamMentor.

At the moment TeamMentor/Dev repo is at version 3.3 – Dev 4

This version was pushed on Friday, and it introduced a nasty side effect on the wsdl generation (see issue 546) and a minor bug in the user edit (from the old control panel).

Both probs were picked up by Michael’s TM UI Unit Tests, which are another good example of the power of this type of UI/Browser integration test.

So, I went into my local dev repo, made the code fixes, checked that they were working and executed all unit tests (to make sure we are still good)

Next I committed the changes locally:

And pushed the commit to the main Dev repo and my personal Dev repo (this is the same as doing a Pull Request and authorizing it via the GUI):

Since TeamCity is configured to run on every commit to the master branch of TeamMentor/Dev, a quick look at its web interface shows that the build started

… after 4m:13s the build was completed, with all unit tests passing, and

… and the Dev QA site pushed into Azure:

Gource Visualization of “TeamMentor Git Development - 18 Months in 180 Seconds”

Here is a pretty cool video of 18 months of Git commits, created using Gource, a software version control visualization tool.

Try to see it in full screen and at 1024p HD quality:
Here are the Gource settings used to create this video:

gource.exe -s 1 --file-idle-time 0 --key --title “TeamMentor” --font-size 30 --hide dirnames --date-format “%d/%b/%y” --bloom-multiplier 0.5 --bloom-intensity 0.5 -f

Since the original video was 10 minutes, I used Camtasia Clip Speed feature to compress it to about 3 minutes (~180 sec)

9. August 2013

  • Creating a clone of WebGoat on GitHub

Creating a clone of WebGoat on GitHub

I needed a couple of vulnerable source code examples (to use on the new TeamMentor Eclipse plug-in), so an obvious option was to use WebGoat (whose code is currently hosted on its Google Code page)

But since there wasn’t a source code download option (in the current download page)

… and this project is not using Git (sorry, but I can’t use SVN anymore :) … it’s too painful)

… I quickly created a clone of it using the $ git svn clone -s webgoat

… which downloaded the entire source code and available history:

When completed (it took a little bit since there was quite a bit of history)

I had this File Structure:


This Git repo Size:

This Git History:

which goes back all the way to 2006!

These Branches:

Note that after the svn clone, the current git master branch is the original svn trunk.

But as we can see from the above list, there is already a webgoat-6.0 branch going on (in fact most of the recent code updates are done there), so here is how we can create+checkout a git tracking branch for it:
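That create+checkout command can be sketched like this (a plain git clone simulates the `remotes/origin/webgoat-6.0` ref that `git svn clone -s` produces):

```shell
# Scratch setup: a source repo with a webgoat-6.0 branch, then a clone of it
tmp=$(mktemp -d) && cd "$tmp"
git init -q svn-mirror && cd svn-mirror
git config user.email you@example.com && git config user.name you
echo code > webgoat.java && git add . && git commit -qm 'trunk'
git branch webgoat-6.0                 # the branch seen in 'git branch -a'
cd .. && git clone -q svn-mirror webgoat && cd webgoat

# the key command: create+checkout a local branch tracking the remote one
git checkout -b webgoat-6.0 remotes/origin/webgoat-6.0
```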

… which will make the file system look like this now:

… and the Git History like this:

Next step is to push this version to the newly created repo (in OWASP GitHub organisation):

On the local repo add a remote:

… and push --all
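The remote-add and push --all steps, sketched against a local bare repo (the real remote URL would be the newly created OWASP GitHub repository’s):

```shell
# Scratch setup: 'owasp.git' stands in for the new GitHub repository
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare owasp.git
git init -q webgoat && cd webgoat
git config user.email you@example.com && git config user.name you
echo code > webgoat.java && git add . && git commit -qm 'import from svn'
git branch webgoat-6.0       # the svn branch recreated earlier

# register the new remote and push every local branch in one go
git remote add github ../owasp.git
git push -q --all github
```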

Once the upload completes:

… the code will be at GitHub:

including the webgoat-6.0 branch:

Finally I updated the OWASP WebGoat page to make references to this new GitHub repo:

And that’s it!

Now you can go to and clone (or download the zip) of OWASP’s WebGoat :)

10. September 2013

  • Git Flow - Moving patches from one Commit into another Commit
  • Example of using GitHub Pull Requests to merge changes made on Branches
  • Script to Git Clone 13 repositories in order to have all TeamMentor Libraries in one folder

Git Flow - Moving patches from one Commit into another Commit

This (longish) post will cover detailed git workflows and is part of the series of blog posts that show how we use the Git Flow workflow to manage TeamMentor’s source code (you will also see practical applications of GitHub’s powerful features like Network Graphs and Pull Requests).

The key problem that we are going to solve is the situation created by Michael Hidalgo’s TeamMentor fixes/commits/branches, which were done against a commit (38bfcd54d8046372c0ace2409324ecc965761504) that was originally planned to be part of the next release; but we decided that the next 3.4 Release of TeamMentor will be based on the current 3.3.3 version (which is based on the earlier commit: b97a470ffa173d67a9c74373593eea03eb7a2da4).

The key reason is that the 38bfcd54d8046372c0ace2409324ecc965761504 commit (currently the parent of Michael’s fixes/branches) is not stable and is now going to be the basis of the 3.5_Release (this code contains a number of big changes which need more TLC and testing: native ASP.NET MVC routing, better Git support, native Markdown editor, deprecation of the HTML WYSIWYG editor, and more)

In a nutshell, we need to re-apply Michael’s bug fixes to an earlier commit than the one used (i.e. backport those commits).

To start, here is what Michael’s branches look like at the moment (note that all have the 38bfcd54d8046372c0ace2409324ecc965761504 commit as parent):

Here is the commit (b97a470ffa173d67a9c74373593eea03eb7a2da4) that we want to have as the parent, since this is the commit that is currently on the 3.3.3 release (and will be the basis for the 3.4 release of TeamMentor):

Basically, what we need to do is to ‘just’ backport the branches linked to the 38bfcd54d8046372c0ace2409324ecc965761504 commit into the b97a470ffa173d67a9c74373593eea03eb7a2da4 commit.

Note: since this post was getting quite long, I moved some workflows into Appendixes (included below) so that the key actions/changes can be read in sequence.

Using the workflow described in the Appendix 1) Creating patches from Michael’s branches here are the patches to apply (i.e. these are all changes from the branches currently available in Michael’s dev repository):


  • Fixing the master branch and creating a feature branch for the 3.5_Release (so that the TeamMentor/Dev master branch is in sync with the TeamMentor/Master master branch, and the 3.5 commits are not lost)
    • … see Appendix 2) Creating a 3.3.3 tag and branch in Dev repository
    • … see Appendix 4) Creating a 3.5_Release Feature branch
  • Applying the 6 patches that merged without conflict
    • … see Appendix 3) Applying patches

…we get the following TeamMentor/Dev ‘not merged branches’:

After the pull requests are made into a new 3.4_Release branch (see Appendix 5) Creating a 3.4_Release Feature branch for more details) we have 5 Issues/branches applied (and ready for QA):

Here is the graph view, with TeamMentor/Dev master (blue line below):

…. now being the parent of the Issue_142, Issue_51, Issue_400, Issue_475 and Issue_459 branches:

This concludes the main part of this post, which showed how to handle the scenario where fixes (and branches) were applied to a commit whose release schedule changed (and there was the need to back-port those changes into an earlier commit).

I think it is important to note that the workflow shown here is a great proof of the power of Git (I can’t even imagine doing this in SVN).

In fact, in this case, we are paying the price for not being more formal in the use of Git Flow workflows, and for not being more strategic on where we applied simple fixes (like the ones shown here).

I.e. this should be easier next time.

That said, it took me orders-of-magnitude more time to write this blog post, than to actually make these changes/fixes :)

Appendix 1) Creating patches from Michael’s branches:

To create the patches, I grabbed a fresh clone of Michael’s dev repo (which is a fork of TeamMentor/Dev)

Then, on a git bash of this repository, I created a new branch that pointed to the current 38bfcd54d8046372c0ace2409324ecc965761504 commit, using the commands: $ git checkout 38bfcd54d8046372c0ace2409324ecc965761504 and $ git checkout -b Patch_Parent

The reason I picked the 38bfcd54d8046372c0ace2409324ecc965761504 commit is because this is the commit that all Michael’s current branches are based on:

Using the $ git branch -a command, we can see that this local repository/clone already contains the branches we need:

Let’s start with a simple one, for example the changes on Issue 534:

whose changes are on branch Issue_534

In order to create the patch, I created a local tracking branch using the command $ git checkout -b Issue_534 remotes/origin/Issue_534

I then created a patch using $ git format-patch Patch_Parent

…which created the file 0001-Fixing-Issue_534.patch:

… containing these changes:

… and these ones:

Note: the reason the patch is about 1Mb is because Michael (on this branch) also committed a bunch of *.dlls which should not be there.

One more little thing: since we are going to create a number of these patch files, it is better to put them in a dedicated folder. This can be done using the command: $ git format-patch Patch_Parent -o ../_3.4_Patches

… with the ‘patch file’ now being placed in the folder:
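As a minimal, self-contained sketch of this patch-creation workflow (run in a throwaway repository, with illustrative branch and file names rather than the real TeamMentor ones):

```shell
# Throwaway repo demonstrating 'git format-patch <parent> -o <folder>'
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"
echo "base" > file.txt
git add file.txt
git commit -qm "base commit"
git branch Patch_Parent              # marks the commit the patches are relative to
git checkout -qb Issue_Demo          # a fix branch, one commit ahead of Patch_Parent
echo "fix" >> file.txt
git commit -qam "Fixing Issue_Demo"
# one .patch file per commit since Patch_Parent, written to a dedicated folder
git format-patch Patch_Parent -o "$repo/_patches"
```

Each generated file (here 0001-Fixing-Issue_Demo.patch) carries the commit’s author, date and message, which is what later allows git am to recreate the original commit.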

Here is the same process for Issue_565:

… with the patch created in:

We can also create the patches without creating a tracking branch. For example here is how to create a patch for the code at the Issue_51 branch:

Note that the 0001-Fixing-Issue-51.patch file is much smaller (3k) than the others.

That is because this patch only contains text diffs (and no binaries), which is how all patches should be:

Finally here are all the patch files created (containing all commits made by Michael’s branches):

Appendix 2) Creating a 3.3.3 tag and branch in the TeamMentor/Dev repository

In order to be able to apply the changes into the TeamMentor/Master master branch, I created a branch in the current TeamMentor/Dev that points to the last common commit between TeamMentor/Master and TeamMentor/Dev (this way the commits can be pushed into the TeamMentor/Master master branch, and eventually pulled into the TeamMentor/Dev 3.5_Release branch)

Since b97a470ffa173d67a9c74373593eea03eb7a2da4 is the last commit in TeamMentor/Master that also exists in TeamMentor/Dev, this is the commit we are going to use as the parent for the patches/branches to apply:

To do so, I started by opening up my local dev repo (currently in sync with the latest commit to Dev), and executed $ git checkout b97a470ffa173d67a9c74373593eea03eb7a2da4

I then created a tracking branch (called 3.3.3_Release) and added a tag (called v3.3.3), using the commands: $ git checkout -b 3.3.3_Release and $ git tag -a v3.3.3 -m '3.3.3 Release'

I then pushed the 3.3.3_Release branch and v3.3.3 tag into the TeamMentor/Dev repository, using the commands: $ git push dev 3.3.3_Release:3.3.3_Release and $ git push dev v3.3.3

Following these commands (and without the pushes that will happen next) we can see the 3.3.3_Release branch in the TeamMentor/Dev network graph
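These publishing steps can be reproduced against a local bare repository standing in for the GitHub ‘dev’ remote (a sketch with assumed names and content, not the real repos):

```shell
set -e
work=$(mktemp -d)
git init -q --bare "$work/dev.git"            # stand-in for the GitHub 'dev' remote
git init -q "$work/clone"
cd "$work/clone"
git config user.email "demo@example.com"
git config user.name "Demo"
echo "3.3.3" > version.txt
git add version.txt
git commit -qm "3.3.3 release commit"
git remote add dev "$work/dev.git"
git checkout -qb 3.3.3_Release                # branch at the release commit
git tag -a v3.3.3 -m "3.3.3 Release"          # annotated tag at the same commit
git push -q dev 3.3.3_Release:3.3.3_Release   # publish the branch
git push -q dev v3.3.3                        # publish the tag (tags are not pushed by default)
```

Note the two separate pushes: a plain branch push never carries tags with it, which is why the v3.3.3 tag needs its own push.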

Appendix 3) Applying patches

We are now going to apply the patch files (previously created) into the 3.3.3_Release branch of the current local clone of TeamMentor/Dev

Starting with the 0001-Fixing-Issue_142.patch which is a simple change:

To get a preview of what will change when we apply a patch, we can use the command: $ git apply --stat ../_3.4_Patches/0001-Fixing-Issue_142.patch

To see if we are going to have any errors when applying a patch, we can use the command: $ git apply --check ../_3.4_Patches/0001-Fixing-Issue_142.patch

In this case, the fact that we saw no messages from the --check command (shown above) means that we can merge this patch file ok:

… in this case the change was applied on top of our current branch code, but no commit was made (the files were just changed on disk).

Since we want to preserve the original commit, we will need to use another command.

First let’s reset the current change:

… and before we apply the 0001-Fixing-Issue_142.patch, let’s create the Issue_142 branch, using the command git checkout -b Issue_142

Now let’s apply the patch using the command: $ git am --signoff < ../_3.4_Patches/0001-Fixing-Issue_142.patch

…which will add a commit containing the original commit message and author:
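To see this author-preserving behaviour in isolation, here is a hedged sketch in a throwaway repository (names are illustrative): the patch is created under one identity and applied under another, and git am keeps the original author while recording the integrator as committer.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "author@example.com"
git config user.name "Original Author"
echo "base" > file.txt
git add file.txt
git commit -qm "base commit"
base_branch=$(git symbolic-ref --short HEAD)   # master or main, depending on git version
git checkout -qb fix_branch
echo "fix" >> file.txt
git commit -qam "Fixing Issue_Demo"
git format-patch "$base_branch" -o patches     # one patch file for the fix commit
# switch identity to the integrator and apply the patch on a fresh branch
git checkout -q "$base_branch"
git config user.email "integrator@example.com"
git config user.name "Integrator"
git checkout -qb Issue_Demo
git am --signoff patches/0001-Fixing-Issue_Demo.patch
git log -1 --format='author: %an, committer: %cn'
```

This is exactly why git am is preferred here over git apply: the recreated commit keeps Original Author as author, while the --signoff trailer and the committer field record who integrated it.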

Next we push this branch into TeamMentor/Dev

And confirm that the Issue_142 changes are in the correct location (i.e. with the b97a470ffa173d67a9c74373593eea03eb7a2da4 commit as their parent):

Note how the light blue line is connected from the b97a470ffa173d67a9c74373593eea03eb7a2da4 commit (see above) into the newly pushed 5319e3028da01c64d09409b833c4f33bc49b7208 commit (see below)

… which is the current head of the Issue_142 branch

The next image shows how we can use GitHub’s UI to create/view the pull request for this branch:

Note how in the screenshot above the Issue_142 branch is 129 commits behind master.

That is caused by the fact that master is currently at the commit 16354b3ec1757f56f0ee1594de3c72bb506f6537 and it should be at the commit b97a470ffa173d67a9c74373593eea03eb7a2da4

See Appendix 5) Creating a 3.4_Release Feature branch and merging branches for how that was fixed.

After mapping the current master commit into the new 3.5_Release branch and doing a force reset of the master branch, we get the Issue_142 branch correctly set up: 1 commit ahead and 0 commits behind the master branch:

With the TeamMentor/Dev master branch in the correct location, let’s apply more patches to it:

For example Issue 51, using the commands:

$ git apply --check ../_3.4_Patches/0001-Fixing-Issue-51.patch (check if patch can be applied)
$ git checkout -b Issue_51 (create patch branch)
$ git am --signoff < ../_3.4_Patches/0001-Fixing-Issue-51.patch (apply patch and preserve original commit)
$ git push dev Issue_51:Issue_51 (push branch into GitHub)

This makes the Issue_51 branch also 1 commit ahead and 0 commits behind the master branch:

With this workflow in place, I quickly did the same for the branches Issue_384, Issue_400, Issue_475 and Issue_459

At the moment we have these branches to merge (Appendix 5) Creating a 3.4_Release Feature branch and merging branches will show them in action):

Note that there were numerous patches (534, 565, 193, 285, 461, 517, 504, 527, 462 and 445) that didn’t merge correctly.

For example, this is what happened for the 0001-Fixing-Issue-565.patch when executing the command $ git apply --check ../_3.4_Patches/0001-Fixing-Issue-565.patch

These will need to be handled separately (which is a topic for another blog post, since this one is already getting a bit long :) )

Appendix 4) Creating a 3.5_Release Feature branch

In order to make the current TeamMentor/Dev match TeamMentor/Master in terms of the master branch, we need to move the current master of TeamMentor/Dev into a feature branch called 3.5_Release. In a way, we were using the master of TeamMentor/Dev as a ‘feature branch’, which was ok while that code was going to become the 3.4 release (which now it isn’t).

First step is to move into the current master using the command $ git checkout master

Then we create the 3.5_Release feature branch using the command $ git checkout -b 3.5_Release

Next we push this branch into TeamMentor/Dev

At this moment, in the GitHub repo, TeamMentor/Dev’s master and 3.5_Release point to the same commit (16354b3ec1757f56f0ee1594de3c72bb506f6537):

Now comes the sledgehammer :)

We’re going to (first locally) do a hard reset into the b97a470ffa173d67a9c74373593eea03eb7a2da4 commit, using the command $ git reset --hard b97a470ffa173d67a9c74373593eea03eb7a2da4 (remember that this commit is the common one between TeamMentor/Master and TeamMentor/Dev)

After this hard reset, the TeamMentor/Dev master is aligned with the 3.3.3_Release branch and v3.3.3 tag (previously created)

We can also double check this, by using the command $ git gui

… followed by the Visualize all Branch History menu option:

…and see that the Issue_142 branch is now a child of the current TeamMentor/Dev master (which is in sync with the TeamMentor/Master master)

Finally we are ready to apply the sledgehammer to the repository hosted at GitHub, by forcing a push using the command $ git push -f dev master:master

Which makes the TeamMentor/Master look like this:

… with the Issue_142 branch having the master/3.3.3_Release branch as parent (see the red/brown line)

… and the 3.5_Release branch containing the commits that were previously in the master branch (see yellow line)

Finally, a look at the current branches in TeamMentor/Dev shows that Issue_142 is correctly 1 commit ahead and 0 commits behind the master branch (which means that it is ready for a pull request)
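The whole sequence (park the unstable work on a feature branch, rewind master, force-push) can be sketched end-to-end against a local bare repository, so nothing real gets rewritten; names and file content here are illustrative:

```shell
set -e
work=$(mktemp -d)
git init -q --bare "$work/dev.git"       # stand-in for the GitHub remote
git init -q "$work/clone"
cd "$work/clone"
git config user.email "demo@example.com"
git config user.name "Demo"
echo "stable" > code.txt
git add code.txt
git commit -qm "stable release commit"
stable=$(git rev-parse HEAD)
echo "unstable 3.5 work" >> code.txt
git commit -qam "3.5 work (not release ready)"
git remote add dev "$work/dev.git"
git push -q dev HEAD:master              # remote master is at the unstable commit
git branch 3.5_Release                   # keep the unstable tip reachable first!
git reset -q --hard "$stable"            # rewind the local branch
git push -qf dev HEAD:master             # the sledgehammer: remote master rewound too
```

The order matters: creating 3.5_Release before the reset is what keeps the unstable commits reachable after master has been rewound.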

Appendix 5) Creating a 3.4_Release Feature branch and merging branches

At this point we have these branches ready to commit (via a pull request)

Instead of merging them into the TeamMentor/Dev master branch, we are going to create a TeamMentor/Dev 3.4_Release branch using the command $ git checkout -b 3.4_Release and push it to TeamMentor/Dev using the command $ git push dev 3.4_Release:3.4_Release

The reason for this branch is so that the TeamMentor/Dev master branch stays aligned with the TeamMentor/Master master branch (which is the current official release), and only QA’d changes are pushed into TeamMentor/Master (first into the 3.4_Release branch, and eventually into the official TeamMentor/Master master branch). Note that we will most likely rename the TeamMentor/Master repo to TeamMentor/Release.

Next step is to create a pull request from the current Issue_XYZ branches into the 3.4_Release branch.

Let’s start with Issue_459, by clicking on its Compare button:

On the next page, click on Edit:

… to change the base branch (into 3.4_Release):

Then click on the Click to create a pull request for this comparison link

… click on the Send the Pull Request button:

… click on the Merge pull request button

… and the Confirm Merge button:

We could now delete the branch (but I’m not going to do that at this stage, since first I want to see these merged branches in a GitHub Network Graph):

Back in the Branches not merged into master list, although the Issue_459 branch is still 1 commit ahead of master, we now have the 3.4_Release branch with 2 commits ahead:

The two commits of the 3.4_Release branch are one from the Issue_459 branch and one from the pull request merge (note above how we could now do a Pull request from this 3.4_Release branch into the master branch):

After doing the same workflow for Issue_475 branch:

… the 3.4_Release branch is 4 commits ahead:

And after doing the same workflow for the Issue_142, Issue_51 and Issue_400 branches/issues, the 3.4_Release branch is 10 commits ahead (with 5 Issue_Xyz branches applied):

The TeamMentor/Dev graph also shows this workflow in action (note that if I had deleted the branches after the pull requests, we wouldn’t see the tags in this network graph)

One important note is that Issue_384 didn’t merge automatically with 3.4_Release, which means that there is a conflict between this code and one of the changes made by the applied branches (i.e. Michael will need to fix this and resubmit the patch)

Wrapping up: Feedback and better git commands:

If you made it this far, it would be great to have some feedback on this git workflow (and solution).

And if you know of better ways to solve problems like this one, please ping us with your ideas, since there is still far too much Git functionality that I/we are not aware of.

Example of using GitHub Pull Requests to merge changes made on Branches

After the fixes explained in the Git Flow - Moving patches from one Commit into another Commit post and the reset of the TeamMentor 3.4 branch, Michael reapplied his other changes/fixes to the correct 3.4 commit, and I’m now in the process of merging his Pull Requests into the 3.4_Release branch (and eventually into the master branch).

This post walks through my current workflow.

At the moment there are a number of Pull Requests to process:

… which were all created using Git Branches:

In the image above, the top lines show the commits/branches that have already been committed, and the bottom ones the branches that still need to be committed (currently on the ‘open’ Pull Requests)

Git Pull Request workflow

1) open the Pull Request page:

2) click on the link to the issue that is being fixed:

3) read the issue (and its history)

4) back in GitHub’s Pull Request, click on the Files Changed link to see the proposed code changes:

5) if I’m happy with the request, on the ‘Discussion’ tab, I click on the Merge pull request button

… followed by Confirm merge

6) optional: if this was a repo that I owned, I would also delete the branch; in this case, Michael will have to do it on his repo/fork

7) optional: confirm on GitHub’s Network Graph that the merge happened ok (i.e. the commit is now on the 3.4_Release branch and the Issue_462 branch no longer is shown on Michael’s fork)

8) optional: check that the respective issue has been correctly tagged/linked with this pull request

… do this for the other Pull Requests….

Here is what the Network Graph looks like after all merges have occurred:

At the moment there are only two branches that need to be merged:

1) : currently conflicting (i.e the merge cannot happen automatically):

2) – no idea what issue this is fixing (the link to the GitHub issue is missing)

Hopefully this shows the power of Git and GitHub’s commit/review workflow where:

  • each bug has a separate Issue, Branch and Pull Request
  • code review of proposed changes is really easy to do
  • multiple fixes can be done in parallel with very few conflicts
  • conflicts (when they exist) are easy to identify and deal with
  • GitHub’s visualizations make a massive difference in making this workflow really smooth
  • everything done by GitHub is based on git commands, which means that all actions could have been done locally, on git clones of TeamMentor/Dev and michaelhidalgo/Dev

In fact, speaking of manual steps, now that we have the 3.4_Release branch with (just about) the final set of commits, I’m going to merge the 3.4_Release branch into the master branch (which will eventually become the release one)

To do that, I opened a (local) clone of TeamMentor/Dev:

… updated it (since it is out of date with the changes made directly on the GitHub’s version), using the command: $ git pull origin

… see all branches available, using the command: $ git branch -a

… merged 3.4_Release branch into master branch, using the command $ git merge remotes/origin/3.4_Release

… pushed these changes into GitHub, using the command $ git push origin master:master

(note how no files were changed with this push, since all the data was already in the 3.4_Release branch; this push was just telling GitHub’s version: ‘please point the master branch at the 3.4_Release commit’)

After this commit, GitHub’s network graph will show that the master branch is now at the same commit as the 3.4_Release branch
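This ‘pointer move with no file changes’ is git’s fast-forward merge: when master is a direct ancestor of the branch being merged, git just advances the master pointer and creates no new commit. A throwaway sketch (illustrative names):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"
echo "base" > f.txt
git add f.txt
git commit -qm "base"
base_branch=$(git symbolic-ref --short HEAD)   # master or main, depending on git version
git checkout -qb 3.4_Release
echo "release work" >> f.txt
git commit -qam "3.4 work"
release_tip=$(git rev-parse HEAD)
git checkout -q "$base_branch"
git merge -q 3.4_Release            # fast-forward: no merge commit is created
```

After the merge, both branches point at the exact same commit, which is why the subsequent push transfers no file data.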

But we are not done here; we still need to update the compiled TeamMentor Dlls (and see if any UnitTests broke)

Let’s start by opening up the solution file in VisualStudio 2010:

… then clean and build the solution:

… which succeeded ok:

Next, change the version number to TM 3.4 – Dev 20

And start TM locally (just to see if all looks good):

Now, it’s time to run all UnitTests (in this case using the ReSharper NUnit plugin):

… with two tests failing:

The first one was easy to fix (it was a case of updating the UnitTests to match the changes made to the TM_User required fields):

The 2nd one was caused by a change in the Google Analytics file:

Here is the test that does this check:

… which basically checks that the file we are using is still the same one served by Google (this is a good security practice, since TeamMentor’s security is then not dependent on Google’s servers).

The fix is to update that file:

… and rerun all tests (just to confirm it):

Committing changes made locally.

Keeping with the model of only doing commits on branches, I quickly created a new branch, using the command: $ git checkout -b 3.4_Dll_Updates

(note how the small changes I made were also marked as ‘Modified’, namely the version change, the UnitTests fixes and the recompiled dlls)

… added the files to be committed using the command: $ git add .

… created a commit using the command: $ git commit -m 'Changing version, adding compiled Dlls, fixing couple UnitTests'

… pushed this branch to GitHub (not 100% necessary, but it will help with the graph), using the command $ git push origin 3.4_Dll_Updates:3.4_Dll_Updates

… applied these changes to the 3.4_Release branch (locally and at GitHub), using the commands:

  • $ git push origin 3.4_Dll_Updates:3.4_Dll_Updates
  • $ git checkout 3.4_Release
  • $ git merge 3.4_Dll_Updates
  • $ git push origin 3.4_Release:3.4_Release

(note that this is an example of a ‘manual Pull Request’)

A quick look at GitHub’s network graph, shows the 3.4_Release branch at the same commit as 3.4_Dll_Updates branch (both one commit behind the master branch)

Finally we update master with these changes, using the commands:

  • $ git checkout master
  • $ git merge 3.4_Release
  • $ git push origin master:master

… and now all branches are at the same level:

Testing QA version created by TeamCity and deployed to Azure:

As shown in past blog posts, we also have TeamCity configured to monitor TeamMentor commits and auto-publish new builds into Azure.

In this case, after the latest commit into the GitHub TeamMentor/Dev master repo, TeamCity picked up the changes and:

  • built the code
  • published to Azure
  • ran all unit-tests

At the moment there is one unit test failing:

… which doesn’t look problematic (it feels like a TeamCity-specific case).

The Azure deployment went ok:

With a clean version of TM ready for testing:

…easily populated with a couple libraries:

Running TM locally from Zip File

A final test is to go to the main site and click on the Download Zip button:

…extract the zip files into a local folder, and click on the ‘start TeamMentor.bat’ file:

…which will start the .Net Cassini webserver on port 12120:

(note how the version number matches the commit made earlier)

At this stage:

  • TM is just about ready for a final round of QA
  • TM 3.4 RC1 will be created as soon as:
    • Michael fixes the couple pending issues
    • All UI UnitTests pass
    • All backend UnitTests pass (shown above)

Script to Git Clone 13 repositories in order to have all TeamMentor Libraries in one folder

As part of the push for the 3.4 release of TeamMentor, I wanted to have a copy of all TeamMentor libraries locally (there are 13 libraries on the 3.4 release).

Since O2 Platform’s FluentSharp has native Git support, I was able to create the clones using this script (note how simple it is to create a clone from a GitHub repo):

var baseFolder = @"E:\TeamMentor\Libraries\SI Library";
var contentRepo  = "{0}.git";
var libraries = new [] { "Lib_PHP", "Lib_CWE", "Lib_iOS", "Lib_Android", "Lib_PCI_DSS_Compliance",
                         "Lib_.NET_4.0", "Lib_.NET_3.5", "Lib_.NET_2.0", "Lib_Java", "Lib_CPP",
                         "Lib_Vulnerabilities", "Lib_Scala", "Lib_HTML5" };
var stopWatch = utils.new_Stopwatch();
foreach(var library in libraries)
{
    var gitRepo      = contentRepo.format(library);
    var targetFolder = baseFolder.pathCombine(library);
    if (targetFolder.isNotGitRepository())
        gitRepo.git_Clone(targetFolder);
}
return "Cloning took: " + stopWatch.stop().minutes_Seconds_And_Miliseconds();

//using FluentSharp.Git
//O2Ref:FluentSharp.NGit.dll

The script takes about 1m to run:

And the end result is a folder with all libraries cloned:

With each folder containing the git repository for that library

Next, I zipped all these files into the SI Library – file (note that they all must be on the root of the zip)

Then, on a local QA TM instance, I:

  • went into the admin panel,
  • chose to upload the zip,
  • triggered the installation (i.e. unzip) of those libraries
  • rebuilt the cache:

Once that was completed, a reload of the home page shows the 13 libraries:

Including the new Html5 library:

… and the new Scala library

11. October 2013

  • Fixing the Merge conflict caused by one extra commit on TeamMentor master
  • Enabling GitHub Two Factor Authentication
  • Syncing all releases to the same commit and Tag (for TeamMentor v3.4)

Fixing the Merge conflict caused by one extra commit on TeamMentor master

On the 3.4 release of TeamMentor (which was the first release where we really used Git Flow in development; see this great presentation on the Git Branching Model), we ended up with a situation where the commit that was the parent of all feature/fix branches was off by one commit from the master of the TeamMentor/Master repository (we also had to do a bunch of back-porting of fixes into that commit; see the Git Flow - Moving patches from one Commit into another Commit post)

In practice this means that the TeamMentor/Master graph currently looks like this:

… with the master branch on the commit fe26934d489e65660bd67be7811effcbccad1d19

… and the 3_3_3_Hotfix branch on commit b97a470ffa173d67a9c74373593eea03eb7a2da4

But looking at the TeamMentor/Dev Graph

…we can see that all commits (done with the ‘one branch per issue’ workflow) have the b97a470ffa173d67a9c74373593eea03eb7a2da4 commit as their parent (see image above and below)

In practice this means that the final 3.4 release commit from the TeamMentor/Dev repo

… is incompatible with the TeamMentor/Master repo (note that these could be branches of the same repo, but I like the use of separate repositories, since they provide a nice air-gap between development and production repositories)

Actually, in principle they could be merged automatically if there were no conflicts!

But if we look at that extra commit from the TeamMentor/Master repo (the fe26934d489e65660bd67be7811effcbccad1d19 one)

… we see that the change was made on the version number (which in the 3.4 release will now say 3.4)

Note that GitHub will not allow a Pull Request to be made in cases like this, since GitHub has no online merge capabilities.

Ok, so how do we solve this?

The solution is to:

  1. create a local branch pointing to b97a470ffa173d67a9c74373593eea03eb7a2da4
  2. do a pull from TeamMentor/Master to get the fe26934d489e65660bd67be7811effcbccad1d19 commit
  3. merge the current 3.4 code into fe26934d489e65660bd67be7811effcbccad1d19 (which will cause a conflict)
  4. solve the conflict,
  5. commit the result
  6. push to GitHub into a new branch (called 3.4_Merge)
  7. do a pull request (from 3.4_Merge into master)

In a local clone of TeamMentor/Dev we start by creating a branch that points to b97a470ffa173d67a9c74373593eea03eb7a2da4

This can be done using the command: $ git checkout b97a470ffa173d67a9c74373593eea03eb7a2da4

Followed (as the help message suggests) by: $ git checkout -b 3.4_Merge

Next we do a pull from TeamMentor/Master using $ git pull master:3.4_Merge

The command above is basically saying:

Go to the repo and merge/add the commits from its master branch into the local 3.4_Merge branch

Note how the line b97a470..fe26934 master -> 3.4_Merge (from the screenshot above) shows how we went from the b97a470ffa173d67a9c74373593eea03eb7a2da4 commit to the fe26934d489e65660bd67be7811effcbccad1d19 commit

Next we merge into the 3.4_Merge branch the contents of the master branch (which contains the 3.4 code), using: $ git merge master

… which predictably failed with a conflict on Settings.js

Solving git conflicts

My preferred UI to solve conflicts is the one provided by TortoiseGit, which you can access from here:

… then, on the popup window that shows up, double click on the conflicted file:

… and on the TortoiseMerge GUI:

… choose the option to Use ‘theirs’ text block

… which will update the bottom pane with the fixed version of Settings.js (in this case with no changes from before)

Save the changes and choose yes to mark the file as resolved:

Close TortoiseMerge and (since there are no other conflicts) click OK on the Resolve GUI

… another OK:

As the multiple ‘notes’ in the previous UIs mention, we need to commit the changes.

This commit will contain all changes including the conflict fixes

Once the commit is done:

Go back to the Git Bash and push this branch into TeamMentor/Master (I prefer to do these things on a Git Bash)

After the push, this is what the TeamMentor/Master graph looks like:

…with the 3.4 code now being there:

Finally, what we can do now is to issue a Pull Request:

… from the 3.4_Merge branch:

… into the master branch:

… which contains all the code changes since the 3.3.3 release

With the best part being that this Pull Request can be merged using GitHub’s UI (since there are no conflicts)

And that’s it!

Hopefully this provided a good example of how to use Git and TortoiseGit to easily merge commits and resolve any resulting conflicts.

Tip: How to delete branches in GitHub:

To delete a branch in GitHub, we do a push from an ‘empty branch’ into an ‘existing branch’

In this case, if I wanted to delete the 3.4_Merge branch at the TeamMentor/Master repository, I would use: $ git push :3.4_Merge
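A runnable sketch of this ‘push an empty source ref’ deletion, using a local bare repository as the remote (assumed names):

```shell
set -e
work=$(mktemp -d)
git init -q --bare "$work/remote.git"    # stand-in for the GitHub repo
git init -q "$work/clone"
cd "$work/clone"
git config user.email "demo@example.com"
git config user.name "Demo"
echo "hi" > f.txt
git add f.txt
git commit -qm "init"
git remote add origin "$work/remote.git"
git push -q origin HEAD:3.4_Merge        # create the remote branch
git push -q origin :3.4_Merge            # empty left-hand side: delete the remote branch
```

Newer git versions also accept the more readable $ git push origin --delete 3.4_Merge, which does the same thing.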

Enabling GitHub Two Factor Authentication

Inspired by Google’s Two Factor Authentication workflow, last month GitHub did the same thing.

I just enabled it, and I strongly recommend that you do it too.

As per the instructions in GitHub’s Two-factor Authentication post, the first step is to go to and click on the ‘Set up two-factor authentication’ button:

… which requires the current password to be entered:

In this case I’m going to use SMS:

Next we enter the phone number and click on Send code:

… enter the number received by SMS and click Enable

And that’s it, two-factor authentication is now enabled:

Creating Tokens to access repos (instead of pwds)

Also great from a security point of view: it is also possible to create ‘login tokens’ for HTTPS logins.

This is done on the Applications Settings page:

… where new tokens can be created:

… which can now be used instead of passwords (with the great advantage of being revocable and assignable to a particular use, let’s say a particular deployment or app)

I really like this functionality, and hope to eventually add something similar to TeamMentor

Syncing all releases to the same commit and Tag (for TeamMentor v3.4)

This is a bit of housekeeping; as you can see from the Fixing the Merge conflict caused by one extra commit on TeamMentor master and Git Flow - Moving patches from one Commit into another Commit posts, not doing this has already caused us some pain in the past.

So after some pushes and pulls (of both commits and tags) I now have the main TeamMentor repos all synchronized at the 72ca4b5d3322901266ca294678cbe15aa343a4b3 commit:

TeamMentor/Release - now the new official home for TeamMentor releases (i.e. the ‘production code’)

TeamMentor/Master – the old TeamMentor official repo (and updated to help migrations into 3.4)

TeamMentor/Dev #1 (before the 3.5_Release and 3.6_Release merges) – this is the main development clone/fork

TeamMentor/Dev #2 (after the 3.5_Release and 3.6_Release merges) – note how 3.5 is currently at the same level as 3.4 (see Appendix 1 and Appendix 2 for how this was done, and how the merge conflicts were resolved)

This can be further confirmed by GitHub’s Branches view, where the 3.5_Release and 3.4_Release branches are synced with master:

… and the 3.6_Release branch is already 131 commits ahead of master (and 3.4_Release and 3.5_Release)

Updating the tags/Releases

I also updated the tags of 3 repos (Release, Master, Dev).

NOTE: I had to force the update of the tags, since there was already a v3.4 tag in there (I’m not 100% sure of the side effects of this, but I’m sure I’ll soon find out :) )

Release repo (new production release):

Master repo (legacy/previous production release) (not sure why GitHub’s layout of this one is different)

Dev repo:

Appendix 1: Updating 3.5 Release branch

This one was easy since the 3.5_Release branch was already synced with an earlier version of the 3.4_Release branch:

The push also confirms that this branch is at 72ca4b5d3322901266ca294678cbe15aa343a4b3

Appendix 2: Updating 3.6 Release branch

Merging the 3.4_Release into 3.6_Release was a bit more problematic due to the number of changes/fixes already done on 3.6 and the backporting of some fixes to 3.4.

After the merge failed, here are the conflicts that needed to be solved:

… with some resolved using the ‘theirs’ strategy

… a few using the ‘mine’ strategy

… and a couple using the ‘theirs first, then mine’ strategy (which I expect will need further fixing in VisualStudio)

Finally here is the commit that applies all 3.4 changes to 3.6 (including the merge fixes)

… and the respective push:

12. January 2014

  • How to update a forked GitHub repo (in this case tm-sme/Lib_Vulnerabilities)
  • Updating the GitHub repos for the 1.6.0 release of the Eclipse Fortify Plugin
  • Updating GitHub Forks with latest commits from GitHub’s ‘parent’ repo
  • Using TeamMentor 3.4 TBot admin pages to load and sync a Library hosted on GitHub
  • Using TeamMentor 3.4 TBot admin pages to load and sync UserData with a GitHub hosted repo
  • Adding files to TeamMentor’s web root via a UserData folder (synced with GitHub)

How to update a forked GitHub repo (in this case tm-sme/Lib_Vulnerabilities)

Today I helped to update the tm-sme/Lib_Vulnerabilities repo, which is a fork of TMContent/Lib_Vulnerabilities and is being auto-updated in real-time when changes are made to the server (i.e. every time there is a content change, there is a server-side git commit, followed by a git pull to tm-sme/Lib_Vulnerabilities, which is a pretty sweet workflow)

The issue we had was how to push the changes from tm-sme/Lib_Vulnerabilities into the TMContent/Lib_Vulnerabilities repo, so that they can be synced back to

Note: this workflow would have been easier if the two repos were in sync, but it happened that there was one commit made to TMContent/Lib_Vulnerabilities (which is the master repo) on the 13th of Dec (d26f385), in between a bunch of updates to the tm-sme/Lib_Vulnerabilities repo (done automatically by TeamMentor). Bottom line: at this stage the repos are not compatible, which is why the GitHub Pull Requests don’t work.

Here are the Git commands I executed locally to merge these repos successfully:

Step 1) Clone repo and try to do a simple pull

1) git clone
2) cd Lib_Vulnerabilities/
3) git remote add upstream
4) git checkout -b mergeBranch
5) git pull upstream master:mergeBranch
which doesn’t work:

! [rejected] master -> mergeBranch (non-fast-forward)
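This ‘non-fast-forward’ rejection is easy to reproduce locally. The sketch below uses git fetch (the part of git pull that refuses the branch update) and two throwaway repos; all names are illustrative, not the real GitHub repos:

```shell
# Reproduce the 'non-fast-forward' rejection with two throwaway local repos.
set -e
tmp=$(mktemp -d) && cd "$tmp"

git init -q parent && cd parent
git config user.email you@example.com && git config user.name you
echo one > file.txt && git add . && git commit -qm initial && git branch -M master
cd .. && git clone -q parent fork

# both sides now gain their own commit, so the histories diverge
(cd parent && echo two > parent.txt && git add . && git commit -qm 'parent-only commit')
cd fork
git config user.email you@example.com && git config user.name you
echo three > fork.txt && git add . && git commit -qm 'fork-only commit'

# pulling origin/master into a side branch is refused: it cannot fast-forward
git branch mergeBranch
git fetch origin master:mergeBranch 2>fetch.log || true
grep 'non-fast-forward' fetch.log
```

The rejection is git protecting the mergeBranch pointer: moving it to origin/master would throw away the fork-only commit, so git refuses unless the update is forced.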

Step 2) Create a local (forced) copy of the main repo and do the merge locally

6) git checkout -b upstreamVersion

7) git pull -f upstream master:upstreamVersion

8) git checkout mergeBranch

9) git merge upstreamVersion

which works:

Merge made by the 'recursive' strategy.
 LICENSE.TXT | 50 ++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 50 insertions(+)
 create mode 100644 LICENSE.TXT
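Steps 6–9 (force-fetch the upstream branch into a local branch, then merge it locally) can be sketched with throwaway local repositories; repo and branch names below are illustrative:

```shell
# Sketch of steps 6-9: force-fetch upstream into a side branch, merge locally.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q parent && cd parent
git config user.email you@example.com && git config user.name you
echo one > file.txt && git add . && git commit -qm initial && git branch -M master
cd .. && git clone -q parent fork
(cd parent && echo two > parent.txt && git add . && git commit -qm 'parent-only commit')

cd fork
git config user.email you@example.com && git config user.name you
git checkout -qb mergeBranch
echo three > fork.txt && git add . && git commit -qm 'fork-only commit'

# the forced (+) fetch is allowed where the plain one was rejected
git fetch -q origin +master:upstreamVersion
git merge -q --no-edit upstreamVersion   # plain merge works: both histories share 'initial'
ls parent.txt fork.txt                   # both sides' changes are now present
```

Because the two histories still share the initial commit, a normal merge resolves the divergence and produces a merge commit containing both sides’ changes.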

Step 3) Push the merged files to both repos

10) git push origin mergeBranch:mergeBranch

Counting objects: 7, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (5/5), done.
Writing objects: 100% (5/5), 3.99 KiB | 0 bytes/s, done.
Total 5 (delta 2), reused 0 (delta 0)
To
 * [new branch]      mergeBranch -> mergeBranch

11) git push upstream mergeBranch:mergeBranch

Counting objects: 2971, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (774/774), done.
Writing objects: 100% (2890/2890), 431.68 KiB | 0 bytes/s, done.
Total 2890 (delta 2199), reused 2801 (delta 2116)
To
 * [new branch]      mergeBranch -> mergeBranch

Step 4) Merge into the main branch of the main repo

12) … the next step was done on GitHub using a Pull Request on the TMContent/Lib_Vulnerabilities repo

git push upstream mergeBranch:master

Step 5) Update the local master and the forked repo’s master

13) git checkout master

14) git pull upstream master:master

15) git push origin master:master

Here is what the main repo looks like after the merge:

Updating the GitHub repos for the 1.6.0 release of the Eclipse Fortify Plugin

As you can see from the recent Eclipse-related posts, I have been working on a plugin for Eclipse that shows TeamMentor guidance to users that have access to the Fortify Eclipse plugin (and *.fpr files). We are now in the final stages of releasing the first public version (1.6.0), which is actually made of two parts: an Eclipse Plugin builder (which is Open Source) and a small ‘Fortify-specific’ code-mapping script. Very soon these will be in separate projects, but for now they are all hosted at TeamMentor/TeamMentor_Eclipse_Plugin.

This post is just to document the current GitHub development model and where to find the main parts of this release.

As mentioned above, the master version of the code is at TeamMentor/TeamMentor_Eclipse_Plugin which currently looks like this:

One interesting point here is that for this release I did not use my main GitHub account (DinisCruz), but instead a much less privileged account (DinisCruz-Dev).

To see this in action, note how the Pull Request commits (into the master and develop branch) are made using the DinisCruz account:

… and the development commits are made using the DinisCruz-Dev account:

What I did was fork the TeamMentor/TeamMentor_Eclipse_Plugin repo into the DinisCruz-Dev account:

… which was then used during development (which in practice means that the DinisCruz-Dev account does NOT have commit privileges to the release version of the code base)

In terms of the 1.6.0 release, I also added a Git Tag to it (now possible to do via the GitHub web UI), so that this version can be easily accessed and downloaded from the repo’s Releases page:
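The same release tag could also be created from the command line; here is a minimal sketch in a throwaway repo (the tag message and file are illustrative):

```shell
# Creating and inspecting an annotated release tag in a throwaway repo.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q repo && cd repo
git config user.email you@example.com && git config user.name you
echo release > file.txt && git add . && git commit -qm 'release commit'

git tag -a v1.6.0 -m 'Eclipse Fortify Plugin 1.6.0'   # annotated tag
git tag -l                                            # lists v1.6.0
# on a real repo the tag would then be published with: git push origin v1.6.0
```

An annotated tag (-a) carries its own message, author and date, which is what GitHub’s Releases page surfaces; a lightweight tag would just be a pointer.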

In order to help installation and deployment of this plugin in Eclipse, there is also this Eclipse Update site repo TeamMentor/TeamMentor_Eclipse_Plugin_Deploy

… which also contains the v.1.6.0 release tag:

… and can be used inside Eclipse via a local clone of this repo, or via this temp update site (see more detailed installation instructions at: TeamMentor Plugin and Builder v1.5.6 (Source Code and Eclipse Update site)).

Now that this release is out of the way, I will try to write a number of blog posts that show how it works and how powerful the Eclipse Plugin Builder is (for example, to add support for more tools or to easily create Eclipse plugins that help developers write better/more secure code).

Updating GitHub Forks with latest commits from GitHub’s ‘parent’ repo

One of the areas that tends to cause problems with GitHub’s ‘Forking model’ workflow is the need to keep Forks updated with the commits that have been added to the Parent repo (i.e. the repo the Fork was created from).

To see real-world examples (and pains) of this issue, take a look at these posts:

For the example shown below, I’m going to update the DinisCruz-Dev/TeamMentor_Eclipse_Plugin repo, which is a Fork of TeamMentor/TeamMentor_Eclipse_Plugin (here also referenced as the Parent repo).

At the moment these repos are at the stage shown in Updating the GitHub repos for the 1.6.0 release of the Eclipse Fortify Plugin:

Here is what this looks like from the point of view of the Parent repo, TeamMentor/TeamMentor_Eclipse_Plugin:

And here is what this looks like from the point of view of the Forked repo, DinisCruz-Dev/TeamMentor_Eclipse_Plugin:

Looking at the common commits might help to visualize what is going on.

The last commit at the Fork is b4dad953767fcc674d9ca948fc0cfb762415f01c, which is shown below as the large LAST BLUE dot in the DinisCruz-Dev/TeamMentor_Eclipse_Plugin repo’s Network graph.

The same commit can be seen below as the large GREEN dot in the TeamMentor/TeamMentor_Eclipse_Plugin Parent repo (note that it is not the last one in this repo):

At the moment the Parent repo is at commit 2ac007c7385fd992fb5f6c6e4774cfdaaa88ba43 (which doesn’t exist in the Forked repo).

The practical consequence of this situation is that the Fork is currently in an ‘incompatible’ state with its Parent, and it will not be possible to send Pull Requests/code-fixes upstream (note that this is ‘by design’, since Git does not allow merges when there are no common parents).

The solution is to do a Pull Request from ‘Parent to Fork’ (i.e. from TeamMentor/TeamMentor_Eclipse_Plugin to DinisCruz-Dev/TeamMentor_Eclipse_Plugin), as seen below:

In this case, there is only one update (made directly on the Parent repo) that needs to be merged into the Forked repo, and more importantly we get the desired 2ac007c7385fd992fb5f6c6e4774cfdaaa88ba43 commit:

Here is how I created the Pull Request:

Which I then opened up in a browser logged in as DinisCruz-Dev (the DinisCruz account doesn’t have GitHub privs to make this merge)

And since the commits are all compatible, I can just click on the ‘Merge pull request’ button:

… to successfully apply the merge:

And now the Forked repo contains all commits that exist in the ‘parent’ repo (note the 2ac007c7385fd992fb5f6c6e4774cfdaaa88ba43 commit below).

A final good house-cleaning step is to also update the Master branch of the Fork:

… which makes the final version of the graph look like this:

Note that all these steps could have been done using the git command line, and in some cases that is better, since we have more control over the creation of new commits on merge (for example, note how every time I merged a Pull Request in GitHub a new commit was created, which is something that is not needed all the time).

For example, I prefer it when we can align the synced branches to the same commit, as was done in Syncing all releases to the same commit and Tag (for TeamMentor v3.4), shown below:
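A hedged sketch of that command-line alternative: when the fork has no local commits of its own, a --ff-only merge updates it without creating any merge commit at all. Throwaway local repos stand in for the GitHub parent/fork pair:

```shell
# Fork update with no merge commit: --ff-only just moves the branch pointer.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q parent && cd parent
git config user.email you@example.com && git config user.name you
echo one > file.txt && git add . && git commit -qm initial && git branch -M master
cd .. && git clone -q parent fork
(cd parent && echo two > new.txt && git add . && git commit -qm 'parent update')
parent_head=$(git -C parent rev-parse HEAD)

cd fork
git fetch -q origin
git merge -q --ff-only origin/master   # would fail rather than create a merge commit

[ "$(git rev-parse HEAD)" = "$parent_head" ] && echo 'fork matches parent, no extra commit'
```

The --ff-only flag is the safety catch: if the fork had diverged, the merge would abort instead of silently adding the kind of merge commit mentioned above.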

Using TeamMentor 3.4 TBot admin pages to load and sync a Library hosted on GitHub

Serge asked me to help make some changes to TeamMentor’s ASP.NET 3.5 library, and since we need a test server to look at what might be changed (and run some scripts), this is a good time to show how to use TeamMentor 3.4’s TBot pages to load a Library hosted on GitHub.

I will also show how, once the TM server is configured with a library using a Git URL, changes can be auto-committed/pushed to that Git server every time content is edited using TM’s web editors.

Step 1: Preparing the target TM server

Let’s start with an Azure-hosted TeamMentor server, for example this one:

Since we don’t need all those libraries in there (and in fact we want to make changes to the .NET 3.5 Library), let’s remove them all.

The easiest way to do it is to change the backend location of the TeamMentor XML files, which can easily be done by going into TBot:

… changing the TMConfig.config file XmlLibrariesPath value:

… and now, after the cache is reloaded:

… there will be no articles on this instance of TM:

TIP: If you want to quickly add a test Library to TM, you can use the old /admin panel option to install Libraries from a link or zip. Here is how I quickly installed the OWASP Library, by clicking on the OWASP link in this page:

… which makes my test TM instance now look like this:

Step 2. Create a Fork of the Target Library

In this case the TM Library I want to make changes to is the private repo tm-sme/Lib.NET_3.5, which is the one used by SI’s SME team to make changes for the next version of TM (the idea is that I will make changes in my Fork and then issue a Pull Request to this repo).

Next, logged in as DinisCruz-Dev (my day-to-day, not-very-privileged GitHub account), I clicked on the Fork link:

… chose DinisCruz-Dev as the Fork target:

… and after a couple of seconds I had a Fork of the tm-sme/Lib.NET_3.5 repo at DinisCruz-Dev/Lib.NET_3.5

Step 3: Configuring TeamMentor to load a Library from GitHub

This next step would be a bit different if the IIS user account of the target server were configured to use SSH, but since Azure doesn’t seem to support that, we will need to use HTTPS and hardcoded credentials to do this.

The good news is that GitHub now provides a nice way to create temporary access tokens, so I can use one in this blog post :)

On the GitHub’s Account Settings for the DinisCruz-Dev user:

I clicked on the Create new Token button in the Personal Access Tokens section (part of the Applications area)

… entered a name for it (the text on this page provides a good explanation of how this token should be used)

… and after it was created:

… I copied it into the clipboard:

In this case the token is 124f9ce43f8cecd7f56b2a9e412118b01f72cef7

Back on the DinisCruz-Dev/Lib.NET_3.5 main page, I clicked on the copy to clipboard button next to the clone URL:

… which in this case is

Since we need to use this from the Azure instance, we need to hard-code the username and password using the syntax https://{username}:{passwordOrToken}@github.com/{owner}/{repo}.git
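Assembling that URL can be scripted; all values below are placeholders (never publish a real token, and treat it exactly like a password):

```shell
# Building the hardcoded-credentials clone URL from its parts.
user='DinisCruz-Dev'
token='1234-fake-token'        # a GitHub Personal Access Token (placeholder)
repo='Lib.NET_3.5'
url="https://${user}:${token}@github.com/${user}/${repo}.git"
echo "$url"   # https://DinisCruz-Dev:1234-fake-token@github.com/DinisCruz-Dev/Lib.NET_3.5.git
```

Note that anything with access to the server’s config (or its process list) can read a URL embedded this way, which is why a revocable token is much better here than the account password.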

… which in this case will be

Next step is to go into TBot’s Edit SecretData page:

… enter the Git URL in the Libraries_Git_Repositores field and click Save:

… go into the Reload Server Objects page and click on the Reload Cache button

… which should take a little bit (depending on the size of the Library and server-side network connection speed)

… until a message shows up saying how many Libraries and GuidanceItems/Articles exist on the current server (which means that the git clone was successful and the TM server cache was reloaded)

Opening up TeamMentor shows that we now have the new Library installed in this Azure server:

Step 4: Configuring server to auto push commit changes into GitHub

By default, changes made on the server are not immediately pushed to the Git host.

That behavior can be changed by setting the AutoCommit_LibraryData setting in the TMConfig.config file to true:

Once that is set (and after a server restart or TMConfig settings reload), changes made to TM articles will be auto-committed locally and pushed to GitHub as soon as possible (i.e. there is a bit of a delay in case there are multiple edits going on at the same time).
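The server-side behaviour amounts to a commit-and-push cycle after each edit. A minimal sketch of the idea, with a local bare repo standing in for the GitHub library repo (file and message names are illustrative):

```shell
# Sketch of the auto-commit idea: after each edit, commit what changed and push.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare github.git               # stands in for the GitHub repo
git clone -q github.git library && cd library
git config user.email tm@example.com && git config user.name tm
echo v1 > Article.xml && git add . && git commit -qm 'initial content'
git push -q origin HEAD

# an article edit arrives; the server-side auto-commit would then run:
echo v2 > Article.xml
git add -A
git commit -qm 'TM auto-commit: article edited'
git push -q origin HEAD
git log --oneline | wc -l                   # 2 commits, both now on the 'remote'
```

This is why edits show up on GitHub shortly after saving: each save becomes an ordinary commit followed by an ordinary push.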

To see this in action, let’s open an article:

… click on the Edit WYSIWYG link:

… make a change to the article (see ‘THIS TEST’ below) and click on the Save Changes button

Once that is done, go back to the GitHub repo site, open the Commits page, and notice that there is an extra commit made just now (note: in TM 3.4 the commit is done using the server IIS user’s Git settings, which are usually not set, hence the ‘unknown user’ value; in TM 3.6 there is already a fix to use the currently logged-in TM user)

To really see the ‘real time’ commits and pushes, go back to the TM article and make another change (this time I’m using the ‘Notepad’ editor):

A couple of seconds after saving the changes, another commit will exist in GitHub:

… which is made up of the user’s changes (i.e. the diff of the changes made in the TM web interface)

Bonus Feature: Quick restore of a TeamMentor website

Not sure if it is obvious by now, but what we have created here is a live version of TM whose content changes are automatically synced (i.e. backed up) to an external Git repository.

This means that if we completely lost the current website (let’s say that Azure went down, or we had another episode of AzureGate - how Azure’s ‘subscription upgrade’ crazy mode caused us to stop using Azure for VM Hosting (and Git+GitHub saved the day) ), we could create a new instance of this TM website by just:

  1. Creating a new website using the latest release version from
  2. Logging in as Admin and, in TBot, adding the Git repo URL to the SecretData Libraries_Git_Repositores value
  3. Reloading the cache or restarting the server

That’s it :)

And with Azure’s APIs, this could all be scripted, which would make it even faster :)
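Stripped of the Azure specifics, the restore really is just a fresh clone. The sketch below fakes the disaster with local repos (a local bare repo plays the part of GitHub):

```shell
# The restore flow is essentially a fresh clone: with the content in Git,
# losing the web server loses no data.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare github.git
git clone -q github.git site && cd site
git config user.email tm@example.com && git config user.name tm
echo 'article content' > Article.xml && git add . && git commit -qm 'content backup'
git push -q origin HEAD
cd .. && rm -rf site                # the 'AzureGate' moment: the live server is gone

git clone -q github.git restored    # steps 1-3 above: point a new server at the repo
cat restored/Article.xml            # article content
```

Everything that was pushed survives the loss of the working copy, which is exactly the property that made Git+GitHub ‘save the day’ in the AzureGate episode.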

Using TeamMentor 3.4 TBot admin pages to load and sync UserData with a GitHub hosted repo

Continuing from where Using TeamMentor 3.4 TBot admin pages to load and sync a Library hosted on GitHub left off, this post shows how to use the same technique to sync TeamMentor’s UserData with a GitHub repo.

For more details on how the UserData repo/folder fits within TeamMentor’s architecture, see these posts:
  • Writing RazorSharp script to import TeamMentor users
  • Creating QA versions of TeamMentor UserData repository, and using branches to show/test the multiple config options
  • Creating a version TeamMentor which uses the new GitUserData.config file
  • Practical Example of using Web CSharpREPL in TeamMentor’s development/customizations
  • Using CSharpRepl to batch change TeamMentor’s users email and settings
  • Running Customized C# code loaded from TeamMentor’s UserData repository
  • Using NGit to create native Git support in Azure deployed app (with automatic pushes and pulls)

Step 1: Create UserData repo in GitHub

The first task is to create a Private repo to hold the UserData contents.

Important: Because it will contain sensitive data about the target TeamMentor instance (like password hashes, session IDs, emails, user activity tracking, SMTP account details and encryption key/salt), don’t create a Public repo!

In GitHub, log into the desired account and go to the New Repository page:

In this case I’m going to create the Site_TM_34_QA_Azure repo, following the convention we have for UserData repo names: Site_{Url} or Site_{ServerType}, which in this case gives Site_TM_34_QA_Azure

Step 2: Make sure the repo is created with at least one file

If you chose the option to add a default README file in the previous step, you can ignore this; but if you didn’t, you will need to make sure that this repo has at least one branch and one file (or the Git clone from TeamMentor will be left in a non-working state).
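The ‘non-working state’ is easy to see locally: a clone of an empty repo has no branches at all, so there is nothing to check out or pull against until a first commit exists. A sketch (GitHub’s README shortcut is simulated here with a plain file):

```shell
# A clone of an empty repo has no branches until something is committed.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare empty.git
git clone -q empty.git work 2>/dev/null   # clone succeeds, with a warning
cd work
before=$(git branch | wc -l)              # 0 branches so far

# one committed file (GitHub's README shortcut does the same) fixes it
git config user.email you@example.com && git config user.name you
echo '# readme' > README.md && git add . && git commit -qm 'first commit'
git branch | wc -l                        # now 1 branch exists
```

That first commit is what actually creates the branch; before it, any consumer that expects to pull a branch (like the TM server) has nothing to work with.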

The good news is that you can easily do that from GitHub’s interface.

In the Quick setup section, click on the README link:

… which will open a web UI where the file can be created:

And after clicking on Commit New File

… the target repo is now in a state that can be used by TeamMentor

Step 3: Create a GitHub Personal Access Token to be used to access this account from TM server

As with the previous scenario, this is done on this Admin page

On the resulting page, copy the token (in this case c78fa4d5dcf1b9f521a99396d667a00297734a2b )

This Token will be used together with the HTTPS git URL, in the format https://{username}:{password/token}@github.com/{GitUser}/{TargetRepo}.git. In this case:

Step 4: Configure TeamMentor Server to use GitHub’s UserData repo:

Next, open TBot’s Edit GitUserLocation page:

… and enter the Git Url (shown above) in the Git User Location textbox:

Important: In order to keep the UserData up to date, you also need to set the TMConfig.config AutoCommit_UserData value to true

Once that is done, reload the server cache (which will trigger the UserData setup):

Once that is completed, you will notice that you are logged out from TM.

This happened because a new UserData user store was created, which did not have any accounts. In those cases the TeamMentor server engine will create a default admin account using the details provided in TMConfig.config:

We can also double-check on TBot’s DebugInfo page that the UserData now points to a different folder (note that the folder name is based on the repo name)

We can also confirm in the DinisCruz-Dev/Site_TM_34_QA_Azure repo that the UserData default files have been created (which also confirms that the connection between TeamMentor and Git’s UserData repo is working ok)

Step 5: Create a new user and confirm that it shows up in GitHub

If you look at the Users folder in the GitHub repo, you should see one file in there (which represents the default admin user)

Next, create a new user (using for example the form provided in the home page of the target TeamMentor site):

Once the account is created:

Go back to TBot and reload the UserData (which will trigger a Git Pull and Push of the UserData repo):

Reload the GitHub Users folder and notice that there are two XML files in there:

A quick look at the commits of this repo will also show the commits created by TeamMentor’s backend:
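Under the hood, that ‘reload UserData’ step is a git pull followed by a git push. Sketched below with two clones of a local bare repo standing in for the GitHub UserData repo (all names are illustrative):

```shell
# What 'reload UserData' does underneath: pull remote changes, then push local ones.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare userdata.git          # plays the role of the GitHub repo

git clone -q userdata.git server 2>/dev/null   # the TM server's UserData clone
git -C server config user.email tm@example.com
git -C server config user.name tm
(cd server && echo admin > admin.xml && git add . \
    && git commit -qm 'default admin user' && git push -q origin HEAD)

git clone -q userdata.git other          # a second instance creates a new user
git -C other config user.email tm@example.com
git -C other config user.name tm
(cd other && echo dinis > dinis.xml && git add . \
    && git commit -qm 'new user: dinis' && git push -q origin HEAD)

# the server's reload: pull (fetch the new user) then push (publish local edits)
cd server
br=$(git symbolic-ref --short HEAD)
git pull -q origin "$br"
git push -q origin "$br"
cd .. && ls server                       # admin.xml and dinis.xml
```

The pull is what makes users created elsewhere appear locally, and the push is what publishes the activity logs and edits made on this instance.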

Now log out the admin user and log in as the new dinis user:

… reload the UserData objects:

… and note in UserData GitHub’s repo that there are a number of new commits:

For example, here is the ‘Logout user activity log’ for the admin user:

… and here is the UserActivity for the dinis user:

Step 6: Add a Library from a GitHub Repo

In order to make this a working server, we need to update the SecretData config file:

… with the location of the Library GitHub repo (see Using TeamMentor 3.4 TBot admin pages to load and sync a Library hosted on GitHub for a detailed explanation of its origin)

Next reload the Cache:

On completion you should see a message with the number of Libraries and Articles in the current server:

And reloading TeamMentor will show the imported Library fully loaded and ready to be used:

Adding files to TeamMentor’s web root via a UserData folder (synced with GitHub)

This post shows how to add custom files to TeamMentor’s webroot using a special feature of the TeamMentor UserData folder.

In this demo I’m going to use the UserData set up in the previous post (currently synchronised with a GitHub repo).

Basically, we are going to edit a file in GitHub, which will end up in the root of the associated TeamMentor website (which is quite a powerful PoC and bug-fixing feature).

First step is to go to the synced GitHub repo (created here) and click the Create a new file here button in GitHub’s UI:

… the next page allows us to define a folder name (which needs to be WebRoot_Files if we want these files to be copied to the webroot of the current TeamMentor application):

… and a file name (which can be anything):

… the file contents are added using the GitHub’s text editor:

… and saved using the Commit New File button:

Here is the file added to the GitHub’s UserData repo:

Here is the commit (created by GitHub)

Back in TeamMentor, if we click on the Reload UserData

A server-side git pull (on the TeamMentor server) will occur, and the file added in the GitHub UI is now also present in the local TeamMentor UserData folder:

Note: in the next version of TM, the Git messages are much better (for example they will show the names of the files affected by a pull)

Here is a simple C# script that confirms that the file is already in the local UserData folder (this script was executed in the C# REPL that is part of TeamMentor’s admin features)

But, at this stage, if we try to open the hello.txt file in a browser, we will see that it doesn’t (yet) exist:

The reason is that the logic that checks for the existence of WebRoot_Files in the current UserData folder is only executed on server startup or cache reload.

The best solution is to go to TBot’s Reload Server Objects page and click on the Reload Cache button:

And once that is done, the hello.txt file will now exist in the TeamMentor’s root folder:

Just to confirm that all is ok, let’s try renaming that file in GitHub:

… to helloAgain.txt

… saving it:

… checking that the update commit is there:

… reloading the TeamMentor’s cache:

… and finally confirming that the file has been updated in the live TM server.

Note: Doing this for a *.txt file is not that interesting.

Where this technique really shows its power is when we create *.aspx server-side pages, *.html client-side pages, *.ashx handlers or *.cshtml Razor pages (the last of these need to be placed inside a special TBot folder).

I will show how this works in one of my next blog posts.

13. February 2014

  • Reverting changes made to TeamMentor articles

Reverting changes made to TeamMentor articles

The problem was simple: there were a number of commits made to a TeamMentor GitHub repo that I wanted to completely revert (without rewriting history).

For reference, this happened when I was doing some ‘Link fixing’ tests on a server that was configured to auto-commit to GitHub (which meant that the option of a pure git reset --hard was not available, since it would break the TM server).

In this case, the last good commit was e794cc839689dfc7915099d39972abde643a969d and the last bad commit was c53002083e85673f9a4dd7e6dbd2a37bc7ff9e2f (currently HEAD of master)

My first idea was to just do a git revert to e794cc839689dfc7915099d39972abde643a969d, which worked OK locally.

But I struggled to merge it with the master HEAD, because git was being too clever: it realised that these two commits were compatible and just fast-forwarded to the most recent one (vs doing a ‘reverse merge’).

Based on this answer from SO’s Revert multiple git commits question, the solution was to:

a) clone the target repo and create a test branch

$ git clone
$ cd Lib.NET_2.0.git
$ git checkout -b mergeTest

b) do a git reset hard into where I wanted to go:

$ git reset --hard e794cc839689dfc7915099d39972abde643a969d

c) then do a git reset --soft into the current master

$ git reset --soft a8b755098884173a8f6eced1faddefc0c34a987e

d) use the gitk tool to confirm that the local changes (about to be committed) exist after the current HEAD commit of the mergeTest branch

$ gitk

e) commit the changes

$ git commit -m 'reverting back to e794cc839689dfc7915099d39972abde643a969d commit'

f) checkout the master branch and merge it with the mergeTest branch

$ git checkout master

$ git merge mergeTest

Updating c530020..a2b3297
Fast-forward
 Attack/2d684518-be94-4454-8d3b-e57f025b0083.xml | 206 +++++++----
 Attack/36208a74-52f2-4a48-9ecf-4d032d845f2b.xml | 162 ++++++---
 Attack/4c053210-1a24-44c2-a3f9-f0cf5008eb3f.xml | 138 ++++---
 Attack/5af411c1-4606-4f7e-920c-186af71436c5.xml | 130 ++++---
 Attack/6ce03806-c25d-4dbc-8df0-3343085d31d0.xml |  74 ++--
 Attack/7dfbe9c9-481e-46d2-b1a5-9a776578d6c2.xml | 137 ++++---
 Attack/adf5df06-2b67-4e2a-ace2-6d7060e0bd95.xml | 209 ++++-----
 Attack/d4b48303-d535-4549-90fc-474b99eff901.xml | 180 +++++---
 Attack/dcf4e714-d7e2-4c7d-8609-6ab5bd309476.xml | 134 ++++---
 9 files changed, 734 insertions(+), 636 deletions(-)

g) push to GitHub

$ git push origin master:master

h) in the TM website, trigger a cache reload (which will also do a git pull from GitHub); opening the DebugInfo page will also show a Git Pull message

i) finally, to confirm that all is really the way it should be, I opened up the Pull Request page for the affected repo, and there are now 5 commits, but with 0 files changed

Note how in the screenshot above, the last commit is the one created during this blog post (which reverts the other 4).

There is probably a better way to do this, but the solution described above was the one that made more sense to me (and the one that worked :) )
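For reference, the reset --hard / reset --soft sequence described above can be replayed end to end in a throwaway repo (hashes, file names and messages below are illustrative; the clone/push steps are omitted):

```shell
# Revert several commits without rewriting history, via reset --hard + --soft.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q repo && cd repo
git config user.email you@example.com && git config user.name you
echo v1 > Article.xml && git add . && git commit -qm 'last good commit'
good=$(git rev-parse HEAD)
echo v2 > Article.xml && git commit -qam 'bad commit 1'
echo v3 > Article.xml && git commit -qam 'bad commit 2'
bad=$(git rev-parse HEAD)

git reset -q --hard "$good"   # working tree and index now match the good commit
git reset -q --soft "$bad"    # HEAD back on the bad tip; the 'undo' is staged
git commit -qm "reverting back to $good"

cat Article.xml                                          # v1: content restored
git merge-base --is-ancestor "$bad" HEAD && echo 'history preserved'
```

The trick is that --hard restores the files while --soft moves only HEAD, so the difference between the bad tip and the good content ends up staged, and one ordinary commit on top of the bad tip undoes everything without touching the existing history.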