Wednesday, May 13, 2015

Beware where you are POSTing!

Recently I had the pleasure of working with Highcharts, a JavaScript library for creating dynamic charts. Recommended!

The client also wanted the ability to download the data used in the charts as a CSV file. A quick browse through the documentation showed that Highcharts supports this scenario. There are multiple ways to do it, but the one I've seen most often involves POSTing your chart data to a page that resides within the Highcharts domain.

That csv.php page does little more than echo the posted data back with headers that turn the response into a download.
This means that if you use this construction, all your chart data is passed to a page that is under Highcharts' control. To be clear: I'm not claiming that Highcharts will do anything malicious with your data!
On the contrary, Highcharts even advises in their documentation that you should create your own page if you don't want to expose your data. Not to mention that they explicitly tell you the page could disappear at any moment.

However, a quick search on GitHub showed that a number of projects are still using the Highcharts csv.php page, meaning that all their data is posted to another party (over plain HTTP as well).

So kids, whenever you start POSTing data to a third party, ask yourself whether you mind that this data is now potentially public. And create your own page to handle that download.

In ASP.NET MVC it is as simple as creating the following controller method:
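A minimal sketch of such an action; the controller, action, and parameter names are illustrative and should match whatever your client-side export code actually POSTs:

```csharp
using System.Text;
using System.Web.Mvc;

public class ChartExportController : Controller
{
    // Receives the CSV string that the client-side export code POSTs
    // and returns it as a file download.
    [HttpPost]
    public ActionResult DownloadCsv(string csv)
    {
        // The File result sets Content-Disposition: attachment, so the
        // browser offers a download instead of rendering the response.
        return File(Encoding.UTF8.GetBytes(csv), "text/csv", "chartdata.csv");
    }
}
```

Point the Highcharts export URL at this action and the data never leaves your own domain.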

Small tip: use QueueBackgroundWorkItem for asynchronous work in ASP.NET

This is a small tip that I'm mainly publishing as a reminder to myself, but it could come in handy for someone else.

Background processing tasks in ASP.NET are hard: at any time IIS could decide to recycle the application pool. The usual solution is to farm such tasks out to a queue (for example an Azure queue) and let some other machine (for example an Azure worker role) process that queue.

However, with ASP.NET 4.5.2 Microsoft introduced the QueueBackgroundWorkItem method. This makes it possible to run small background tasks within the application lifecycle: the runtime is aware of them and tries not to tear down the app domain while they run.

See the following (extremely simple) example:
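A minimal sketch; the helper class is illustrative and the work itself is simulated with a delay:

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;
using System.Web.Hosting;

public static class CleanupJobs
{
    // Queues a fire-and-forget task; the ASP.NET runtime tracks it
    // and tries to delay app pool recycling while it runs.
    public static void PurgeOldRecords()
    {
        HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
        {
            // Simulated work. Honor the token: it is signalled when
            // the app domain is about to shut down.
            await Task.Delay(TimeSpan.FromSeconds(5), cancellationToken);
            Trace.WriteLine("Old records purged.");
        });
    }
}
```

Note the caveats below, though: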

  • A task started this way will only delay recycling of the app pool for 30 seconds, so you need to complete your work within that window. If not, the task will be killed.
  • You need ASP.NET 4.5.2. 

See the following links for more detail:

Wednesday, April 17, 2013

Using NUnit for your tests in Team Foundation Service with a Git repository

For quite some time now it has been possible to create a new project in Team Foundation Service (the cloud variant, not to be confused with Team Foundation Server) using Git for your version control. Since I'm an avid Git user, I decided to see how the newly released Git tooling works within Visual Studio and how easy it would be to migrate a project I host on GitHub to Team Foundation Service.

I cloned the repository that I'm currently hosting on GitHub. This project already contains an MSBuild file for building the entire project. It also contains some unit tests, written using NUnit.
Cloning the new project using Git
  • Great, now I have an empty local repository. Time to fill it. Using the command line I went to the location of the new, empty repository. Then I simply used Git to retrieve all changes from my 'old' local GitHub repository with the command:

    git pull <location of local github repository> master --force

  • After that, it is simply a matter of pushing your changes to TFS.
Pushing the changes back to TFS.
  • Okay, now I had a filled repository on TFS. The next step was to create a build definition. I created a new definition and pointed it at the build.proj MSBuild file in the root of my repository. Queue a new build and some moments later:
Build result
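For reference, the Git side of the migration above boils down to two commands, run from inside the empty TFS clone (the path to the old clone is illustrative):

```shell
# Run inside the fresh, empty clone of the TFS Git repository;
# ../github-project is the old local GitHub clone.
git pull ../github-project master --force   # bring over the full history
git push origin master                      # publish it to TFS
```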

Tadaa! Using my own MSBuild file from the repository in the build service of Team Foundation Service resulted in a valid build. However, do you notice that warning icon at the bottom? It says: "No test is available in C:\a\bin\UnitTests.dll. Make sure that installed test discoverers & executors, platform & framework settings are appropriate and try again".
So it looks like my tests aren't being run.

I'm using NUnit for my unit tests since I prefer it over the default MSTest framework. Team Foundation Service only supports MSTest out of the box. However, Microsoft did something cool: they made it possible to use multiple testing frameworks in Visual Studio 2012.

A testing framework only has to implement an adapter interface and then Visual Studio 2012 can use it. The adapter for NUnit can be found here; it is an extension for Visual Studio 2012.

Okay, so that gives me local support in Visual Studio to use NUnit. Now I want to add that testing functionality to the build server on my Team Foundation Service.

Luckily there is a way. You can configure the build controller to load additional assemblies to use. Simply use the following instructions to add the correct assemblies to the build controller.

One warning: the dialog on the build controller that asks you to specify the location of the custom assemblies only lets you retrieve those assemblies from a TFS repository. So Git is not yet supported in that respect.

Simply create a new, empty project that uses TFS for version control and add the assemblies to a location in it. Then configure your build controller to use those assemblies. My configuration looks like this:

After that, you can run the build again and now TFS will start using NUnit for the tests in your automated build. The end result:

Note that this technique also works with other testing frameworks like xUnit.

Hope this helps someone. Happy hacking!

Tuesday, February 19, 2013

How to buy a Google Nexus 10 abroad

(This post was originally written in Dutch, since it mainly applies to Dutch readers.)

I had set my sights on a Google Nexus 10 for quite a while. However, Google does not sell the Nexus 10 in the Netherlands. Apparently the Netherlands is not that important to Google; services like Google Music are not (yet) available in our little country either.

So what to do? Buy it abroad! The Nexus 10 is available in Germany and in the UK. I could drive to Germany and try to score a Nexus 10 there. Unfortunately, I heard from several people (and from some searching on the internet) that practically no stores have the Nexus 10 in stock.

Through Google Play you can order directly from Google. The downside is that, as a Dutch customer, you won't find the Nexus 10 there.... unless you follow the instructions below.

You will need:

  1. A foreign delivery address your order can be shipped to. I chose to order in the UK and used Forward2me.
  2. An IP address that makes Google think you are in the UK. Various VPN services work fine for this; I used BestUKVPN.
  3. A credit card.
  4. Your own Google account.
What do you need to do?
  1. Log in to the VPN.
  2. Log in to your Google Play account. Google now thinks you are in the UK, so you can order your Nexus 10.
  3. Enter the address you created with Forward2me as the delivery address.
  4. When it is time to pay, enter your credit card details. Make sure to use the Forward2me address as the billing address for your credit card as well.
  5. Complete the order.
After that it is a matter of waiting. In my case it took about a week before the Nexus 10 was shipped to Forward2me and forwarded from there to the Netherlands.

Thursday, November 22, 2012

Using Dropbox or Google Drive as a backup for your Git repository

I’m currently working on a small hobby project in my spare time. I’m using Git as my DVCS. Most of the development is done on my laptop. And I would like to have a backup of my repository in a secure, remote location.

Sure, I could use Github to host my repository (and I plan to publish it there in the future), but for now my code is not yet stable enough to show it to the rest of the world. I’m a strong believer that you should have something useful before showing it to the rest of the world. I dislike open source projects that do not work out of the proverbial box.

Dropbox or Google Drive can help with that. When you install the client an extra folder is created on your system. All files you copy to this folder will be synced to Dropbox/Google.

This, in combination with Git, makes for a perfect, cheap backup solution. I started by cloning my Git repository into the Dropbox folder.

Here I created a bare clone (i.e. without a working directory) of my Git repository in the Dropbox folder. Dropbox will now automatically start syncing it. Next, I defined this new clone as a remote for my local repository, under the name 'dropbox'.
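Concretely, those two steps look like this (the paths are illustrative):

```shell
# Create a bare clone (no working directory) inside the Dropbox folder...
git clone --bare ~/projects/myproject ~/Dropbox/myproject.git

# ...and register it as a remote named 'dropbox' in the normal repository.
cd ~/projects/myproject
git remote add dropbox ~/Dropbox/myproject.git
```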

Now, whenever I’m done working in my local repository I just do:
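That is, with the 'dropbox' remote defined earlier:

```shell
git push dropbox master
```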

This pushes all my changes in master to the remote repository in the Dropbox folder. Dropbox will automatically sync it to my account. I can install the Dropbox client on any other computer, retrieve the repository from Dropbox, clone it and start hacking.

Beware, however, that this is not a completely foolproof backup solution. You could get into trouble if you push to Dropbox and switch off your computer before Dropbox has had a chance to sync everything. If you then retrieve the Dropbox repository on a new computer, you will probably end up with a corrupted Git repository.

However it works for me as a cheap extra back-up.

Tuesday, May 8, 2012

Why you should use git merge --no-ff when rebasing

When working with Git, my normal workflow is to create my feature or bugfix in a branch. When I'm ready to integrate it, I rebase the branch onto master and merge it into master. This ensures that the master branch has a linear history.

This gives you a clean history which is easy to bisect.
An example of linear history
However, when reading the help for git merge I came across the --no-ff parameter. It says:
Create a merge commit even when the merge resolves as a fast-forward.
I decided to use it for my 0.2 version of my software. This created the following graph in gitk:
Branch rebased and merged with git merge --no-ff
This code was first rebased, then merged with git merge --no-ff. An explicit merge commit is now created (where the tag 0.2 points to). Because the branch was already rebased, this merge commit does not contain a diff - it only exists to merge the two branches together.
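A sketch of that workflow, with an illustrative branch name:

```shell
# Replay the feature branch on top of the current master...
git checkout feature
git rebase master

# ...then merge it back with an explicit merge commit, even though
# a fast-forward would now be possible.
git checkout master
git merge --no-ff feature -m "Merge branch 'feature'"
git tag 0.2
```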

This gives you several advantages on top of the advantages you have using git rebase:
  1. You can see the name of the branch you merged back into master.
    The name is part of the commit message of the merge commit.
  2. In the graph you can easily see what commits were part of the branch. 
    This comes in handy when you later want to see what individual commits were needed to create a specific feature branch.
  3. You are now able to revert the entire branch by reverting the merge commit with git revert.
    Beware that the git revert command needs to know that you are reverting a merge. In the example above I would revert it with 'git revert -m 1 0.2'. See the git man page for details.
Enough advantages for me to keep using the rebase workflow, but to finish it with an explicit merge commit.