So if you haven't heard yet, VSO extensions are now in a private preview, and you can sign up to get into the preview on the extensions integration site. These extensions are, in the shortest sentence, a supported way of customizing VSO that will replace any of the "hacky" extensions you may be playing around with at the moment, like Tiago Pascal's Task Board Enhancer, or maybe you have even created your own following steps similar to what I show in my TFS 2013 Customization book.

This post aims to give you a super quick guide on how to get started; you will need to go through the integrations site to really get into the detail. It covers most of what you will find in other posts, but gives you a little something extra that most posts wouldn't have, like tips on free stuff Smile

File, New Project

The easiest way to get something basic running in VSO is to just create a new project.

Create/Configure Project

We are going to create a new TypeScript project


You should now have something like the below


Configure SSL in IIS Express

With the VSO Time Ticker project selected, head over to the Properties window


Change SSL Enabled to True


Take note of the SSL Url that is now available to you.

Add an extensions.json

Let's add an extensions.json manifest file that will be used to tell VSO what our project is actually about


and drop in the content below, replacing the baseUri property to include the port you have been assigned for SSL for the project.

{
  "namespace": "VSO-Time-Ticker",
  "version": "0.0.1",
  "name": "Time Ticker",
  "description": "A simple extension for Visual Studio Online of a Time Ticker",
  "provider": {
    "name": "Gordon Beeming"
  },
  "baseUri": "https://localhost:44300/",
  "icon": "https://localhost:44300/images/some-icon.png",
  "contributions": {
    "vss.web#hubs": [
      {
        "id": "time",
        "name": "Time",
        "groupId": "home",
        "order": 22,
        "uri": "index.html",
        "usesSdk": true,
        "fullPage": false
      }
    ]
  }
}

Get the SDK

Navigate to the samples project on GitHub and grab the VSS.SDK.js file. Save a copy of it to a Scripts folder inside an sdk folder and add it to your project.


Include our App js files

While we're here, let's build the project, show hidden files and add the app.js and app.js.map files to the project

If you are using source control, at this point you should also undo the add of those files and then exclude them from source control, otherwise you may get a weird error when it comes time to build your project on a build server (TypeScript : Emit Error: Write to file failed...).
The reason we want these as part of the solution is so that when we do a web deploy later they are deployed as well Smile.
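The exclusion itself is just a couple of ignore entries; for TFVC they go in a .tfignore file at the solution root, and for Git in a .gitignore (paths here assume the compiler output lands next to App.ts, so adjust them to your layout):

```
app.js
app.js.map
```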

Add our app icon

Make an images folder and add an image called some-icon.png to it

Move App js file

Move your App.ts, App.js and App.js.map into a scripts folder. If you are using source control you might need to re-undo and ignore those extra files.


Setup index.html

This is a rather simple step: replace the reference to app.js with one to sdk/Scripts/VSS.SDK.js, so it will look something like


Add the following script just inside your body tag

<script type="text/javascript">
    // Initialize the VSS sdk
    VSS.init({
        setupModuleLoader: true,
        moduleLoaderConfig: {
            paths: {
                "Scripts": "scripts"
            }
        }
    });

    // Wait for the SDK to be initialized
    VSS.ready(function () {
        require(["Scripts/app"], function (app) { });
    });
</script>
So at this stage your full index.html page will look like
<!DOCTYPE html>

<html lang="en">
<head>
    <meta charset="utf-8" />
    <title>TypeScript HTML App</title>
    <link rel="stylesheet" href="app.css" type="text/css" />
    <script src="sdk/Scripts/VSS.SDK.js"></script>
    <script type="text/javascript">
        // Initialize the VSS sdk
        VSS.init({
            setupModuleLoader: true,
            moduleLoaderConfig: {
                paths: {
                    "Scripts": "scripts"
                }
            }
        });

        // Wait for the SDK to be initialized
        VSS.ready(function () {
            require(["Scripts/app"], function (app) { });
        });
    </script>
</head>
<body>
    <h1>TypeScript HTML App</h1>

    <div id="content"></div>
</body>
</html>

Update App.ts

In your App.ts file remove the window.onload function and replace it with its body, so your App.ts file will look like the below

class Greeter {
    element: HTMLElement;
    span: HTMLElement;
    timerToken: number;

    constructor(element: HTMLElement) {
        this.element = element;
        this.element.innerHTML += "The time is: ";
        this.span = document.createElement('span');
        this.element.appendChild(this.span);
        this.span.innerText = new Date().toUTCString();
    }

    start() {
        this.timerToken = setInterval(() => this.span.innerHTML = new Date().toUTCString(), 500);
    }

    stop() {
        clearTimeout(this.timerToken);
    }
}

var el = document.getElementById('content');
var greeter = new Greeter(el);
greeter.start();
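For clarity, the update that fires every 500 ms is nothing more than re-formatting the current time and pushing it into the span; the same logic in plain JavaScript (illustrative only, with a fixed instant so the output is predictable):

```javascript
// What each tick does: format a Date the same way the Greeter does
function formatTick(date) {
    return date.toUTCString();
}

console.log(formatTick(new Date(Date.UTC(2015, 6, 1, 10, 0, 0))));
// → Wed, 01 Jul 2015 10:00:00 GMT
```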

Run App

Running your app with Ctrl + F5, you will get a blank app that does nothing Smile


Change the url to point to the SSL version of your site just to make sure everything is working


Our App is now complete Open-mouthed smile

Install your extension

If you have signed up for the private preview you should see a tab in the admin section of your account called Extensions like so


Click Install, and then Browse 

browse for your extensions.json file

Click Open and then OK


Your extension is now installed


View it on VSO

Go to a team project home page and you should now see a Time hub, click on it


Once you land here you will see the time Smile


That's 1 extension in the bag, but having this run on your local machine is probably not what you would want, because nobody else can see it.

Publishing your app

You could buy an SSL certificate, but that costs a lot and most people don't have that kind of money lying around for fun apps and extensions, so we'll turn to Azure. We will now right-click on our project and click Publish


If you have set up an Azure site already you can import the publish settings, but I haven't, so I'm going to click on Microsoft Azure Web Apps


and then click on New (again if you have a site already you can select it in this list)


Select a name and click Create


It will now take a little bit to set up your Azure resource


and then auto-magically configure everything you need Smile, click Publish


After the publish has finished your site will launch


Something that you will notice is that this is http and not https, which as we said earlier we require. So let's see what happens if we add an s in there Smile


Everything still works Open-mouthed smile.

Last bit of manifest changes

Now that we have a publicly accessible website running over https (for FREE), we can take that url and replace what we currently have in our manifest, so it will now look like this

{
  "namespace": "VSO-Time-Ticker",
  "version": "0.0.2",
  "name": "Time Ticker",
  "description": "A simple extension for Visual Studio Online of a Time Ticker",
  "provider": {
    "name": "Gordon Beeming"
  },
  "baseUri": "https://vso-hello-world.azurewebsites.net/",
  "icon": "https://vso-hello-world.azurewebsites.net/images/some-icon.png",
  "contributions": {
    "vss.web#hubs": [
      {
        "id": "time",
        "name": "Time",
        "groupId": "home",
        "order": 22,
        "uri": "index.html",
        "usesSdk": true,
        "fullPage": false
      }
    ]
  }
}

Re-install your extension


and refresh your extension in VSO


You will notice that it obviously still works Smile. If you close Visual Studio and it still works, you know it's working Smile, and I suppose you can check Fiddler to see where it's reading the files from.


For more info on VSO Extensions visit http://aka.ms/vsoextensions.

A pretty neat getting started post is also on that site at https://www.visualstudio.com/en-us/integrate/extensions/get-started/visual-studio.

Microsoft has a project out on GitHub as well that is quite advanced in the APIs that it uses; it can be found at https://github.com/Microsoft/vso-team-calendar.

If you want a light overview of everything, you can get the VSO Extension Samples on GitHub as well using the link https://github.com/Microsoft/vso-extension-samples.

Complete sample code for this post is also out on GitHub at https://github.com/Gordon-Beeming/VSO-Time-Ticker.


In the new Azure Portal you create all your resources in Resource Groups. The Azure SDK also ships a module called AzureResourceManager; by default the module loaded by the Azure SDK is AzureServiceManagement. A blurb from one of the Azure documentation pages reads

"The Azure and Azure Resource Manager modules are not designed to be used in the same Windows PowerShell session. To make it easy to switch between them, we have added a new cmdlet, Switch-AzureMode, to the Azure Profile module."

Azure Resource Manager Commands

To get a list of all the commands that are available in the AzureResourceManager module you can run the command

Get-Command -Module AzureResourceManager | Get-Help | Format-Table Name, Synopsis

This will return each cmdlet together with its synopsis. The cmdlet name column of the output hasn't survived here, but the synopses give a good feel for the breadth of the module:

  • Accounts and profile: adding and removing Azure accounts, downloading and importing publish settings files, creating, changing and deleting Azure environments, clearing the profile, and getting, setting or removing subscriptions.
  • Azure SQL auditing: getting and setting database and server auditing policies, enabling or disabling direct access to audited databases, and marking a database as using its server's auditing policy.
  • Azure Data Factory: creating, getting and removing data factories, gateways, gateway keys, hubs, linked services, pipelines and tables; setting the active period and status of data slices; getting slice runs; suspending and resuming pipelines; encrypting sensitive data; and saving HDInsight log files.
  • Redis cache: creating, getting, updating and removing caches, and getting or regenerating access keys.
  • Resources and resource groups: creating, getting, changing and removing resources and resource groups; adding, cancelling and validating deployments; getting deployment logs; getting and saving gallery templates; and getting the resource types with the data center locations that support them.
  • Active Directory and roles: filtering users, groups, group members and service principals; creating and removing role assignments; and filtering role definitions.
  • Tags: creating, getting and removing predefined Azure tags and their values.
  • And Switch-AzureMode itself, which switches between the Azure and Azure Resource Manager modules.

Step 0

As a step 0, let's open up everything we need.

Azure SDK

So to get started you need to install the Azure SDK, which you can get from the SDK downloads page. I am using Azure SDK 2.5 for this post.

PowerShell ISE

Open the PowerShell ISE using Win + R and then %WINDIR%\system32\WindowsPowerShell\v1.0\powershell_ise.exe. You can also use the standard PowerShell window if you want.

Azure Management Portal

Open and sign in to the Azure Management Portal (https://manage.windowsazure.com/)

Azure Portal

Open and sign in to the Azure Portal (http://portal.azure.com/)

Step 1

Let's start by getting the Azure Management Portal pieces out of the way. In the Azure Management Portal we will just be creating a new AAD user that we can use to log in automatically through PowerShell. If you want to use an MSA instead, leave the credentials bit off in Step 3 and you will receive a prompt for credentials, at which time you can use MSA or AAD credentials, and you can move straight on to Step 2. If you want to create the new user, follow my other post Creating a new Azure Active Directory User to create a user for this demo.

Step 2

In the PowerShell ISE we will kick off by switching the Azure SDK to use the resource manager module.

Switch-AzureMode -Name AzureResourceManager
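A note on Switch-AzureMode: it only affects the current session, and the same cmdlet switches you back whenever you need the service management cmdlets again:

```powershell
# Switch back to the default service management module for this session
Switch-AzureMode -Name AzureServiceManagement
```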

Step 3

After we have switched to the AzureResourceManager module we are able to use the commands that are part of it. Let's start off by adding the Azure account we just created using the snippet below (I keep my username and password in txt files and reference them from multiple sample scripts for ease of use, but you can place them straight in the script if you want).

[string]$currentUsername = Get-Content "Z:\_PowerShell\Azure\currentUser.txt"
[string]$currentPassword = Get-Content "Z:\_PowerShell\Azure\currentPass.txt"
$secpasswd = ConvertTo-SecureString $currentPassword -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($currentUsername, $secpasswd)
Add-AzureAccount -credential $mycreds


This will then return the account that was added


At this point if you ran Get-AzureAccount it will show you this account and any others you have previously added.


Step 4

Next we'll explore some metadata for Step 5.

Getting a Resource Group Template

We can run the command


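The screenshot that showed the command hasn't survived here; in the AzureResourceManager module it would presumably have been:

```powershell
# Lists the resource group templates available in the gallery
Get-AzureResourceGroupGalleryTemplate
```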
and it will return a list of all the current resource group templates that we can create. For the list at the time of writing this post you can refer to one of my GitHub Gists at https://binary-stuff.com/gist/ea0884f5ba00c62a83e4. We are going to be using one of the templates found around line 2220, which is the website and SQL database template.


Get a list of Azure Resource Locations

When creating Azure resources you need to specify where those resources are located geographically. Now you could guess or Bing what the locations are, or you could use the handy command


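The command itself was in a screenshot; presumably it was Get-AzureLocation:

```powershell
# Lists each resource type with the data center locations that support it
Get-AzureLocation
```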
which will return a list of every location that every resource is available in. This list, like the one above, can be found as one of my GitHub Gists at https://binary-stuff.com/gist/87aad8bfbbbcd7421b83. From the long list of locations we are just going to use West Europe for all our resources.

Step 5

At this point we have all the info we need to create our resources using the resource manager in PowerShell, so we'll use the snippet

$ResourceManagerTest = "ResourceManagerTest"
$AzureDataCenterLocation = "West Europe"
$administratorLoginPassword = ConvertTo-SecureString "$($ResourceManagerTest)DbP@ssw0rd" -AsPlainText -Force
New-AzureResourceGroup -Name "$($ResourceManagerTest)" `
-Location "$AzureDataCenterLocation" `
-GalleryTemplateIdentity Microsoft.WebSiteSQLDatabase.0.2.2-preview `
-siteName "$($ResourceManagerTest)Site" `
-hostingPlanName "$($ResourceManagerTest)Plan" `
-siteLocation "$AzureDataCenterLocation" `
-serverName "$($ResourceManagerTest.ToLowerInvariant())dbserver" `
-serverLocation "$AzureDataCenterLocation" `
-administratorLogin "$($ResourceManagerTest)DbLogin" `
-administratorLoginPassword $administratorLoginPassword `
-databaseName "$($ResourceManagerTest)DbName" -Verbose

This will then go off and create our website and database, along with an Application Insights resource that we can use for our website when we deploy it. When the command finishes you should see output similar to the below, which, because we specified the -Verbose flag, tells us the status of each resource created


At this point we are really finished with the subject of this blog post, but we'll continue on to explore what we have just created, how we would have had to create it in the Azure Portal, and how we can remove the resource group using PowerShell.

Step 6

Open the Azure Portal. The first thing you should notice is that you already have a notification and when you open that it says that a deployment has succeeded.


Clicking on the success notification will open that resource group


From here you can use the resources 100% as if you created them in the portal.

Step 7

Before we see how we would have had to create those resources manually, let's remove the resource group from our subscription. To do this we run the simple command below, keeping in mind that the $ResourceManagerTest variable should still be set from the previous command in Step 5

Remove-AzureResourceGroup -Name $ResourceManagerTest -Force -Verbose

This will then proceed to remove the resource group; again, because of the -Verbose flag we don't get much info, just that it's deleting the resource group, and then it comes back when it's done.


Step 8

The last thing that I'll show in this post is how we would have had to do this manually (or at least where to find the template). Back in the Azure Portal click New in the bottom left corner and then click on Everything


Next click on the Web category/section and you'll see the Website + SQL option, which is what we created


Clicking on that option will give you a little info about the template, and from here you'd just click on Create


From here you will configure the Resource Group Name, all the settings for your Website resource and SQL resource and then click create.


When that process completes you will be in the same place as that small PowerShell script gets you to Smile

Thoughts and comments

If you have any thoughts or comments about any of the pieces of this post please do share below.


If you've been using Application Insights for a while you will have noticed that with recent Visual Studio updates the Application Insights SDK has been updated to a new version that logs AI data into the new Azure Portal. If you build Windows Store apps you will probably also have noticed that if you instrument a Windows Store application with the new SDK you can't actually find your data anywhere, although you can see it being logged in the Visual Studio Output window and with Fiddler. The good news is that you will soon have as great an experience with the new AI SDK for Windows Store apps as you currently do for your web applications; the better news is that below you'll see how to use the new SDK for your Store apps and see the data in the Azure Portal.

Create a new AI Resource

First off head over to the Azure Portal and create a new Application Insights Resource


Give it a name and choose a Resource Group; it's fine that the Application Type is set to ASP.NET web application.


After the resource has been created you are ready to add AI to your store app in Visual Studio.

Adding AI to your Windows Store App

Open your store app in Visual Studio and make sure you have the latest AI SDK installed (2.4 at the moment) and no other AI artifacts left in your project. Right-click on the store app solution and click on Add Application Insights Telemetry...


At this point, if you haven't signed in, you will be prompted to sign in; click the Gain insights into your application now button


After you have signed in you will be presented with the Add Application Insights to Project window


On this window click on Use advanced mode


which will now ask you for all your AI settings


We'll get all of these from the portal. Open your AI resource in the portal and click on Properties


This will show you your Subscription ID, Resource Group, Application Insights Resource (name) and Instrumentation Key. Copy these and place them in the window. Once you have filled in all of those fields and clicked Add Application Insights To Project, you are ready to run your app and see some data.


For now it's good enough to see what's happening in your app, and some insight is infinitely better than no insight =)


I used to make a lot of TFS customizations and had to apply the template changes to multiple team projects, which took a bit of time. Depending on the method you use it could be a quick or loooong process Smile. When I first started doing customizations I used the TFS Power Tools to upload changes, which is a lot of effort because you are uploading one work item definition at a time into one team project.

Using Command Line

After a while I started using the command line (witadmin importwitd); this was slightly faster, but I found myself keeping a list of commands in a txt file, then searching for the one I needed and running it.
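For reference, a single work item type import with witadmin looks something like the below (the collection URL, project name and file path are placeholders):

```shell
witadmin importwitd /collection:http://myserver:8080/tfs/DefaultCollection /p:MyProject /f:"Task.xml"
```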

A Basic PowerShell Script

I follow Martin Hinshelwood on various social media, and one day he posted a blog post titled Upgrading to Visual Studio Scrum 3.0 process template in TFS 2013. Although I had by this point been playing a lot with upgrading from TFS 2012 to TFS 2013, there was one piece of magic in that post that changed the way I applied process template changes up until today: a script that simply looped through the work item definitions in a set folder and imported them into TFS

param(
    [string] $CollectionUrlParam = $(Read-Host -prompt "Collection (enter to pick):"),
    [string] $TeamProjectName = $(Read-Host -prompt "Team Project:"),
    [string] $ProcessTemplateRoot = $(Read-Host -prompt "Process Template Folder:")
)

$TeamProjectName = "teamswithareas"
$ProcessTemplateRoot = "C:\Users\mrhinsh\Desktop\TfsProcessTemplates\Microsoft Visual Studio Scrum 3.0 - Preview"
$CollectionUrl = "http://kraken:8080/tfs/tfs01"

$TFSConfig = "${env:ProgramFiles}\Microsoft Team Foundation Server 11.0\Tools\TFSConfig.exe"
$WitAdmin = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 12.0\Common7\IDE\witadmin.exe"

$witds = Get-ChildItem "$ProcessTemplateRoot\WorkItem Tracking\TypeDefinitions"

foreach ($witd in $witds)
{
    Write-Host "Importing $witd"
    & $WitAdmin importwitd /collection:$CollectionUrl /p:$TeamProjectName /f:$($witd.FullName)
}
& $WitAdmin importcategories /collection:$CollectionUrl /p:$TeamProjectName /f:"$ProcessTemplateRoot\WorkItem Tracking\Categories.xml"
& $WitAdmin importprocessconfig /collection:$CollectionUrl /p:$TeamProjectName /f:"$ProcessTemplateRoot\WorkItem Tracking\Process\ProcessConfiguration.xml"

Small Script Evolution

This worked for a while, but I still had to keep a couple of PowerShell files for the different projects I wanted to import the process templates into. Over the next while I ended up adding a couple of additions to the script, like publishing new global lists

#if there is a file with the name GlobalLists-ForImport.xml import it as Global List info for the current collection
if (Test-Path "$ProcessTemplateRoot\GlobalLists-ForImport.xml")
{
    Write-Host "Importing GlobalLists-ForImport.xml"
    & $WitAdmin importgloballist /collection:$CollectionUrl /f:"$ProcessTemplateRoot\GlobalLists-ForImport.xml"
}

and importing link types

#import each Link Type for the $CollectionName
foreach($witd_LinkType in $witd_LinkTypes)
{
    Write-Host "Importing $($witd_LinkType.Name)"
    & $WitAdmin importlinktype /collection:$CollectionUrl /f:$($witd_LinkType.FullName)
}

ALM Rangers - vsarUpgradeGuide & vsarSAFe


The first project I joined after joining the ALM Rangers was the TFS Upgrade Guide. The last part of my contributions for the upgrade guide was a PowerShell script that could help you easily upgrade your process templates (or at least publish them) after you have made the changes required to make them compatible with TFS 2013. And for some reason it wasn't until then that I made the script target multiple team projects in the same collection.


The latest small modifications to the script were made for the vsarSAFe project, which looks at how to modify your process templates to make them SAFe aware. If you aren't familiar with SAFe, it stands for Scaled Agile Framework. As I'm writing this we are showing as Delayed on the Flight Plan, but we will be landing soon Open-mouthed smile


Most of the changes included here were just around adding comments and cleaning the script up a bit to make it easier to read.

So what does the script look like?

The final script (as it is now) looks like below

# Copyright © Microsoft Corporation.  All Rights Reserved.
# This code released under the terms of the
# Microsoft Public License (MS-PL, http://opensource.org/licenses/ms-pl.html.)
$server = "MyTfsServer"
$port = 8080
$virtualDirectory = "tfs"
$CollectionName = "DefaultCollection"
$TeamProjectNames = @("Team Project 1", "Team Project 2", "Team Project 7", "Sample Scrum Project 1")
$ProcessTemplateRoot = "C:\templates\Microsoft Visual Studio Scrum 2013.3"

$CollectionUrl = "http://$($server)$(if ($port -ne 80) { ":$port" })$(if (![string]::IsNullOrEmpty($virtualDirectory)) { "/$virtualDirectory" })/$($CollectionName)"
$API_Version = "12.0"

# don't edit below this line

#get a reference to the witadmin executable path for the current api version
$WitAdmin = "${env:ProgramFiles(x86)}\Microsoft Visual Studio $API_Version\Common7\IDE\witadmin.exe"

#if there is a file with the name GlobalLists-ForImport.xml import it as Global List info for the current collection
if (Test-Path "$ProcessTemplateRoot\GlobalLists-ForImport.xml")
{
    Write-Host "Importing GlobalLists-ForImport.xml"
    & $WitAdmin importgloballist /collection:$CollectionUrl /f:"$ProcessTemplateRoot\GlobalLists-ForImport.xml"
}

#get a reference to all work item type definitions
$wit_TypeDefinitions = Get-ChildItem "$ProcessTemplateRoot\WorkItem Tracking\TypeDefinitions\*.*" -include "*.xml"

#get a reference to all work item link types
$witd_LinkTypes = Get-ChildItem "$ProcessTemplateRoot\WorkItem Tracking\LinkTypes\*.*" -include "*.xml"

#import each Link Type for the $CollectionName
foreach($witd_LinkType in $witd_LinkTypes)
{
    Write-Host "Importing $($witd_LinkType.Name)"
    & $WitAdmin importlinktype /collection:$CollectionUrl /f:$($witd_LinkType.FullName)
}

foreach ($TeamProjectName in $TeamProjectNames)
{
    Write-Host "Upgrading $TeamProjectName."

    #import each Type Definition for the $TeamProjectName
    foreach($wit_TypeDefinition in $wit_TypeDefinitions)
    {
        Write-Host "Importing $($wit_TypeDefinition.Name)"
        & $WitAdmin importwitd /collection:$CollectionUrl /p:$TeamProjectName /f:$($wit_TypeDefinition.FullName)
    }

    #import work item categories for the $TeamProjectName
    & $WitAdmin importcategories /collection:$CollectionUrl /p:$TeamProjectName /f:"$ProcessTemplateRoot\WorkItem Tracking\Categories.xml"

    #import work item process configuration for the $TeamProjectName
    & $WitAdmin importprocessconfig /collection:$CollectionUrl /p:$TeamProjectName /f:"$ProcessTemplateRoot\WorkItem Tracking\Process\ProcessConfiguration.xml"
}
Write-Host "Done upgrading team projects"

This script now targets unlimited team projects in one team project collection and updates the categories, configuration, global lists and link types. You can grab the script off GitHub as well under my Gists (upgrade-tfs-2013-process-templates.ps1).

This takes care of all the things I need when making process template changes: I now make whatever changes I need, run the script and check my changes in the browser. It doesn't get much easier than this, but if you have an easier way do let me know Smile.


So you have a project that uses Application Insights and want to share it with the world. Sure, you just post it to CodePlex, GitHub or some other place that allows publishing of code, don't you?


Something that you don't want to do is share your code without removing some of the values under the ComponentSettings node in your ApplicationInsights.config. I see 3 possible ways of doing this, each possibly suited to a specific scenario.

1.) Delete and publish

When: I foresee this scenario being for when you have not made any custom changes to your Application Insights configuration.

The first way is to simply delete the ApplicationInsights.config and publish the source code.


This will mean that when somebody who downloads your code wants to run it and use the Application Insights bits, they will need to add new Application Insights configuration


After they have done that they can use your solution with Application Insights, no problem

2.) Blank out or token the ComponentSettings section

When: I see this option being for when you have made modifications to the Application Insights config that you feel other users would need in order to use App Insights effectively in the application.

You could, for example, specify that for the Development profile you don't want to collect user and machine names.


The pieces you will want to blank out or token are below

  • ComponentSettings\ComponentId
  • ComponentSettings\DevelopmentMode\ComponentId
  • ComponentSettings\DevelopmentMode\PortalURI
  • ComponentSettings\DevelopmentMode\DashboardId
  • ComponentSettings\AccountId
  • ComponentSettings\LicenseKey
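Pulled together, a tokened ComponentSettings section might look roughly like this (the element nesting is inferred from the list above, and the token names are just placeholders):

```xml
<ComponentSettings>
  <ComponentId>__COMPONENT_ID__</ComponentId>
  <AccountId>__ACCOUNT_ID__</AccountId>
  <LicenseKey>__LICENSE_KEY__</LicenseKey>
  <DevelopmentMode>
    <ComponentId>__DEV_COMPONENT_ID__</ComponentId>
    <PortalURI>__PORTAL_URI__</PortalURI>
    <DashboardId>__DASHBOARD_ID__</DashboardId>
  </DevelopmentMode>
</ComponentSettings>
```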


After you have done this you are good to share your code Smile

3.) Split source control

When: You want to share the code but also want to use the application for "real", e.g. a Windows Store application

What I have started doing for applications where I want to share the source but also use the app in the real world is connecting to public source control like CodePlex or GitHub and also to private source control like VSO. The way I do it is to have a public GitHub repo that contains all the logic for my app; this repo is then pushed into a VSO repo as well, where I do all my Application Insights stuff. Using this method allows me to work on my app normally with source control, and allows the community to be involved as well, without them having to worry about the App Insights configuration (sometimes you would want them to be adding App Insights code too, in which case you would just keep the config separate). Then when I publish to the store I don't have to add a bunch of App Insights code everywhere and keep it out of source control. All I need to do is push to my VSO account, merge any conflicts (which shouldn't happen if I'm only doing App Insights work in VSO), possibly add any additional telemetry that I want, and then finally publish to the store and everything is awesome Open-mouthed smile.
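In git terms the setup boils down to one working copy wired to two remotes; a sketch in a throwaway repo (the remote names and URLs are hypothetical):

```shell
# Demo in a throwaway repo: one working copy, two remotes
cd "$(mktemp -d)"
git init -q .

# Public repo that holds all the app logic
git remote add origin https://github.com/you/your-app.git
# Private VSO repo where the App Insights configuration lives
git remote add vso https://you.visualstudio.com/DefaultCollection/_git/your-app

git remote -v
```

From there `git push origin master` and `git push vso master` keep the two in sync.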


These are just my thoughts and there are probably other ways that people currently do it. One thing you don't want to do is share your keys, because then you need to reset them and then update all the applications in the wild connected to your account.

If you have other ways of handling this today, give me a shout at @GordonBeeming with some details Smile