
So if you haven't heard yet, VSO Extensions are now in a private preview, and you can sign up to get into the preview on the extensions integration site. These extensions are, in the shortest sentence, a supported way of customizing VSO that will replace any of the "hacky" extensions you may be playing around with at the moment, like Tiago Pascoal's Task Board Enhancer, or maybe you have even created your own following steps similar to what I show in my TFS 2013 Customization book.

This post aims to give you a super quick guide on how to get started; you will need to go through the integrations site to really get into the detail. It covers most of what you will find in other getting-started posts, but gives you a little something extra that most posts wouldn't have, like tips on free stuff :)

File, New Project

The easiest way to get something basic going against VSO is to just create a new project.

Create/Configure Project

We are going to create a new TypeScript project.

[Screenshot: New Project dialog]

You should have something like below now

[Screenshot: the new TypeScript project in Solution Explorer]

Configure SSL in IIS Express

With the VSO Time Ticker project selected, head over to the Properties window.

[Screenshot: project Properties window]

Change SSL Enabled to True

[Screenshot: SSL Enabled set to True]

Take note of the SSL URL that is now available to you.

Add an extension.json

Let's add an extension.json manifest file that will be used to inform VSO what our project is actually about.

[Screenshot: Add New Item dialog]

and drop in the content below, replacing the baseUri property so it uses the SSL port you have been assigned for the project.

{
    "namespace": "VSO-Time-Ticker",
    "version": "0.0.1",
    "name": "Time Ticker",
    "description": "A simple extension for Visual Studio Online of a Time Ticker",
    "provider": {
        "name": "Gordon Beeming"
    },
    "baseUri": "https://localhost:44300/",
    "icon": "https://localhost:44300/images/some-icon.png",
    "contributions": {
        "vss.web#hubs": [
            {
                "id": "time",
                "name": "Time",
                "groupId": "home",
                "order": 22,
                "uri": "index.html",
                "usesSdk": true,
                "fullPage": false
            }
        ]
    }
}

Get the SDK

Navigate to the samples project on GitHub and grab the VSS.SDK.js file. Save a copy of it to a Scripts folder inside an sdk folder and add it to your project.

[Screenshot: VSS.SDK.js added to the project under sdk/Scripts]

Include our App js files

While we're here, let's build the project, show all files, and add the app.js and app.js.map files to the project.

[Screenshots: including app.js and app.js.map in the project]
If you are using source control, you should at this point undo the add of those files and then also exclude them from source control; otherwise you may get a weird error when it comes time to build your project on a build server (TypeScript : Emit Error: Write to file failed...).
[Screenshots: undoing the pending add and excluding the files from source control]
The reason we want these as part of the solution is so that when we do a web deploy later they are deployed as well :)
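If you are using Git rather than TFVC, an ignore file handles the exclusion part for you; a minimal sketch (the paths assume the scripts folder the files get moved to later in this post):

# .gitignore (or .tfignore) sketch: keep the compiled TypeScript output out of source control
scripts/app.js
scripts/app.js.map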

Add our app icon

Make an images folder and add an image called some-icon.png to it.
[Screenshot: images folder with some-icon.png]

Move App files

Move your App.ts, App.js and App.js.map into a scripts folder. If you are using source control you might need to redo the undo-and-exclude for those generated files.

[Screenshot: App files moved into the scripts folder]

Setup index.html

This is a rather simple step: replace the reference to app.js with one to sdk/Scripts/VSS.SDK.js, so it will look something like the below.

[Screenshot: index.html script references]
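The reference in the head ends up as a single script tag (the same line appears in the full index.html listing further down):

<script src="sdk/Scripts/VSS.SDK.js"></script>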

Add the following script just inside your body tag

<script type="text/javascript">
    // Initialize the VSS sdk
    VSS.init({
        setupModuleLoader: true,
        moduleLoaderConfig: {
            paths: {
                "Scripts": "scripts"
            }
        }
    });

    // Wait for the SDK to be initialized
    VSS.ready(function () {
        require(["Scripts/app"], function (app) { });
    });
</script>

So at this stage your full index.html page will look like
<!DOCTYPE html>

<html lang="en">
<head>
    <meta charset="utf-8" />
    <title>TypeScript HTML App</title>
    <link rel="stylesheet" href="app.css" type="text/css" />
    <script src="sdk/Scripts/VSS.SDK.js"></script>
</head>
<body>
    <script type="text/javascript">
        // Initialize the VSS sdk
        VSS.init({
            setupModuleLoader: true,
            moduleLoaderConfig: {
                paths: {
                    "Scripts": "scripts"
                }
            }
        });

        // Wait for the SDK to be initialized
        VSS.ready(function () {
            require(["Scripts/app"], function (app) { });
        });
    </script>
    <h1>TypeScript HTML App</h1>

    <div id="content"></div>
</body>
</html>

Update App.ts

In your App.ts file, remove the window.onload function and replace it with its body, so your App.ts file will look like below.

class Greeter {
    element: HTMLElement;
    span: HTMLElement;
    timerToken: number;

    constructor(element: HTMLElement) {
        this.element = element;
        this.element.innerHTML += "The time is: ";
        this.span = document.createElement('span');
        this.element.appendChild(this.span);
        this.span.innerText = new Date().toUTCString();
    }

    start() {
        this.timerToken = setInterval(() => this.span.innerHTML = new Date().toUTCString(), 500);
    }

    stop() {
        clearTimeout(this.timerToken);
    }
}

var el = document.getElementById('content');
var greeter = new Greeter(el);
greeter.start();

Run App

Running your app with Ctrl + F5, you will get a blank app that does nothing :)

[Screenshot: the app running in Internet Explorer]

Change the URL to point to the SSL version of your site, just to make sure everything is working.

[Screenshot: the app running over the SSL URL]

Our app is now complete :D

Install your extension

If you have signed up for the private preview you should see a tab in the admin section of your account called Extensions like so

[Screenshot: Extensions tab in the account admin section]

Click Install, and then Browse 

[Screenshot: Install extension dialog]
browse for your extension.json file
[Screenshot: browsing for extension.json]

Click Open and then OK

[Screenshot: confirming the install]

Your extension is now installed

[Screenshot: the installed extension]

View it on VSO

Go to a team project home page and you should now see a Time hub, click on it

[Screenshot: the Time hub on the team project home page]

Once you land there you will see the time :)

[Screenshot: the Time hub showing the current time]

That's one extension in the bag, but having this run on your local machine is probably not what you would want, because nobody else can see it.

Publishing your app

You could buy an SSL certificate, but that costs a lot, and most people don't have that kind of money lying around for fun apps and extensions, so we'll turn to Azure. Right click on the project and click Publish.

[Screenshot: Publish menu option]

If you have already set up an Azure site you can import the publish settings, but I haven't, so I'm going to click on Microsoft Azure Web Apps.

[Screenshot: Publish Web dialog]

and then click on New (again if you have a site already you can select it in this list)

[Screenshot: Select Existing Web App dialog]

Select a name and click Create

[Screenshot: Create Web App on Microsoft Azure dialog]

It will now take a little while to set up your Azure resources

[Screenshot: the Azure web app being created]

and then automagically configure everything you need :). Click Publish.

[Screenshot: Publish Web settings]

After the publish has finished your site will launch.

[Screenshot: the published site over http]

Something you will notice is that this is http and not https, which we said earlier we require. So let's see what happens if we add an s in there :)

[Screenshot: the published site over https]

Everything still works :D

Last bit of manifest changes

Now that we have a publicly accessible website running on https (for FREE), we can take that URL and replace what we currently have in our manifest, so it will now look like this:

{
    "namespace": "VSO-Time-Ticker",
    "version": "0.0.2",
    "name": "Time Ticker",
    "description": "A simple extension for Visual Studio Online of a Time Ticker",
    "provider": {
        "name": "Gordon Beeming"
    },
    "baseUri": "https://vso-hello-world.azurewebsites.net/",
    "icon": "https://vso-hello-world.azurewebsites.net/images/some-icon.png",
    "contributions": {
        "vss.web#hubs": [
            {
                "id": "time",
                "name": "Time",
                "groupId": "home",
                "order": 22,
                "uri": "index.html",
                "usesSdk": true,
                "fullPage": false
            }
        ]
    }
}

Re-install your extension

[Screenshot: re-installing the extension]

and refresh your extension in VSO

[Screenshot: the refreshed Time hub]

You will notice that it obviously still works :). If you close Visual Studio and it still works, you know it's working :), and I suppose you can check Fiddler to see where it's reading the files from.

Links

For more info on VSO Extensions visit http://aka.ms/vsoextensions.

A pretty neat getting started post is also on that site at https://www.visualstudio.com/en-us/integrate/extensions/get-started/visual-studio.

Microsoft has a project out on GitHub as well that is quite advanced in the APIs it uses; it can be found at https://github.com/Microsoft/vso-team-calendar.

If you want a light overview of everything, you can get their VSO Extension Samples on GitHub as well at https://github.com/Microsoft/vso-extension-samples.

The complete sample code for this post is also on GitHub at https://github.com/Gordon-Beeming/VSO-Time-Ticker.


In the new Azure Portal you create all your resources in Resource Groups. The Azure SDK also includes a module called AzureResourceManager; by default the module loaded by the Azure SDK is AzureServiceManagement. A blurb from one of the Azure documentation pages reads:

"The Azure and Azure Resource Manager modules are not designed to be used in the same Windows PowerShell session. To make it easy to switch between them, we have added a new cmdlet, Switch-AzureMode, to the Azure Profile module."

Azure Resource Manager Commands

To get a list of all commands that are available for the AzureResourceManager module you can run the command

Get-Command -Module AzureResourceManager | Get-Help | Format-Table Name, Synopsis

this will return

Add-AzureAccount - Adds the Azure account to Windows PowerShell
Add-AzureEnvironment - Creates an Azure environment
Clear-AzureProfile - Clears an Azure profile
Disable-AzureSqlDatabaseDirectAccess - Disables the option to directly access to an Azure Sql database (without auditing)
Disable-AzureSqlDatabaseServerDirectAccess - Disables direct access to all Azure Sql databases that use the audit policy of a Sql databa...
Enable-AzureSqlDatabaseDirectAccess - Enables the option to directly access to an Azure Sql database (with auditing)
Enable-AzureSqlDatabaseServerDirectAccess - Enables direct access to all Azure Sql databases that use the audit policy of a Sql databas...
Get-AzureAccount - Gets Azure accounts that are available to Azure PowerShell.
Get-AzureADGroup - Filters active directory groups.
Get-AzureADGroupMember - Get a group members.
Get-AzureADServicePrincipal - Filters active directory service principals.
Get-AzureADUser - Filters active directory users.
Get-AzureBatchAccount
Get-AzureBatchAccountKeys
Get-AzureDataFactory - Gets information about Data Factory.
Get-AzureDataFactoryGateway - Gets information about logical gateways in Data Factory.
Get-AzureDataFactoryHub - Gets information about hubs in Data Factory.
Get-AzureDataFactoryLinkedService - Gets information about linked services in Data Factory.
Get-AzureDataFactoryPipeline - Gets information about pipelines in Data Factory.
Get-AzureDataFactoryRun - Gets runs for a data slice of a table in Data Factory.
Get-AzureDataFactorySlice - Gets data slices for a table in Data Factory.
Get-AzureDataFactoryTable - Gets information about tables in Data Factory.
Get-AzureEnvironment - Gets Azure environments
Get-AzureLocation - Gets the resource types and the Azure data center locations that support them.
Get-AzurePublishSettingsFile - Downloads the publish settings file for an Azure subscription.
Get-AzureRedisCache - Gets details about a single cache or all caches in the specified resource group or all cach...
Get-AzureRedisCacheKey - Gets the accesskeys for the specified redis cache.
Get-AzureResource - Gets Azure resources
Get-AzureResourceGroup - Gets Azure resource groups
Get-AzureResourceGroupDeployment - Gets the deployments in a resource group.
Get-AzureResourceGroupGalleryTemplate - Gets resource group templates in the gallery
Get-AzureResourceGroupLog - Gets the deployment log for a resource group
Get-AzureRoleAssignment - Filters role assignments.
Get-AzureRoleDefinition - Filters role definitions.
Get-AzureSqlDatabaseAuditingPolicy - Gets an Azure Sql database's auditing policy.
Get-AzureSqlDatabaseServerAuditingPolicy - Gets an Azure Sql server's auditing policy.
Get-AzureSubscription - Gets Azure subscriptions in Azure account.
Get-AzureTag - Gets predefined Azure tags
Import-AzurePublishSettingsFile - Imports a publish settings file that lets you manage your Azure accounts in Windows PowerSh...
New-AzureBatchAccount
New-AzureBatchAccountKey
New-AzureDataFactory - Creates a data factory.
New-AzureDataFactoryEncryptValue - Encrypts sensitive data.
New-AzureDataFactoryGateway - Creates a gateway for Data Factory.
New-AzureDataFactoryGatewayKey - Creates a gateway key for Data Factory.
New-AzureDataFactoryHub - Creates a hub for Data Factory.
New-AzureDataFactoryLinkedService - Links a data store or a cloud service to Data Factory.
New-AzureDataFactoryPipeline - Creates a pipeline in Data Factory.
New-AzureDataFactoryTable - Creates a table in Data Factory.
New-AzureRedisCache - Creates a new redis cache.
New-AzureRedisCacheKey - Regenerates the access key of a redis cache.
New-AzureResource - Creates a new resource in a resource group
New-AzureResourceGroup - Creates an Azure resource group and its resources
New-AzureResourceGroupDeployment - Add an Azure deployment to a resource group.
New-AzureRoleAssignment - Create a role assignment to some principals at a given scope.
New-AzureTag - Creates a predefined Azure tag or adds values to an existing tag
Remove-AzureAccount - Deletes an Azure account from Windows PowerShell.
Remove-AzureBatchAccount
Remove-AzureDataFactory - Removes a data factory.
Remove-AzureDataFactoryGateway - Removes a gateway from Data Factory.
Remove-AzureDataFactoryHub - Removes a hub from Data Factory.
Remove-AzureDataFactoryLinkedService - Removes a linked service from Data Factory.
Remove-AzureDataFactoryPipeline - Removes a pipeline from Data Factory.
Remove-AzureDataFactoryTable - Removes a table from Data Factory.
Remove-AzureEnvironment - Deletes an Azure environment from Windows PowerShell
Remove-AzureRedisCache - Remove redis cache if exists.
Remove-AzureResource - Deletes a resource
Remove-AzureResourceGroup - Deletes a resource group.
Remove-AzureRoleAssignment - Removes a role assignment.
Remove-AzureSqlDatabaseAuditing - Disables an Azure Sql database's auditing.
Remove-AzureSqlDatabaseServerAuditing - Disables auditing of all the databases that rely on the auditing policy of the given databa...
Remove-AzureSubscription - Deletes an Azure subscription from Windows PowerShell.
Remove-AzureTag - Deletes predefined Azure tags or values
Resume-AzureDataFactoryPipeline - Resumes a suspended pipeline in Data Factory.
Save-AzureDataFactoryLog - Downloads log files from HDInsight processing.
Save-AzureResourceGroupGalleryTemplate - Saves a gallery template to a JSON file
Select-AzureSubscription - Changes the current and default Azure subscriptions
Set-AzureBatchAccount
Set-AzureDataFactoryGateway - Sets the description for a gateway in Data Factory.
Set-AzureDataFactoryPipelineActivePeriod - Configures the active period for data slices.
Set-AzureDataFactorySliceStatus - Sets the status of slices for a table in Data Factory.
Set-AzureEnvironment - Changes the properties of an Azure environment
Set-AzureRedisCache - Set redis cache updatable parameters.
Set-AzureResource - Changes the properties of an Azure resource.
Set-AzureResourceGroup - Changes the properties of a resource group
Set-AzureSqlDatabaseAuditingPolicy - Sets an Azure Sql database's auditing policy.
Set-AzureSqlDatabaseServerAuditingPolicy - Sets an Azure Sql database server's auditing policy.
Set-AzureSubscription - Creates or changes an Azure subscription
Stop-AzureResourceGroupDeployment - Cancels a resource group deployment
Suspend-AzureDataFactoryPipeline - Suspends a pipeline in Data Factory.
Switch-AzureMode - Switches between the Azure and Azure Resource Manager modules
Test-AzureResourceGroupTemplate - Detects errors in a resource group template or template parameters
Use-AzureSqlDatabaseServerAuditingPolicy - Marks an Azure Sql database as using its server's auditing policy.

Step 0

As a step 0, let's open up everything we need.

Azure SDK

So to get started you need to install the Azure SDK, which you can get from the SDK downloads page. I am using Azure SDK version 2.5 for this post.

PowerShell ISE

Open the PowerShell ISE using Win + R and then %WINDIR%\system32\WindowsPowerShell\v1.0\powershell_ise.exe. You can also use the standard PowerShell window if you want.

Azure Management Portal

Open and sign in to the Azure Management Portal (https://manage.windowsazure.com/)

Azure Portal

Open and sign in to the Azure Portal (http://portal.azure.com/)

Step 1

Let's start by getting the Azure Management Portal pieces out of the way. In the Azure Management Portal we will just be creating a new AAD user that we can use to automatically log in through PowerShell. If you want to use an MSA, just leave the credentials bit off in Step 3 and you will receive a prompt for credentials, at which time you can use MSA or AAD credentials, and you can then move on to Step 2. If you want to create the new user, follow my other post Creating a new Azure Active Directory User to create a user for this demo.
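If you do go the prompt route, the call in Step 3 shrinks to just the cmdlet on its own; a minimal sketch:

# With no credential object passed, Add-AzureAccount pops up an interactive sign-in
# where you can use either MSA or AAD credentials
Add-AzureAccount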

Step 2

In the PowerShell ISE we will kick off by switching the Azure SDK to use the resource manager module.

Switch-AzureMode -Name AzureResourceManager
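If you need the classic service management cmdlets again afterwards, the same cmdlet switches back (the module names are the ones from the documentation blurb quoted above):

# Switch back to the default service management module when you are done with the resource manager
Switch-AzureMode -Name AzureServiceManagement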

Step 3

After we have switched to the AzureResourceManager module we are able to use the commands that are part of it. Let's start off by adding the Azure account we just created, using the snippet below (I keep my username and password in txt files and reference them from multiple sample scripts for ease of use, but you could place them straight in the script if you wanted to).

[string]$currentUsername = Get-Content "Z:\_PowerShell\Azure\currentUser.txt"
[string]$currentPassword = Get-Content "Z:\_PowerShell\Azure\currentPass.txt"
$secpasswd = ConvertTo-SecureString $currentPassword -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($currentUsername, $secpasswd)
Add-AzureAccount -credential $mycreds

 

This would then return the account added

[Screenshot: Add-AzureAccount output]

At this point, if you run Get-AzureAccount it will show you this account and any others you have previously added.

[Screenshot: Get-AzureAccount output]
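If the account has access to more than one subscription, it is worth making sure the right one is selected before creating anything; a minimal sketch (the subscription name here is made up):

# List the subscriptions the account brought in, then pick the one the resources should land in
Get-AzureSubscription | Format-Table SubscriptionName, SubscriptionId
Select-AzureSubscription -SubscriptionName "My MSDN Subscription"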

Step 4

Next we'll explore some metadata for Step 5.

Getting a Resource Group Template

We can run the command

Get-AzureResourceGroupGalleryTemplate

and it will return a list of all the current resource group templates that we can create. For the list at the time of writing this post you can refer to one of my GitHub Gists at https://binary-stuff.com/gist/ea0884f5ba00c62a83e4. We are going to be using one of the templates found around line 2220 of that Gist, which is the website and SQL database template.

[Screenshot: the website and SQL database template in the gallery list]
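You can also narrow the list down in the shell instead of scrolling through the Gist; a minimal sketch, assuming the gallery items expose Identity and Publisher properties (they did in the SDK 2.5 timeframe, but treat the property names as an assumption):

# Filter the gallery down to the website + SQL database template used in Step 5
Get-AzureResourceGroupGalleryTemplate |
    Where-Object { $_.Identity -like "Microsoft.WebSiteSQLDatabase*" } |
    Format-Table Identity, Publisher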

Get a list of Azure Resource Locations

When creating Azure resources you need to specify where those resources are located geographically. Now you could guess or Bing what the locations are, or you could use the handy command

Get-AzureLocation

which will return a list of every location that every resource type is available in. This list, as with the one above, can be found as one of my GitHub Gists at https://binary-stuff.com/gist/87aad8bfbbbcd7421b83. From the long list of locations we are just going to use West Europe for all our resources.
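Because the list is long, it can be easier to browse it in a grid than in the console; a quick sketch:

# Pop the location list into a sortable, filterable grid instead of scrolling console output
Get-AzureLocation | Out-GridView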

Step 5

At this point we have all the info we need to create our resources using the resource manager in PowerShell, so we'll use the snippet below.

$ResourceManagerTest = "ResourceManagerTest"
$AzureDataCenterLocation = "West Europe"
$administratorLoginPassword = ConvertTo-SecureString "$($ResourceManagerTest)DbP@ssw0rd" -AsPlainText -Force
New-AzureResourceGroup -Name "$($ResourceManagerTest)" `
-Location "$AzureDataCenterLocation" `
-GalleryTemplateIdentity Microsoft.WebSiteSQLDatabase.0.2.2-preview `
-siteName "$($ResourceManagerTest)Site" `
-hostingPlanName "$($ResourceManagerTest)Plan" `
-siteLocation "$AzureDataCenterLocation" `
-serverName "$($ResourceManagerTest.ToLowerInvariant())dbserver" `
-serverLocation "$AzureDataCenterLocation" `
-administratorLogin "$($ResourceManagerTest)DbLogin" `
-administratorLoginPassword $administratorLoginPassword `
-databaseName "$($ResourceManagerTest)DbName" `
-Verbose

This will then go off and create our website and database, along with an Application Insights resource that we can use for our website when we deploy it. When the command finishes you should see output similar to the screenshot below, which, because we specified the -Verbose flag, tells us the status of each resource created.

[Screenshot: New-AzureResourceGroup verbose output]
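You can also confirm from PowerShell that everything landed in the group before heading to the portal; a minimal sketch using cmdlets from the table above (the exact output shape may differ between SDK versions):

# Show the resource group and the resources that were deployed into it
Get-AzureResourceGroup -Name $ResourceManagerTest
Get-AzureResource -ResourceGroupName $ResourceManagerTest | Format-Table Name, ResourceType, Location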

At this point we really are finished with the subject of this blog post, but we'll continue on to explore what we have just created, how we would have had to create it using the Azure Portal, and how we can remove the resource group using PowerShell.

Step 6

Open the Azure Portal. The first thing you should notice is that you already have a notification and when you open that it says that a deployment has succeeded.

[Screenshot: deployment succeeded notification]

Clicking on the success notification will open that resource group

[Screenshot: the resource group opened in the portal]

From here you can use the resources 100% as if you created them in the portal.

Step 7

Before we see how we would have had to create those resources manually, let's remove the resource group from our subscription. To do this we run the simple command below, keeping in mind that the $ResourceManagerTest variable should still be set from the previous command in Step 5.

Remove-AzureResourceGroup -Name $ResourceManagerTest -Force -Verbose

This will then proceed to remove the resource group. Again we specified the -Verbose flag, but this time we don't get as much info; it just tells us that it's deleting the resource group and then comes back when it's done.

[Screenshot: Remove-AzureResourceGroup verbose output]
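If you want to double check from the shell, listing the resource groups again should no longer include it:

# The removed group should no longer show up here
Get-AzureResourceGroup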

Step 8

The last thing that I'll show in this post is how we would have had to do this manually (or at least where to find the template). Back in the Azure Portal, click New in the bottom left corner and then click on Everything.

[Screenshot: New > Everything in the portal]

Next click on the Web category/section and then you'll see the Website + SQL option which is what we created

[Screenshot: Website + SQL option in the Web category]

clicking on that option will give you a little info about the template and from here you'd just click on create

[Screenshot: template info blade with the Create button]

From here you will configure the Resource Group Name, all the settings for your Website resource and SQL resource and then click create.

[Screenshot: configuring the resource group, website and SQL settings]

When that process completes you will be in the same place as that small PowerShell script gets you to :)

Thoughts and comments

If you have any thoughts or comments about any of the pieces of this post please do share below.


If you've been using Application Insights for a while now, you would have noticed that with recent Visual Studio updates the Application Insights SDK has been updated to a new version that logs AI data into the new Azure Portal. From this, if you use Windows Store apps, you would probably have noticed that if you instrument a Windows Store application with the new SDK you can't actually find your data anywhere, although you can see it being logged in the Visual Studio Output window and using Fiddler. The good news is that you will soon be able to have as great an experience with the new AI SDK for Windows Store apps as you currently do for your web applications; the better news is that below you'll see how to use the new SDK for your Store apps and see the data in the Azure Portal.

Create a new AI Resource

First off head over to the Azure Portal and create a new Application Insights Resource

[Screenshot: creating a new Application Insights resource]

Give it a name and choose a Resource Group; it's fine that the Application Type is set to ASP.NET web application.

[Screenshot: naming the resource and choosing a resource group]

After the resource has been created you are ready to add AI to your store app in Visual Studio.

Adding AI to your Windows Store App

Open your store app in Visual Studio and make sure you have the latest AI SDK installed (2.4 at the moment) and no other AI artifacts left in your project. Right click on the store app solution and click on Add Application Insights Telemetry...

[Screenshot: Add Application Insights Telemetry menu option]

At this point, if you haven't signed in, you will be prompted to sign in; click the Gain insights into your application now button.

[Screenshot: sign-in prompt]

After you have signed in you will be presented with the Add Application Insights to Project window

[Screenshot: Add Application Insights to Project window]

On this window click on Use advanced mode

[Screenshot: Use advanced mode link]

which will now ask you for all your AI settings

[Screenshot: advanced mode settings fields]

We'll get all of these from the portal. Open your AI resource in the portal and click on Properties

[Screenshot: AI resource Properties blade]

This will show you your Subscription ID, Resource Group, Application Insights Resource (name) and Instrumentation Key. Copy these and place them in the window. Once you have filled in all of those fields and clicked Add Application Insights To Project you are ready to run your app and see some data. 

[Screenshot: telemetry from the store app showing in the portal]

For now it's good enough to see what's happening in your app, and some insight is infinitely better than no insight =)


I used to make a lot of TFS customizations and had to apply the template changes to multiple team projects, which took a bit of time. Depending on the method you use it could be a quick or a loooong process :). When I first started doing customizations I used the TFS Power Tools to upload changes, which is a lot of effort because you are uploading one work item definition at a time into one team project.

Using Command Line

After a while I started using the command line (witadmin importwitd). This was slightly faster, but I found myself keeping a list of commands in a txt file and then searching for the one I needed when needed and running it; a typical command is sketched below.
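The commands I kept around looked roughly like this (the server, project and file path are made-up examples; the switches are the same ones the scripts below use):

# Import a single work item type definition into a single team project
witadmin importwitd /collection:http://myserver:8080/tfs/DefaultCollection /p:MyTeamProject /f:"C:\Templates\Scrum\WorkItem Tracking\TypeDefinitions\Bug.xml"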

A Basic PowerShell Script

I follow Martin Hinshelwood on various social media, and one day he posted a blog post titled Upgrading to Visual Studio Scrum 3.0 process template in TFS 2013. Although at this point I had been playing a lot with upgrading from TFS 2012 to TFS 2013, there was one piece of magic in that post that changed the way I applied process template changes from then until today. It was a script that simply looped through the work item definitions in a set folder and imported them into TFS:

Param(
    [string] $CollectionUrlParam = $(Read-Host -prompt "Collection (enter to pick):"),
    [string] $TeamProjectName = $(Read-Host -prompt "Team Project:"),
    [string] $ProcessTemplateRoot = $(Read-Host -prompt "Process Template Folder:")
)

$TeamProjectName = "teamswithareas"
$ProcessTemplateRoot = "C:\Users\mrhinsh\Desktop\TfsProcessTemplates\Microsoft Visual Studio Scrum 3.0 - Preview"
$CollectionUrl = "http://kraken:8080/tfs/tfs01"

$TFSConfig = "${env:ProgramFiles}\Microsoft Team Foundation Server 11.0\Tools\TFSConfig.exe"
$WitAdmin = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 12.0\Common7\IDE\witadmin.exe"

$witds = Get-ChildItem "$ProcessTemplateRoot\WorkItem Tracking\TypeDefinitions"

foreach ($witd in $witds)
{
    Write-Host "Importing $witd"
    & $WitAdmin importwitd /collection:$CollectionUrl /p:$TeamProjectName /f:$($witd.FullName)
}
& $WitAdmin importcategories /collection:$CollectionUrl /p:$TeamProjectName /f:"$ProcessTemplateRoot\WorkItem Tracking\Categories.xml"
& $WitAdmin importprocessconfig /collection:$CollectionUrl /p:$TeamProjectName /f:"$ProcessTemplateRoot\WorkItem Tracking\Process\ProcessConfiguration.xml"

Small Script Evolution

This worked for a while, but I still had to keep a couple of PowerShell files for the different projects I wanted to import the process templates into. Over the next while I ended up adding a couple of additions to the script, like publishing new global lists

#if there is a file with the name GlobalLists-ForImport.xml import it as Global List info for the current collection
if (Test-Path "$ProcessTemplateRoot\GlobalLists-ForImport.xml")
{
    Write-Host "Importing GlobalLists-ForImport.xml"
    & $WitAdmin importgloballist /collection:$CollectionUrl /f:"$ProcessTemplateRoot\GlobalLists-ForImport.xml"
}

and importing link types

#import each Link Type for the $CollectionName
foreach ($witd_LinkType in $witd_LinkTypes)
{
    Write-Host "Importing $($witd_LinkType.Name)"
    & $WitAdmin importlinktype /collection:$CollectionUrl /f:$($witd_LinkType.FullName)
}

ALM Rangers - vsarUpgradeGuide & vsarSAFe

vsarUpgradeGuide

The first project I joined after joining the ALM Rangers was the TFS Upgrade Guide. The last part of my contributions for the upgrade guide was a PowerShell script that could help you easily upgrade your process templates (or at least publish them) after you have made the changes required to make them compatible with TFS 2013. And for some reason it wasn't until then that I made the script target multiple team projects in the same collection.

vsarSAFe

The latest small modifications that were made to the script were for the project vsarSAFe, which looks at how to modify your process templates to make them SAFe aware. If you aren't familiar with SAFe, it stands for Scaled Agile Framework. As I'm writing this we are showing up as Delayed on the Flight Plan, but will be landing soon :D

[Screenshot: vsarSAFe status on the ALM Rangers Flight Plan]

Most of the changes included here were just around adding comments and cleaning the script up a bit to make it easier to read.

So what does the script look like?

The final script (as it is now) looks like below

# Copyright © Microsoft Corporation.  All Rights Reserved.
# This code released under the terms of the
# Microsoft Public License (MS-PL, http://opensource.org/licenses/ms-pl.html.)
#
#config
$server = "MyTfsServer"
$port = 8080
$virtualDirectory = "tfs"
$CollectionName = "DefaultCollection"
$TeamProjectNames = @("Team Project 1", "Team Project 2", "Team Project 7", "Sample Scrum Project 1")
$ProcessTemplateRoot = "C:\templates\Microsoft Visual Studio Scrum 2013.3"

$CollectionUrl = "http://$($server)$(if ($port -ne 80) { ":$port" })$(if (![string]::IsNullOrEmpty($virtualDirectory)) { "/$virtualDirectory" })/$($CollectionName)"
$API_Version = "12.0"

#----------------------------
# don't edit below this line
#----------------------------

#get a reference to the witadmin executable path for the current api version
$WitAdmin = "${env:ProgramFiles(x86)}\Microsoft Visual Studio $API_Version\Common7\IDE\witadmin.exe"

#if there is a file with the name GlobalLists-ForImport.xml import it as Global List info for the current collection
if (Test-Path "$ProcessTemplateRoot\GlobalLists-ForImport.xml")
{
    Write-Host "Importing GlobalLists-ForImport.xml"
    & $WitAdmin importgloballist /collection:$CollectionUrl /f:"$ProcessTemplateRoot\GlobalLists-ForImport.xml"
}

#get a reference to all work item type definitions
$wit_TypeDefinitions = Get-ChildItem "$ProcessTemplateRoot\WorkItem Tracking\TypeDefinitions\*.*" -include "*.xml"

#get a reference to all work item link types
$witd_LinkTypes = Get-ChildItem "$ProcessTemplateRoot\WorkItem Tracking\LinkTypes\*.*" -include "*.xml"

#import each Link Type for the $CollectionName
foreach ($witd_LinkType in $witd_LinkTypes)
{
    Write-Host "Importing $($witd_LinkType.Name)"
    & $WitAdmin importlinktype /collection:$CollectionUrl /f:$($witd_LinkType.FullName)
}

foreach ($TeamProjectName in $TeamProjectNames)
{
    Write-Host "Upgrading $TeamProjectName."

    #import each Type Definition for the $TeamProjectName
    foreach ($wit_TypeDefinition in $wit_TypeDefinitions)
    {
        Write-Host "Importing $($wit_TypeDefinition.Name)"
        & $WitAdmin importwitd /collection:$CollectionUrl /p:$TeamProjectName /f:$($wit_TypeDefinition.FullName)
    }

    #import work item categories for the $TeamProjectName
    & $WitAdmin importcategories /collection:$CollectionUrl /p:$TeamProjectName /f:"$ProcessTemplateRoot\WorkItem Tracking\Categories.xml"

    #import work item process configuration for the $TeamProjectName
    & $WitAdmin importprocessconfig /collection:$CollectionUrl /p:$TeamProjectName /f:"$ProcessTemplateRoot\WorkItem Tracking\Process\ProcessConfiguration.xml"
}
Write-Host "Done upgrading team projects"

This script now targets any number of team projects in one team project collection, and updates the type definitions, categories, process configuration, global lists and link types. You can grab the script off GitHub as well under my Gists (upgrade-tfs-2013-process-templates.ps1).

This takes care of all the things I need when making process template changes: I now make whatever changes I need, run the script, and check my changes in the browser. It doesn't get much easier than this, but if you have an easier way do let me know :)


So you have a project that uses Application Insights and you want to share it with the world. Sure, you just post it to CodePlex, GitHub or some other place that allows publishing of code, don't you?

NO!!!

Something that you don't want to do is share your code without removing some of the values under the ComponentSettings node in your ApplicationInsights.config. I see three possible ways of doing this, each possibly suited to a specific scenario.

1.) Delete and publish

When: I foresee this scenario being for when you have not made any custom changes to your Application Insights configuration.

The first way is to simply delete the ApplicationInsights.config and publish the source code.

[Screenshot: deleting ApplicationInsights.config]

This means that when somebody who downloads your code wants to run it and use the Application Insights bits, they will need to add new Application Insights configuration.

[Screenshot: adding new Application Insights configuration]

After they have done that they can use your solution with Application Insights no problem

2.) Blank out or token the ComponentSettings section

When: I see this option being for when you have made modifications to the Application Insights config that you feel other users would need in order to effectively use App Insights in the application.

You could, for example, specify that for the Development Profile you don't want to collect user and machine names.

[Screenshot: Development Profile settings in the AI config]

The pieces you will want to blank out or tokenize are listed below; a small script to automate it follows the list.

  • ComponentSettings\ComponentId
  • ComponentSettings\DevelopmentMode\ComponentId
  • ComponentSettings\DevelopmentMode\PortalURI
  • ComponentSettings\DevelopmentMode\DashboardId
  • ComponentSettings\AccountId
  • ComponentSettings\LicenseKey
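If you go the token route, you don't have to edit the file by hand every time; a minimal PowerShell sketch that blanks the values listed above before you push (it assumes the old ComponentSettings-based ApplicationInsights.config layout and just matches elements by name, so treat it as a starting point):

# Replace the identifying values in ApplicationInsights.config with tokens before sharing the code
$configPath = ".\ApplicationInsights.config"   # path to the config in your project (example)
[xml]$config = Get-Content $configPath
foreach ($name in "ComponentId", "PortalURI", "DashboardId", "AccountId", "LicenseKey")
{
    $config.SelectNodes("//*[local-name()='$name']") |
        ForEach-Object { $_.InnerText = "__$($name.ToUpperInvariant())__" }
}
$config.Save((Resolve-Path $configPath).Path)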

[Screenshot: tokenized ComponentSettings values]

After you have done this you are good to share your code :)

3.) Split source control

When: You want to share the code but also want to use the application for "real", e.g. a Windows Store application.

What I have started doing for applications that I want to share the source for, but also want to use in the real world, is connecting to a public source control host like CodePlex or GitHub and then also to a private one like VSO. The way I do it is to have a public GitHub repo that contains all the logic for my app; this repo is then pushed into a VSO repo as well, where I do all my Application Insights work. Using this method allows me to work on my app normally, as I would with any source control, and allows the community to be involved as well without them having to worry about the App Insights configuration (sometimes you would want them to be adding App Insights to the code as they go, in which case you would just keep the config separate). Then, when I publish to the store, I don't have to add a bunch of App Insights code everywhere and keep it out of source control. All I need to do is push to my VSO account, merge any conflicts (which shouldn't happen if I'm only doing App Insights changes in VSO), add any additional telemetry that I want, and then finally publish to the store, and everything is awesome :D

Conclusion

These are just my thoughts and there are probably other ways that people currently do it. One thing you don't want to do is share your keys, because then you need to reset them and update all the applications in the wild connected to your account.

If you have other ways of handling this today, give me a shout at @GordonBeeming with some details :)