Things hard and not so hard....

Folks - my blog has moved to

# Friday, 04 April 2014



Looks pretty snazzy :)

Friday, 04 April 2014 12:02:25 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Azure | Dev | General
# Monday, 03 June 2013

The time had come to invoke my lightweight DB engine – SQL 2012 LocalDB, which is effectively a 33MB install from sites like

(SQL Management Studio is 600MB – the LocalDB engine is 33MB – cool)

So there I was in VSNET trying to create a new SQL ‘localdb’ database.

The classic ‘instance name’ is (localdb)\v11.0 which is similar to (local)\SQLExpress when using SQL Express.

Upon plugging this into VSNET, my Windows Event Log was suddenly full of Red Stop signs and much chatter


After many attempts at installing Windows Updates and re-installing SQL LocalDB, the answer was quite simple really…

The Magic Juice – enter SqlLocalDB.exe

This guy is the admin tool of SQL LocalDB – so delving into the supported commands I could:

a) list all the known LocalDB instances

b) Delete an instance

c) Create an instance

So in short –

sqllocaldb delete "v11.0"

sqllocaldb create "v11.0"

too easy….. (2hrs I won't get back)



All working like a bought one now – beauty.

Monday, 03 June 2013 12:49:03 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Dev | .NET Framework 4.5 | Tips
# Monday, 29 April 2013

While looking into purchasing MSDN licenses for a client here’s what I found:

For the US:



Now when you change the drop down from US to Australia we get these prices (given that $AUD 1 = (approx) $USD 1)


So for e.g. take a MSDN – VS.NET Test.

Aussie Dollar = $3,460; US = $2,170, which equates to $AUD 1 = $USD 0.627
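For the curious, that implied rate is just the US sticker price divided by the Australian one – a quick sketch using the figures above:

```javascript
// Implied exchange rate from the two MSDN sticker prices quoted above.
const audPrice = 3460; // $AUD price for the same SKU
const usdPrice = 2170; // $USD price
const impliedRate = usdPrice / audPrice;
console.log(impliedRate.toFixed(3)); // → "0.627"
```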

This is what happens when you live in a 3rd world country…. Smile – absolutely outrageous.

Monday, 29 April 2013 10:27:38 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Dev | General
# Monday, 11 February 2013

After wrestling with this tonight for some time I’ve finally cracked it. SP2013 RTMed, and a lot of the sample code fails because you now need to add ‘..;odata=verbose’ onto pretty much every call to SharePoint.

Basically you get a series of errors such as:

MIME type could not be found that matches the content type of the response. None of the supported type(s) 'application/atom+xml;type=entry, application/atom+xml, application/json;odata=verbose


Previously a lot of the sample code has

$.getJSON(….) as part of the call to the server – as mentioned, we now need to add a custom header value of ‘odata=verbose’. To save you hours of slogging on this: the getJSON call doesn’t allow custom header values, so you need to use $.ajax(…) for these calls.


function getCustomers() {

  // begin work to call across network
  var requestUri = _spPageContextInfo.webAbsoluteUrl +
                "/_api/Web/Lists/getByTitle('CustomersREST')/items" +
                "?$select=Id,FirstName,Title,WorkPhone";
  var requestHeaders = {
      "accept": "application/json;odata=verbose"
  };
  // execute AJAX request
  $.ajax({
      url: requestUri,
      type: 'GET',
      dataType: 'json',
      headers: requestHeaders,
      success: onDataReturned,
      error: onError
  });
}


//Sample code to update a Customer List Item in a Customer List called 'CustomersREST'

function updateCustomer(dialogResult, returnValue) {

  if (dialogResult == SP.UI.DialogResult.OK) {
    var Id = returnValue.Id;
    var FirstName = returnValue.FirstName;
    var LastName = returnValue.LastName;
    var WorkPhone = returnValue.WorkPhone;
    var etag = returnValue.etag;

    var requestUri = _spPageContextInfo.webAbsoluteUrl +
              "/_api/Web/Lists/getByTitle('CustomersREST')/items(" + Id + ")";

    var customerData = {
      __metadata: { "type": "SP.Data.CustomersRESTListItem" },
      Title: LastName,
      FirstName: FirstName,
      WorkPhone: WorkPhone
    };

    var requestBody = JSON.stringify(customerData);

    var requestHeaders = {
        "accept": "application/json;odata=verbose",
        "X-RequestDigest": $("#__REQUESTDIGEST").val(),
        "X-HTTP-Method": "MERGE",
        "content-length": requestBody.length,
        "content-type": "application/json;odata=verbose",
        "If-Match": etag
    };

    // execute AJAX request
    $.ajax({
      url: requestUri,
      type: "POST",
      contentType: "application/json;odata=verbose",
      headers: requestHeaders,
      data: requestBody,
      success: onSuccess,
      error: onError
    });
  }
}

Monday, 11 February 2013 22:47:06 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | SharePoint | 2013 | Tips
# Thursday, 07 February 2013

While setting up a new SharePoint solution one of the requirements was to embed a Yammer Feed.

So I followed the help document only to have an unusual situation:

1) the feed wouldn't render in browse mode.

2) the feed WOULD render in Edit Mode.

The key to the situation was to declare a little more on the sample script Yammer gave – adding ‘text/javascript’

Here’s the working script – (one for the bat-utility belt)

*Tested on Win8, Firefox*

<script data-app-id='hyB2pTvrL36Y50py8EWj6A' type='text/javascript' src=''></script>
<div id='embedded-feed'></div>
<script type='text/javascript'>
  yam.connect.embedFeed(
    { container: '#embedded-feed'
    , network: 'yournetworkhere'
    });
</script>

Thursday, 07 February 2013 23:41:39 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | 2013 | Tips
# Thursday, 30 August 2012

I recently ran into an interesting one while building some InfoPath forms for SP2010/2013 forms services.

I wanted to return some Rich Text (XHTML) fields back from a WCF WebService call.

I was at the point as a developer, where I couldn’t even say ‘Works on my machine…’.

The problem was – no matter what I tried, I would always have *plain text* and no ‘richness’ of the Rich Text. Didn’t work for me.

So I have:

1) a basic WCF Web service – running on my dev environment.

2) an InfoPath Form that makes the call and displays the results.

The WCF Service:


This is the field that I eventually want to return as RichText to InfoPath.

Here’s the Service Method code (which basically goes into a file and returns back a list of clauses) – just focus on the CONTENT = …GetXHTMLRichText(…)



InfoPath and Returning a RichTextField
2 things need to happen for this to work.

1. When InfoPath adds the WCF Service to the form, it needs to ‘detect’ the field correctly when it builds the underlying schema.


You need (nb – ‘Content’ is my field name):
<xs:element minOccurs="0" name="Content" nillable="true">
                <xs:complexType mixed="true">
                        <xs:sequence>
                                <xs:any minOccurs="0" processContents="lax" maxOccurs="unbounded" namespace=""></xs:any>
                        </xs:sequence>
                </xs:complexType>
</xs:element>


Note the namespace on the ANY element above – this is the winner to tell InfoPath that this is a richtext field.

2. When returning data via this field (in my case the ‘Content’ field), it needs to be in a certain shape, as in:
<Content xmlns="http://yournamespace">
    <span xmlns="">Rich text here</span>
</Content>

Your rich text content needs to be ‘wrapped’ for InfoPath to play nicely with it.

This was the purpose of my GetXHTMLRichText method.
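The wrapping itself is trivial string work. A minimal sketch (in JavaScript for brevity – the real method is C# on the WCF side, and the function name here is hypothetical):

```javascript
// Hypothetical helper: wrap XHTML so InfoPath treats it as rich text.
// The inner element must carry an empty default namespace (xmlns="").
function wrapAsRichText(xhtml) {
  return '<span xmlns="">' + xhtml + '</span>';
}

console.log(wrapAsRichText('<strong>Rich text here</strong>'));
// → <span xmlns=""><strong>Rich text here</strong></span>
```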



The gotcha:

When I pointed InfoPath at my webservice and added a service reference I was getting back a SimpleType for the field and not a ComplexType/Rich Text field.

The WCF Service WSDL was ‘almost there’ but not close enough:

The Content field is described in a ComplexType, which is almost there, but not quite.

It’s missing the <xs:complexType mixed="true">…<xs:any namespace="" …/>. The rest were good.

The fix:

Cutting a long story short, the simplest way forward here was to simply edit the form components that InfoPath had built and correct the schema. Then reuse the form.

The form looks like this:



From the File->Publish->Export Source Files you can get to the source and edit the correct schema (XSD) file.

Close the form down in InfoPath (or you may even need to close InfoPath) to edit the Schema.


You may need to hunt through a few of them to find the right one. My file was GetKCCTerms12.xsd

Modify, save and close that file.

Right click on manifest.xsf –> Design to launch InfoPath and then select Save As to work with it as *.XSN form (*.xsn files are just CABs with all these files inside)


The final result as viewed from an InfoPath form – notice the bolding sent through.



Thursday, 30 August 2012 21:25:47 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Dev | SharePoint | 2010 | Tips
# Wednesday, 13 June 2012

Hi folks, you've probably heard a fair bit about the makeover of Azure into 'Azure 2.0' (the SDK is still 1.7)

There's some great new tools within VS.NET to manage your environment better, even a Service Bus 'explorer' which was much needed.

I've collected a few links to start with for you guys to read up on when you've got a moment:

Azure 2.0 Details on:

Wednesday, 13 June 2012 11:54:11 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Azure | Integration | ServiceBus | Events | Recordings | Tips | Training
# Friday, 08 June 2012

Hi guys, I gave an online presentation earlier this afternoon as part of Microsoft Readiness on Azure Virtual Networks.

I had the whole presentation prepared until the announcement, where I had to go back to the drawing board and just share all this goodness that was pouring out in Azure V2.0.

Thanks for the healthy turnout to those online; for those who registered, check your email for a link shortly.

As promised here’s the slide deck guys that I used through my demos

Connecting Cloud and On-Premises Applications Using Windows Azure Virtual Network - Breeze_Mick Badran


Friday, 08 June 2012 21:24:57 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Azure | Integration | ServiceBus | BizTalk | Insights | Events | Recordings | Readiness | Training

Now we’re talking….


Off to do some damage…umm play.

Friday, 08 June 2012 07:34:26 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Azure | Integration
# Thursday, 07 June 2012

I thought I’d give it a go and see how far this would take me. (One of the Win8 x64 beta installs caused a lot of grief)

So my environment:

  • Recent Win8 RC x64
  • Office 2010 x64
  • Outlook 2010 x64

Installed the CRM 2011 Client by going to the web address of our crm site e.g.

There’s a button on the page that says ‘download Dynamics CRM for Outlook’ – after a short download and install, all went well

I grabbed CRM2011 CU8 - and updated accordingly.

Note: at this point the CRM plugin had not been configured.

CRM Client Log files prove very helpful here:

When I fired up Outlook and went to configure the CRM plugin, testing the connection I would get back:

“…we can’t authenticate your credentials…”

Digging into the log files…..


22:05:18|  Error| Error connecting to URL: Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.IdentityModel, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
File name: 'Microsoft.IdentityModel, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35'
   at Microsoft.Xrm.Sdk.Client.ServiceConfiguration`1.CreateLocalChannelFactory()
   at Microsoft.Xrm.Sdk.Client.ServiceConfiguration`1.CreateChannelFactory(ClientCredentials clientCredentials)
   at Microsoft.Xrm.Sdk.Client.DiscoveryServiceConfiguration.CreateChannelFactory(ClientCredentials clientCredentials)
   at Microsoft.Xrm.Sdk.Client.ServiceProxy`1.get_ChannelFactory()
   at Microsoft.Xrm.Sdk.Client.ServiceProxy`1.CreateNewServiceChannel()
   at Microsoft.Xrm.Sdk.Client.ServiceContextInitializer`1.Initialize(ServiceProxy`1 proxy)
   at Microsoft.Xrm.Sdk.Client.DiscoveryServiceProxy.Execute(DiscoveryRequest request)
   at Microsoft.Crm.Application.Outlook.Config.DeploymentsInfo.DeploymentInfo.LoadOrganizations(AuthUIMode uiMode, Form parentWindow, Credential credentials)
   at Microsoft.Crm.Application.Outlook.Config.DeploymentsInfo.InternalLoadOrganizations(OrganizationDetailCollection orgs, AuthUIMode uiMode, Form parentWindow)

Solution: Install the Windows Identity Foundation 3.5 feature that comes with Win8 RC.


And in my case, you’re done.

Happy CRM-ing.

Now to fill in my timesheets Winking smile


Thursday, 07 June 2012 22:27:43 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | CRM
# Monday, 04 June 2012

I think at the moment, the short answer is no! Off to check the documentation…

“To continue with this installation or upgrade…please remove!…”



From the documentation – we uninstall the TFS 2010 bits (but obviously not the databases), and then…

Step 3 Use Control Panel to completely uninstall the previous version of Team Foundation Server. If SharePoint Products is running on a computer other than Team Foundation Server, you have to uninstall the TFS Extensions for SharePoint from the SharePoint server, too. If SharePoint Products is on the TFS application tier, don’t worry: We’ll automatically uninstall the TFS Extensions for SharePoint while we remove the old version of TFS.

Uninstall previous version

Step 4 Run the Team Foundation Server install from the product DVD and then use the Upgrade Configuration wizard to upgrade your installation. But wait—if SharePoint Products is running on a computer other than the computer running Team Foundation Server, you’ll first want to install the new TFS Extensions for SharePoint on the SharePoint server. Similar to the previous step, if SharePoint Products is on the TFS application tier, we’ll automatically install the Extensions for SharePoint while we set up the new version of TFS.

Select upgrade

Monday, 04 June 2012 14:57:24 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | General | TFS
# Tuesday, 29 May 2012
Finally we get some info on which IP ranges are used for the Data Centers – handy for those conversations with the network security folks, when they ask "What IP addresses are you hitting?" before opening up access for Azure Service Bus.

Here's the 'official' IP ranges (you just hope it doesn't change – one of ours worked for 3 days of the week, then on the 4th it stopped... that was an interesting one to solve)

Windows Azure DataCenter IP Ranges

This appeases my grief in a previous post

Tuesday, 29 May 2012 08:25:50 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Azure | ServiceBus
# Monday, 28 May 2012

Hi folks,

We’re looking for someone who loves technology and is currently surrounded by .NET technologies.
Contact us if you want to be part of a great team that sinks their teeth into many different projects, concepts and technologies.

The most important thing I look for is your ‘can do’ attitude, the rest we can essentially learn. Come and be part of a team that loves what they do, and do what they love.
(Makes it easier to get up on these colder mornings Winking smile)

Sydney based.

If you’re interested and want to start the ball rolling -
(I might even get you to leave your CV at home for the interviews Winking smile)

Talk to you soon,


p.s. No agencies please.

Monday, 28 May 2012 18:38:48 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Breeze | Jobs
# Tuesday, 22 May 2012
Hi folks, recently I've been asked by several students on how to create Parties, Agreements, Profiles etc. via code in BizTalk 2010.

I played with this a long time ago while at Redmond, as BizTalk 2010 was in the process of being released.

So I've just rolled up my sleeves and provided a quick demo for you - the demo shows:
  1. How to enumerate and get at each of your TPM Partners.
  2. How to create Partners + Profiles within BizTalk 2010.
Note: I've only tried this on BizTalk 2010 (& needless to say I'm claiming 'works on my machine' :))

What we're talking about in BizTalk
This section here....

Show me the code....
Well the magic is found in this DLL -
C:\Program Files (x86)\Microsoft BizTalk Server 2010\Developer Tools\Microsoft.BizTalk.B2B.PartnerManagement.dll

- create a VS.NET 2010 app (for this demo I created a console app)
- we make a reference to the above DLL (we also need to reference
- set a connection string to our BizTalk Management DB, mine is BizTalkDB (as I rolled all the BizTalk DBs into one - for dev)
- start enumerating.

C# Looks like this-

static void Main(string[] args)
{
    //enumerate all the TPM Profiles in BizTalk
    var builder =
        new SqlConnectionStringBuilder("DATA SOURCE=localhost;Initial Catalog=BizTalkDB;"
            + "Integrated Security=SSPI;MultipleActiveResultSets=True");
    var tpmCtx = TpmContext.Create(builder);

    Console.WriteLine("Connected to BizTalk Global Parties");
    var partners = tpmCtx.Partners;
    Console.WriteLine("Number of Partners:{0}", partners.Count());

    foreach (var ptr in partners)
    {
        var profiles = ptr.GetBusinessProfiles();
        Console.WriteLine("{0} Business Profiles:{1}", ptr.Name, profiles.Count);
        foreach (var profile in profiles)
            Console.WriteLine("\tProfile:{0}", profile.Name);
    }

    if (bCreateProfile)
        createProfile("Breeze Partner #");
}

Point to Note: in the connection string I set MultipleActiveResultSets=True ('MARS') just so we can enumerate several collections at once through the one context. When updating or saving new partners and/or profiles I get errors and can't save through a MARS-enabled connection. (Love to hear if you have different luck.)

Creating a Partner + Profile
// need to do this through a single threaded connection - no MARS
private static void createProfile(string partnerName)
{
    partnerName += DateTime.Now.ToString("yyyyMMdd-hhmmss") + (new Random().Next(0, 65535));
    Console.WriteLine("Writing a new Profile for {0}", partnerName);

    var builder = new SqlConnectionStringBuilder("DATA SOURCE=localhost;Initial Catalog=BizTalkDB;Integrated Security=SSPI");
    var tpmCtx = TpmContext.Create(builder);
    var ptr = tpmCtx.CreatePartner(partnerName);
    var pname = "Breeze Profile-#" + DateTime.Now.ToString("yyyyMMdd-hhmmss") + (new Random().Next(0, 65535));
    var bp = ptr.CreateBusinessProfile(pname);
    bp.Description = "Created from Code";
    var pcol = new AS2ProtocolSettings("BreezeProtocolSettings");
}
And that's pretty much all there is to it folks, have a play around with the APIs for yourself - all undocumented of course.

Here's the Console App Solution I use (built purely for demo purposes)


Enjoy Mick!
Tuesday, 22 May 2012 14:45:23 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | 2010 | 2010 R2 | Insights | Tips
# Sunday, 01 April 2012

Local MS Developer pillar Andrew Coates spilled the beans on this next new language to come out of MS Research.

Db.NET or ‘D flat’ – F#, C# and the Cinderella of the 3 sisters ‘VB.NET’
(Last year I was introduced to F# over a 5 month project and absolutely loved the simplicity and freshness of it – async was simple, tasks, functions and code that would normally take 400 lines in C#, we were able to do in 100 in F#)

It promises:

- speed

- optimisation (I wonder if it’ll be smart enough to run tasks on different CPU cores?)


There is a focus on Orchestration – data Orchestration found here

Where it talks about “An example of the close collaboration between the product team and the company’s research arm is the use of Schenkerian Analysis in the compiler to maximize orchestration between sections of the code.”

Oooh, I thought – let’s check out what this Schenkerian Analysis is, and a quick check of Wikipedia reveals:

“Schenkerian analysis is a method of musical analysis of tonal music based on the theories of Heinrich Schenker. The goal of a Schenkerian analysis is to interpret the underlying structure of a tonal work. The theory's basic tenets can be viewed as a way of defining tonality in music. A Schenkerian analysis of a passage of music shows hierarchical relationships among its pitches, and draws conclusions about the structure of the passage from this hierarchy. The analysis is demonstrated through reductions of the music, using a specialized symbolic form of musical notation that Schenker devised to demonstrate various prolongational techniques. The concept of tonal prolongation, in which certain pitches determine the goal of other, subordinate pitches, is a cornerstone of the pitch hierarchy that Schenkerian analysis involves itself with.”

So tones, pitches and music are where this algorithm has its roots… I can see how you could take this analysis, applied to the frequency of music, and apply it to the frequency of code items; data being hit etc.

I’ll crack open this VS.2011 extension and see what transpires…

Grab the TOOLS here -

Sunday, 01 April 2012 20:10:48 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Async
# Saturday, 10 March 2012

Easy but effective

<script type='text/javascript'>
var msg = "your big title goes here…";
msg = " ..... " + msg;
var pos = 0;
function scrollTitle() {
  document.title = msg.substring(pos, msg.length) + msg.substring(0, pos);
  pos++;
  if (pos > msg.length) pos = 0;
  setTimeout(scrollTitle, 200); // re-run to keep the title scrolling
}
scrollTitle();
</script>
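The scrolling effect is just a left rotation of the string at pos; pulled out as a standalone helper (the function name is mine, not from the original):

```javascript
// Rotate a string left by `pos` characters – the core of the title scroller.
function rotate(msg, pos) {
  return msg.substring(pos, msg.length) + msg.substring(0, pos);
}

console.log(rotate("abcdef", 2)); // → "cdefab"
```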

Saturday, 10 March 2012 15:12:07 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | Dev | Tips
# Thursday, 02 February 2012

With the ever changing Azure space, chances are you’ve had services working a treat and then one day just fail.

“Can’t connect…" etc.

This has happened to me twice this week – with over 14 IP Address ranges defined in the client’s firewall rules.

It appears that my service bus services were spun up or assigned another IP outside the ‘allowed range’.

It gets frustrating at times as generally the process goes as follows:

1) fill out a form to request firewall changes. Include as much detail as possible.

2) hand to the client and they delegate to their security/ops team to implement.

3) confirmation comes back.

4) start up ServiceBus service

5) it could work… or it may fail – due to *another* IP address allocated in Windows Azure that is not on the ‘allowed list of ranges’.

6) fill out another form asking for another IP Address…

By the 3rd iteration of this process it’s all beginning to look very unprofessional. (In comparison, these guys are used to tasks such as ‘Access to SQL Server XXX – here’s the ports, there’s the machine, done’. Azure on the other hand – ‘What IP addresses do you need? What ports?’… we need better information in this area.)

Anyway – here’s the most up-to-date list as at 10/02/2011.

Thursday, 02 February 2012 13:15:07 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | Azure | Integration | ServiceBus
# Thursday, 26 January 2012

Recently there’s been an update to the ‘on-premise’ AppFabric for Windows Server.

Grab the update here - (runs on win7, 2008, 2008R2)

What’s new

I’m in the process of updating my components, but the majority of updates seems to be around caching and performance.



Read-Through / Write-Behind

This allows a backend provider to be used on the cache servers to assist with retrieving and storing data to a backend, such as a database. Read-through enables the cache to "read-through" to a backend in the context of a Get request. Write-behind enables updates to cached data to be saved asynchronously to the backend. For more information, see Creating a Read-Through / Write-Behind Provider (AppFabric 1.1 Caching).

Graceful Shutdown

This is useful for moving data from a single cache host to the rest of the servers in the cache cluster before shutting down the cache host for maintenance. This helps to prevent unexpected loss of cached data in a running cache cluster. This can be accomplished with the Graceful parameter of the Stop-CacheHost Windows PowerShell command.

Domain Accounts

In addition to running the AppFabric Caching Service with the NETWORK SERVICE account, you can now run the service as a domain account. For more information, see Change the Caching Service Account (AppFabric 1.1 Caching).

New ASP.NET Session State and Output Caching Provider

New ASP.NET session state and output caching providers are available. The new session state provider has support for the lazy-loading of individual session state items using AppFabric Caching as a backing store. This makes sites that have a mix of small and large session state data more efficient, because pages that don't need large session state items won't incur the cost of sending this data over the network. For more information, see Using the ASP.NET 4 Caching Providers for AppFabric 1.1.


Compression

You can now enable compression for cache clients. For more information, see Application Configuration Settings (AppFabric 1.1 Caching).

Multiple Cache Client Application Configuration Sections

A new dataCacheClients section is available that allows you to specify multiple named dataCacheClient sections in an application configuration file. You can then programmatically specify which group of cache client settings to use at runtime. For more information, see Application Configuration Settings (AppFabric 1.1 Caching).

Thursday, 26 January 2012 10:14:06 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | AppFabricServer | Azure | Integration | 2010 | 2010 R2 | Dev
# Monday, 23 January 2012

Hi folks, welcome to Monday…so I thought.

Here I was registering a message inspector which should take 5 mins tops.

Find the right config, make sure the .NET full assembly name is cool and away we go.

I wanted to use this guy from my custom WCF Adapter within BizTalk – so I needed my new message inspector to be seen by BizTalk.

So I used:

<add name="wcfMsgPropPromoter" type="Breeze.WCF.Extensions.BreezeMessagePromoteBehaviour,Breeze.WCF.Extensions,Version=,Culture=neutral,PublicKeyToken=c2c8c7e827e9dd6a"/>

and added this guy to the <behaviorExtensions> element in the Machine.Config for .NET 4.0 x64/.NET 4.0 (& .NET 2.0 for good measure)

As if in a scene from SpongeBob… “3 hours later….”

I had triple-checked GACs, caches, full assembly names etc… Scotty popped his head around and said “Oh yeah, I had this one ages ago – you need to use this…”

<add name="wcfMsgPropPromoter" type="Breeze.WCF.Extensions.BreezeMessagePromoteBehaviour, Breeze.WCF.Extensions, Version=, Culture=neutral, PublicKeyToken=c2c8c7e827e9dd6a"/>

Can you spot the difference?
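If you still can’t spot it: the only difference is the spaces after the commas in the assembly-qualified type name. A quick check (with the elided Version segment dropped, since it’s blank above):

```javascript
// The two type attribute values from the config snippets above,
// minus the elided Version segment, so the only difference left
// is the whitespace after each comma.
const noSpaces = "Breeze.WCF.Extensions.BreezeMessagePromoteBehaviour,Breeze.WCF.Extensions,Culture=neutral,PublicKeyToken=c2c8c7e827e9dd6a";
const spaces   = "Breeze.WCF.Extensions.BreezeMessagePromoteBehaviour, Breeze.WCF.Extensions, Culture=neutral, PublicKeyToken=c2c8c7e827e9dd6a";

console.log(noSpaces === spaces);                     // → false
console.log(spaces.replace(/, /g, ",") === noSpaces); // → true
```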


Interestingly enough – this work is part of a .NET plugin I wrote for IIS 7.5 and to register the plugin you use “Breeze.WCF.Extensions.BreezeMessagePromoteBehaviour,Breeze.WCF.Extensions,Version=,Culture=neutral,PublicKeyToken=c2c8c7e827e9dd6a"


My head hurts for a Monday…

Hopefully you reclaim the hours I’ve lost here.


Monday, 23 January 2012 16:52:20 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | 2010 | 2010 R2 | Dev | .NET Framework 4.5
# Monday, 12 December 2011

Hi folks, as you may or may not be aware, these are the core cornerstone technologies of the MS Integration Stack.

The teams have been busily plugging away and coming up with the new versions – 4.5 corresponding to .NET 4.5 framework.

Here’s some links that describe what’s new from MS Santa & his elves:

  1. What's New in Windows Communication Foundation 4.5
    1. New Items I found of note are:
      • New Service Transport Default values – keep an eye on these.
      • Improvements from VS.NET 2011 – validation, better IntelliSense support.
      • Streaming improved – true async (yay!)
      • WebSocket support – through NetHttp(s)Binding
      • Single WSDL file generation with ‘?singleWSDL’ (which is pretty handy)
      • Self-hosted + IIS-hosted allow you to get to the ServiceHost from code for dynamic configuration.
      • Binary Encoder supports compression!! – this is generally gzip compression.
      • My personal favourite – UDP support
  2. What's New in Windows Workflow Foundation in .NET 4.5
    1. New Items of note are:
      • New Activities – NoPersistScope (possible previously, but we needed to write code)
      • WF Designer improvements – several here, but the ‘Outline view’ looks to be easier to work with.
      • C# Expressions – where’s the F# ones Sad smile ??
      • Designer Annotations – add your own comments to keep control of the jungle that is built.
      • WF Versioning – use WorkflowIdentity & DefinitionIdentity to define the version. WorkflowServiceHost supports multiple versions of the same WF. All pretty cool.
      • WF Designers can still be rehosted – I’ve used that many a place.
      • Contract First Development – ticks the boxes.
    2. WF Rules – still didn’t make the cut. There is a sample for WF4 using a custom Activity (‘Policy4’ it’s called) that calls back to the WF 3.5 rules engine. It uses ‘interop’ back to WF3.5 and is found here -
      1. Will have to check out perf in this new land on these rules.
  3. Async CTP – while this didn’t make the ‘what’s new’ list, it certainly deserves a mention here.
    Over the last year I’ve built some pretty serious F# projects, and F# has async support through and through the language. After overcoming the challenge of learning it, the async functionality is absolutely brilliant!!! F# does a great job of being able to turn a non-async chunk of code/method/class into an async one by using the keyword async and a !. It’s straightforward from that aspect.

    It’s great to see the C# & VB.NETs being able to use the same fundamentals (albeit not as slick IMO Winking smile). – see a previous POST -

    As developers we sit here and say – what do I need this for? My code runs fine as it is… and yes, for the most part, what we do on our machine does. This technology really comes into its own when you want consistent throughput from a solution with 1 person or 10,000 concurrent people using it. That’s the difference.

    To use it:
    1. Get VS.NET 2011 (as it requires a new compiler), or
    2. Use the Async CTP (Refresh 3) with VS.NET 2010 SP1
  4. Check it out from here -
Monday, 12 December 2011 12:00:52 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | Async | BizTalk | Dev | .NET Framework 4.5
# Wednesday, 05 October 2011

Hi folks, from a previous set of posts, we’ve been running a series of Azure Training Sessions.

Here’s the online links to the recordings that many of you have asked me about. Enjoy.

The links below should take you to the landing page, from the click on the View Online button.




Wednesday, 05 October 2011 11:09:03 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Azure | BizTalk | Events | Recordings | Readiness
# Tuesday, 26 April 2011

Hi folks, I thought I’d share something that captivated me on this rainy Easter day and that was

Visual Studio Asynchronous Programming -
(you’ll need VS2010 + SP1 before you grab the CTP)
There’s a new improved compiler + an extended library for us.

Hands up who’s done async programming in either VB.NET or C#??? It’s a pain! Thread management, Main UI threads can only update certain objects, passing values between main + background threads, determining whether a thread has completed its tasks… and so on…

Basically all these ‘issues’ keep us from delving further into the world of asynchronous programming, because it very rapidly becomes complex just managing the two worlds – sync + async.

Today I was pleasantly surprised!!!

About a year ago I saw a great presentation on F# and I was amazed at how, if they wanted to run a bit of code async, it was simply an extra character, as in:

set results = …..   <-sync

set results! = ….  <- run this async

(don’t quote me on the above, but it’s something like that – let’s call it pseudo code)

Why are we interested in this? – that’s always the first question to ask when investigating. Too many times we hear ‘this is really cool’ and ‘check this cool software out’ etc… but the real reason of WHY we want to go down this road is never answered.

On a ‘developers machine’ looking at 5 items, running a single test client – you’d have to say “works on my machine” and you’d have no need to async anything. True. Let’s move beyond our beloved developer box and think about UAT/PROD environments and what your code is doing.

What happens if 4 concurrent requests come along – how is your code going to perform? (As developers we’d be thinking …’it’s in the hands of IIS, not my issue’ :) )
(I was recently presented with a solution that ran across 20-odd servers; the answer to everything was to get more hardware to make the app more performant, scalable etc – it couldn’t be the code.)

So as the requests start to build (don’t know an exact number, but let’s say 100/sec), what is happening to your code? How often do we sit down with profiling tools on our code in this space? Must be the disks… slow… and as always we have definitive proof – “works on my machine”, says the developer!

It’s not until we see our code running under load that we get an appreciation for where things could be improved and are causing grief for not only IIS but other systems as well.

Performance and scalability – single-threaded app/service vs multi-threaded. Multi-threaded tends to win just about every time.

Let me give you a couple of suggestions where this stuff is great:

  1. As part of a WF/WCF/Class where you want to ‘push’ some processing into the background – critical things can be done upfront, and you can push some of the ‘other stuff’ into the background.
  2. Take advantage of some of the great multi-core/multi-CPU servers out there – single-threaded code tends to run on the same core of the same CPU (known as thread affinity)

Anyway enough jabbering from me and let’s see some of the hidden gems…

Async Programming Framework

Let me show you a couple of examples (from my set):

1. Fetching a webpage


Here I go off to twitter and search for all the BizTalk items.

Couple of things to notice
- …Async is added to the end of routines for convention, indicating that these are Async callable routines.
- not a single IAsyncResult to be seen, no StateObject and no Callback routines!
– line 104 the async keyword indicating that this routine itself can be called async if desired (more for the compiler)
- line 108 the await keyword is used in the Async framework to ‘wait for the async task to complete’  then move onto the next line.
- line 108 WebRequest.Create(…).GetResponseAsync – it’s GetResponseAsync that is the async method, no Begin…/End… calls! Just write it as you read it.
- line 109 We get a reference to the response stream (I should check for the existence of data etc – demo code, demo code :))
- line 112 …await stm.ReadAsync(…) – reads the response stream into a buffer on a background thread and we wait there until this completes (await keyword). By all means there’s many other ways to program this – we don’t need to wait, we could run this guy in the background quite happily and then check on him periodically.
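The code screenshots haven’t survived here, so the following is a rough reconstruction of the routine described above – a sketch only, assuming the Async CTP extension methods; the method name and search URL are my guesses:

```csharp
// Hypothetical reconstruction - names and URL are mine, not the original sample.
public async Task<string> SearchTwitterAsync()
{
    // 'async' marks the routine as awaitable (line 104 in the screenshot)
    var request = WebRequest.Create("http://search.twitter.com/search.atom?q=BizTalk");

    // GetResponseAsync replaces the old BeginGetResponse/EndGetResponse pair
    var response = await request.GetResponseAsync();

    // grab the response stream (demo code - no null/error checking)
    using (var stm = response.GetResponseStream())
    {
        var buffer = new byte[8192];
        // read on a background thread and wait (await) for it to complete
        int read = await stm.ReadAsync(buffer, 0, buffer.Length);
        return Encoding.UTF8.GetString(buffer, 0, read);
    }
}
```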

That’s it! Not too tough at all, multi-threaded goodness right there. You can have blocking and non-blocking calls etc.

2. What about a Chunk of CPU based code

NO Async Example – as per normal, doing some cpu things.


Written in Async….


Points to notice:
- line 63 async Task&lt;int[]&gt; … to the Async framework, async methods are wrapped within a Task. We must ‘wrap’ anything we return from our routines within a Task&lt;…&gt; – here I’m returning an int[]
- line 66 … = TaskEx.Run(…something to run in a background thread…). As we’re dealing with a block of code, there’s a Task extension class that allows us to run that bit of code async.
- line 79 await matrix – this line ensures that our async routine has indeed completed (or errored) before we move onto the next line.
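Again the screenshot is missing, so here’s a sketch of the pattern just described – names are hypothetical, but it uses the CTP’s TaskEx as per the points above:

```csharp
// Hypothetical sketch of the lines 63/66/79 pattern, using the CTP's TaskEx.
public async Task<int[]> CrunchMatrixAsync(int size)
{
    // push the CPU-bound block onto a background thread (line 66 style)
    Task<int[]> matrix = TaskEx.Run(() =>
    {
        var result = new int[size];
        for (int i = 0; i < size; i++)
            result[i] = i * i;      // stand-in for the real number crunching
        return result;
    });

    // ...free to do other useful work here while the calculation runs...

    // wait for the background work to complete or fault (line 79 style)
    return await matrix;
}
```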

Too easy if you’ve lived in the other world.

As always, remember this is a CTP so I wouldn’t go rolling it out into Prod just yet. The perf numbers I get are pretty much identical to rolling all of this by hand with ThreadPool.QueueUserWorkItem(…) and IAsyncResult etc.

Well done MS!

Enjoy and here’s my VS.NET Sample Solutions – I had great fun! Oh – this is also applicable to Silverlight + WP7 apps :)

Tuesday, 26 April 2011 23:33:52 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Async | Silverlight | TechTalk | Tips
# Monday, 20 September 2010

The scene looks like this – while teaching a SharePoint 2010 class I decided to build a Feature that used one of the OOTB Service Applications of SharePoint 2010.


I decided to create a PDF Converter ‘Feature’ that used the Word Automation Services hosted in SharePoint 2010. Looking into the Word Automation Services, I’d say that if you’ve already got a PDF creation process going, then stick with it as it appears this service is pretty simple. However if you’ve got nothing, then Word Automation Services will be great!
(Having spent a previous life in a graphics company, there are many options that go with creating just that perfect PDF…I could find none of these here :()

(yes the pdf icon needs work…)

So I created a VS.NET 2010 Solution with a Feature.

The PDFConverter.cs is where the crux of the work is – the rest of the solution is working out just the right spot to call it.

Couple of Interesting Points about the Solution

1. ScriptLink – using this from a CustomAction within an Elements file allows you to inject script into a site where the Feature is activated. There is also ScriptBody that allows you to inject the script code right there.

2. RegistrationType – being declared as a FileType, currently this will work with docx. Feel free to experiment and extend out.
Also, seeing this action is activated on a list item, we need to track what list it came from {ListId} and the item in that list {ItemId} which is an integer.

<Elements xmlns="">
  <CustomAction Id="MicksPDFScript"
                ScriptSrc="~site/_layouts/WordAutomationServices/Scripts/MoveItMoveIt.js"
                Location="ScriptLink">
  </CustomAction>

  <CustomAction Id="MicksPDFConverter"
                RegistrationType="FileType"
                RegistrationId="docx"
                Location="EditControlBlock"
                Sequence="106"
                Title="Convert to PDF"
                ImageUrl="~site/_layouts/WordAutomationServices/Images/pdf.gif">
    <UrlAction Url="javascript:MicksOpenDialog('{ListId}','{ItemId}');"/>
  </CustomAction>
</Elements>

3. Code that actually does the work – is pretty simple really with Folders and entire Document Libraries able to be passed to the Conversion Job.
One annoying thing: when conversionJob.Start() is called at line 15 below, really a job just gets created and added to the timer job queue. Regardless of what goes on, the Started property always returns true.

Typically I’ve found the Timer Job to kick in every 5 mins to process the conversions and eventually a PDF file is seen in the library.

 1 public bool Convert(string srcFile, string dstFile)
 2 {
 3
 4     // create references to the Word Services
 5     var wdProxy = (WordServiceApplicationProxy)SPServiceContext.Current.GetDefaultProxy(typeof(WordServiceApplicationProxy));
 6     var conversionJob = new ConversionJob(wdProxy);
 7
 8     conversionJob.UserToken = SPContext.Current.Web.CurrentUser.UserToken;
 9     conversionJob.Name = "Micks PDF Conversion Job " + DateTime.Now.ToString("hhmmss");
10     conversionJob.Settings.OutputFormat = SaveFormat.PDF;
11     conversionJob.Settings.OutputSaveBehavior = SaveBehavior.AlwaysOverwrite;
12
13     conversionJob.AddFile(srcFile, dstFile);
14
15     conversionJob.Start();
16     return (conversionJob.Started);
17
18 }

A couple of screen shots in action:




Of course this is not production ready, but it should give you a great start in getting there. To start, simply install and Activate the Feature on a site collection to see the functionality.

Go to a document library and activate the item drop down to see the Convert to PDF option. Must be a DOCX file currently.

Grab the VSNET2010 Solution and go for it – have fun.

Monday, 20 September 2010 21:06:48 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [2] -
.NET Developer | SharePoint | 2010
# Wednesday, 28 July 2010

Olaf and I were cracking away on some SharePoint 2010 work which we thought should be simple… point SPMetal at the site and start LINQ-ing to our hearts’ content…

with the one exception that we couldn’t select items from a list based on their Content Type.

By default SPMetal.exe doesn’t include these ‘system’ fields (apart from ID + Title – go figure) and the secret is to use an Override file.

The good oil is:
(Here’s a good article on how .NET Types are mapped to SharePoint -

The simple override/parameters file:

<Web AccessModifier="Internal" xmlns="">
  <ContentType Name="Item" Class="Item">
    <Column Name="ContentType" Member="ContentType" />
  </ContentType>
</Web>

The SPMetal Command Line
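The command-line screenshot didn’t survive, but a typical invocation looks something like this – file names are my guesses; /parameters points at the override file above:

```shell
spmetal /web:http://breezelocal /code:Breeze.cs /parameters:Overrides.xml
```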


The VS.NET Code

static void Main(string[] args)
{
    using (BreezeDataContext dc = new BreezeDataContext("http://breezelocal"))
    {
        var myitems = from i in dc.GetList<ContentListTraining>("My Content List")
                      where i.ContentType == "Training"
                      select i;
        var courses = myitems.ToList<ContentListTraining>();

        Console.WriteLine("There are {0} items", courses.Count);
    }
}

Wednesday, 28 July 2010 18:10:19 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | SharePoint | 2010 | Tips
# Wednesday, 10 February 2010

I got an email from David Marsh telling me about this new world from MS. Let me share a little…
Way back when…LOGO was one of the first languages I learnt as a kid.

Moving a turtle around on a page with commands such as PenUp, PenDown, RightTurn etc etc – pretty cool as a kid and then you could draw things (there was a big version of the Turtle that interfaced into an Apple II via a ribbon cable as wide as a 4 lane highway)

MS Dev Labs have released a great Small Basic environment that is very simple to pick up (great for kids).
It’s got a very simple set of commands AND it outputs straight to Silverlight.

Pretty quick ways of building silverlight apps….nice!
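For a taste, here’s a tiny LOGO-style snippet (my own example – the Turtle object really does ship with Small Basic) that drives the Turtle around a square:

```basic
' Small Basic: draw a square, old-school LOGO style
Turtle.Speed = 8
For i = 1 To 4
  Turtle.Move(100)
  Turtle.Turn(90)
EndFor
```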

Check it out – only if you have some free time ;)

Wednesday, 10 February 2010 14:39:38 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | General | Silverlight | Tips
# Friday, 15 January 2010

Many times in BizTalk land we work with Schemas that are nested and have several related Schemas that are Imported from URL locations etc.

When you include these schemas and deploy to Production, you find out that the BizTalk server doesn’t access the Internet directly. Hence all the schema Imports fail.

You’ll then go and try to hand-edit the imports, downloading the referenced schemas, and mash up something that refers to local files and no URL-based schemas. It may or may not work… till the next update…

I recently came across a handy set of free tools that take all the pain out of dealing with schemas –

Xml Help Line

Which has Xml Schema Lightener, Xml Schema Flattener

Another very handy tool not to leave home without.


Friday, 15 January 2010 09:43:18 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [2] -
.NET Developer | BizTalk | Insights | Tips
# Saturday, 14 November 2009

Recently at the SharePoint Conference (SPC2010) delegates were given a beautiful book with all sorts of developer bits.

The book sports 123 pages of great information, with improvements to many areas that previously gave us pain (lists, queries, and just CAML in general)

There’s also 6 walkthroughs (sort of like HOLs) with code etc. to give you a feel for customising SharePoint.

Grab the PDF version HERE













Some snippets which I found interesting from the book are:

  1. Some great object model options now for integrating with SharePoint.

    Points to note here:
    - Client OM + Rest are exposed as WCF Services (based on Client.Svc) and the Client OM is a batched model, so you transmit only what you ask for within Object Collection Hierarchies (unlike SPSite.AllWebs etc etc)
    - LINQ to SharePoint is initially created with SPMetal to create all the LINQ classes (there’s no ‘designer’ support for this yet, like LINQ for SQL – at least in this beta)
    - External Lists are an interesting one, you can develop plugins to expose two-way data syncs within SharePoint. I’m looking to reach out to SAP + Siebel systems when I explore this option :)
  2. Resource Throttling is turned on by default – previously developers could write code like SPList.Items… On a developer’s machine with 5 items in a list this was not an issue; 8,000 items in a list turns into a different story.

    SharePoint 2010 now has safe guards against this turned on by default.
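The batched Client OM model mentioned in point 1 can be sketched roughly like this – a minimal example against the 2010 beta-era API; the server URL is made up:

```csharp
// nothing goes over the wire until ExecuteQuery() - requests are batched
using (ClientContext ctx = new ClientContext("http://sp2010/sites/demo"))
{
    Web web = ctx.Web;
    ctx.Load(web, w => w.Title);   // ask only for the properties you need
    ctx.ExecuteQuery();            // one round trip to Client.svc
    Console.WriteLine(web.Title);
}
```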

Enjoy…I’m off to enjoy the sun.

Saturday, 14 November 2009 16:53:07 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | Office | Deployment | 2010 | Silverlight
# Friday, 04 September 2009

Great to see some public information surfacing around SharePoint 2010 and development.

To get started (if you’re not already) here’s the SDK with a CHM file and PDF/XPS on “how to customise the ribbon”.

Doing a little digging in the CHM file, you can see (below) all the different Content categories with some special areas to note:

  1. There appears to be a Visio Server – I guess like InfoPath + Excel Services as they currently stand in 2007.
  2. AJAX + JSON seem to make an appearance at the foundational core – yay! less page reloads.
  3. WCF Services used (*.SVC) as expected and simplified. Also it appears that BDC systems are accessible via a SharePoint custom WCF Binding, making it possible to work on BDC based data from various applications within SharePoint. SharePoint might become the hub ‘repository’ for this sort of information.

Bear in mind *a lot* of this information is subject to change.

Certainly going forward it should be very exciting to see what actually ships and whether some of the immediate constraints are dealt with.

Looks like we’re up for another Ribbon experience in this Version of SharePoint from within the Browser.


Friday, 04 September 2009 00:54:21 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | MOSS | Office | Deployment | 2010
# Saturday, 11 April 2009

Well folks, hot on the heels of the MMA Contenders, here's another series which I'm sure will cause a stir.

Check out part#1 and part#2 below

Saturday, 11 April 2009 07:30:12 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Azure
# Friday, 10 April 2009

Hi guys,

If you're looking to get into how to host WCF Services on Azure, showing some cool graphics, then these samples are for you.

Silverlight v3.0 (beta), and important samples showing how to take your existing WCF Services and host them in Azure (there's a few gotchas - and these samples have workarounds :) )

Grab them here -

Friday, 10 April 2009 15:29:20 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Azure | Silverlight
# Sunday, 05 April 2009

Yep - hot off the press thanks to Stephen a friend of mine (who brought this to my attention)

Don't leave home without it.

Sunday, 05 April 2009 18:55:07 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | MOSS | Admin
# Thursday, 12 March 2009

You want to know the ins and outs of WCF at a glance – then the mini-book is a winner.
(Just let this puppy fall out of your back pocket in the office and watch the guys instantly want to Sync up their Complete Series of Star Trek with you…)

Seriously – great guide, easy to flick through and welcome to another 8 million lines of code you thought you could live without :)

Thanks to the efforts of Cliff Simpkins and his team for their dedication on this.

6 Chapters + Code….are you man enough?

WCF Channel Stack

Thursday, 12 March 2009 21:25:56 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | BizTalk | Tips | WinWF
# Monday, 05 January 2009

Can’t do

“The product cannot be installed on this machine since it seems to be a domain controller”

What a start to 2009!!! – the above dreaded message when trying to (in this case) install BizTalk RFID on a DC.
For me – this happens quite a bit, as I’m building up a proof of concept, a demo, something to show and present with.

I always…always….forget to install BizTalk RFID bits before I promote to a DC (this technique can also cause security acct issues after the machine has been promoted to a DC – depends on how the authentication is setup etc)

NOTE: BTW – Installing BizTalk RFID on a DC is NOT SUPPORTED (had to put that one in there – keeps both sides happy)

For love or money I’ve bounced this question around for a while and come up empty, until… today!!! Niklas Engfelt, a senior MS support engineer, came to my rescue (he famously provided those thoughts from left field which were on the money! Big thank you Niklas)

He suggested grabbing Orca from the Platform SDK and having a browse through – I’d used HEX editors, disassembled files, attached process monitors during installs and looked through any config file with a fine tooth comb…but I’d never tried a MSI Editor.

The steps to Enlightenment: (changing the installer validation conditions)

  1. Grab a download of Orca from here (I didn’t have the Platform SDK currently installed and wasn’t about to install 1.2 GB worth either) and follow the default install prompts.
  2. If you haven’t done so already, copy the RFID_x86 or RFID_x64 folders off the install media to a temp folder nearby (note: sometimes on Win2K8, the system prevents copied files from being accessed until an admin comes along and says ‘these are ok’ by going into File->Properties on each file. It’s weird I know, but I get it every now and then)
  3. Locate the RFIDServices.msi under the RFID folder and you’re ready to go.
  4. Launch Orca and open RFIDServices.msi.
  5. Under the Tables column select LaunchCondition and locate the 2nd row (the domain controller check).
  6. Drop the row and save the MSI file again.
  7. Run Setup.exe as per normal.

Oh what a sweet day!

p.s. I’m sure you’d be able to employ this technique onto other MSI’s causing grief.


Monday, 05 January 2009 14:18:43 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [2] -
.NET Developer | BizTalk | RFID | General | Tips
# Tuesday, 23 December 2008

With all the developer extensions in recent times around SharePoint (Features, Solutions etc), I've found there seem to be a few little-known and little-used 'other' APIs within the SharePoint space.

We've got things like web services and the SharePoint object model (SPSite etc) that we use; however, there are a couple of other APIs that could also be useful for the times when you're not running locally on the SharePoint machine - they generally centre around HTTP and extending it.

Two (that come immediately to my mind) are:

1. WebDav - early versions of 'Web Folders' used this.

2. RPC (over HTTP) APIs - Front Page and SharePoint Designer still use these.
(InfoPath when submitting forms uses this to promote properties to a forms library)

A great example of this is SharePad for SharePoint on CodePlex

Merry Christmas,


Tuesday, 23 December 2008 15:34:03 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | MOSS | Office
# Wednesday, 12 November 2008

Ever wondered what implementing "Hello World" looks like in the roots of Oslo's domain-specific language tooling… (I found a snippet from one of the PDC webcasts…)


(This is written in a tool/shell that ships as part of the Oslo SDK - Intellipad)
The left hand side is the instance document; the middle is the grammar or transformation; and the right is the Output Graph.

This is a pretty specific sample – in fact it's very specific and only takes one input – "Hello World" (as dictated on the syntax line)

What's so special about all of this?? I hear you ask.....

There's a huge amount of power in being able to 'model' your world/data and relationships. Today we're pretty comfortable with XML, but we also have to tolerate things like parsing '<' or attributes etc. Or if you've ever been given a schema full of 100s of fields when you needed just 5. XML is not perfect, but it certainly has its place.

Storing this sort of XML in the DB is painful at best I think; while SQL 2005/2008 goes part way towards helping us, there's still a bunch of specifics the DB needs to know about the XML, and if that schema changes, then that change goes all the way to the DB… otherwise the alternative is tables/rows/columns + investing in stored procs to manipulate the data.

Enter the Modeling Language -M

We can basically define our world - if you're dealing with a customer with 5 attributes, that's all you specify. You could take your V1.0 representation of a Customer and extend it etc etc.

The model is deployed straight to the database (known as a Repository) - the deployment step actually creates one or more tables and corresponding views. Access is never granted to the raw table, only to the view. That way when you extend or upgrade your models, existing clients see just their original 'view', keeping the new attributes separate.

So in terms of building a model of the information your systems are utilising -> 'M' is a very rich language, which decorates and defines a whole bunch of metadata around your needed entities.

Digging a little deeper into M.... we have 3 main defining components:

1. M-Schema - defines the entities + metadata (e.g. unique values/ids etc)

2. M-Grammar - defines the 'Transformation'. How to transform the source into the output. You could loosely look at this as:  Y=fn(M) "Y equals some Function of M"

3. M-Graph - a directed graph that defines the output (they use a directed graph to indicate, through lexical parses, that something has a start and definitely finishes. This is a check the compiler will do)
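To make the M-Schema side a little more concrete, here's a toy module – my own made-up example, with syntax roughly as per the PDC08 CTP, so don't hold me to it:

```text
module Surf
{
    // entities + metadata (M-Schema)
    type LifeSaver
    {
        Id : Integer32 = AutoNumber();
        Name : Text;
    } where identity Id;

    // an extent - deploys as a table + view in the Repository
    LifeSavers : LifeSaver*;
}
```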


You'll notice at the top of this shot there are DSLs - Domain Specific Languages: e.g. a language full of terms and expressions to define, say, how to work with your local surf club's competitions; another could be how to manage and describe Santa's wish list.

You might be thinking… I go pretty well in C#, why should I look into M?? C# is obviously a highly functional language: when you start coding you've got all the language constructs and notation under the sun at your disposal - decorations are done through attributes on methods/classes etc… but modeling something is still done in a pretty bland way, e.g. structs, classes, datasets, typed datasets etc. You're starting with a wide-open language that, without you creating a bunch of classes/code, doesn't have methods like Club.StartCarnival…

M - take what you're describing, a carnival, and model it. What entities are in a carnival (people, lifesavers, boats etc)? Model this - give us a picture of what they look like (the data you'd like to hold and the relationships), define a grammar (words, constructs and operations) on how we can work with these entities - and we now have a Surf DSL (that of course can be extended to V2.0....)

Developing solutions against the Surf DSL - the compiler knows all the defined commands, constructs and schemas (cause we defined them in our DSL). They're the only operands that you can use as a developer - this simplifies the picture immensely.

The beauty about M is that the DSL is simply deployed to a Repository (which at this stage is SQL Server, but could be any DB as we get access to the TSQL behind the scenes)

As I dig a little deeper I'll be illustrating with some samples going forward - hope you enjoyed this post....for now :)

Lastly - it's amazing that way back at Uni, I studied a subject called 'The Natural Language-L' (and it was one of those things where I thought - I'd never use this again....well guess what 15 years on....M is looking very close. How I even remember this is even scarier!!!) - the subject was language agnostic and dealt with what was required to create/specify a language that could be learnt, written and interpreted.


Wednesday, 12 November 2008 22:34:12 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | Oslo | PDC08
# Monday, 03 November 2008

Folks - while setting up some IIS servers for a BTS production environment I came across this handy little tool.

Basically gives you a Tree View of what things you'd like to install on your IIS Web Server from MS (mother ship).

Includes things like service packs etc. - a handy spot to grab all the new files in one place.
(as opposed to the piecemeal approach - install, oh you need the .NET 3.5 Framework, you also need SP1… with maybe a few reboots in between)

- single place for all the tools and other components that you'll need.

- great way to do them all at once.

Here's what you're after -


Monday, 03 November 2008 23:02:46 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | General | Tips
# Thursday, 11 September 2008

The system we built made it through its maiden event and was still capturing reads well into the late afternoon (until we got round to tearing it down… technically called 'Bump Out'… with all the moving bodies and parts, it's no wonder they call it Bump Out!)

I grabbed a couple of Silverlight screenshots to show what the system is capable of - in the hectic pace of last week I didn't manage to capture the system in action, so these screenshots come courtesy of Eileen Brown's blog (she is responsible for running MS events in the UK + a founder/advocate for Women in IT)

Walk-In Displays - these walk in displays were up on the big screens as delegates entered/exited their sessions. Pretty cool!!!.

These screens are delivered via a browser and are what we call the 'Walk-in' Display. Here you can see 3 people leaving the room with the graph in the background showing some delegate profiling data around attendance of previous TechEds.

Here we've got an enter and a leaving of the session. Something we didn't get time to do at this show was to play on the scope for customisations with these avatars. We had over 120 textures + bitmap type surfaces set for this, but during the show this 'feature' got bumped further down the list. (Hats, scarves, hair type, colours etc. you know the stuff)

We had fun with a couple of names though - '@Coatsy' was one, 'The Stig' was another.

The beauty about these screens was that people outside the conference got real time stats about the rooms and could see the 'Walk-in' displays in near real time. (Late night trouble shooting with my friends in MS Corp - this proved a great tool)

In testing performance of our SL Services over the internet - I had a link to the UK where we had a technician monitoring the various walk-in displays and giving feedback. All worked pretty well.
(At this point we don't have an upper limit on the number of individual 'Walk-in' display sessions that run concurrently - each open browser receiving events in near realtime is an additional WCF Service instance + a SQL connection. Not sure how much benefit SQL Connection pooling will give as these connections are active pretty much all the time)

This screen is from the 'Speaker Charts' which are designed to give the speaker various breakdowns of up to the minute information of their audiences.


Overall the Breeze Boiler room (HQ) got great attendance from the delegates wanting to know the "whats/whos/whys" on the Breeze Event Tracker System.

We're currently still analysing the results but some interesting numbers are:

(1) In a 16 hr period for one room, we got 345,000 reads… (this may be picking up the people in the back row while sessions are on - our business logic takes care of these)

(2) We experienced a very particular 'known' problem (don't you love it when you experience an issue for the first time and describe it, only to be told it's 'known' - well, telling us that ahead of time would have been great :) ). The problem arises from tags being physically close together, with two tags responding 'around' the same time. In very special circumstances this confuses the reader, and instead of getting 12-byte tag IDs we got 16, 18 or sometimes 20 byte IDs where the 2 tag IDs were 'spliced'.

It occurred in very special cases - but we got it. That particular read should be discarded as it fails the CRC check.
In peak time, out of 8000 reads we got around 2 of these cases.

Couple of phone calls to India and our Intel R1000 provider was 'patched'; as a Plan B we wrapped the current provider with another .NET class to catch that particular exception.
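The guard at the heart of that Plan B wrapper was essentially this shape – a hypothetical sketch, not the production code:

```csharp
// Discard 'spliced' reads - a valid tag ID here is exactly 12 bytes,
// so 16/18/20-byte IDs (two tags mashed together) get thrown away.
public static bool IsValidTagRead(byte[] tagId)
{
    const int ExpectedTagLength = 12;
    return tagId != null && tagId.Length == ExpectedTagLength;
}
```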

(3) SCOM2007 couldn't have worked better!!!! I dropped in the BizTalk RFID management pack and it was a breath of fresh air. All the readers, devices, processes, providers and RFID servers out on the network appeared as healthy items in lists (mostly). From the management pack I was able to see the number of tags read, settings, when the last heartbeat was heard etc. from all the devices over the conference - certainly Mission Control.

(4) We had various 'show' type issues such as power cords being unplugged, cables being cut, and cabinets that housed the equipment in each room - all in all it was filled with fun and excitement. We did have a couple of network issues: at the conference there were several networks implemented for different regions/events (e.g. public delegate WiFi; networks within each of the breakout rooms) - we were on our own VLAN, and the network layers above us proved a little troublesome from time to time.


Various Licensing arrangements of this system are available - from the software components through to the hardware. Feel free to ping me for more details.

Here's a video of a screen capture that I *did* manage to do.

Thursday, 11 September 2008 21:54:16 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | Insights | RFID | BET
# Thursday, 07 August 2008

Wow! I had to venture into the 'cave' and solved this problem - talk about a character building experience!

I'm currently building a Mobile BizTalk RFID 1.1 solution for TechEd08 that runs on a PPC with a Kenetics CFUHF Reader.

*** Early Screen Shot *** :)

So in building out this application the details always bring unforeseen challenges to light:

1) The application houses all the BizTalk RFID pieces (providers, device proxies etc) so registration, and starting/stopping providers/device discovery and applying properties to the device needs to be all taken care of.

2) I built an RFID Mobile Provider for the Kenetics device - I worked with their support engineers solidly for a week to build what I needed. I took a trip down memory lane and have had enough pinvoking to last till Christmas.

3) The app also manages several local SqlCe databases - one for my app, the others for the operation of BizTalk RFID Mobile locally on the device (mainly for its OOTB store/forward mechanism).

After weighing up several options in this solution and how to get data to/from the device reliably I decided to go with SqlCe Merge Replication as we needed to push/pull data from several tables and schema changes.

4) Which leads me onto one of the most little known items......

How do I set up SqlCe merge replication? It's a minefield - change something here and boom over there.

The picture

Phase 1:

Forget ISA for the moment. If you can, aim to get replication running in a local environment first (e.g. Local LAN on same network, through VPNs etc)

Getting the SQL bits Setup
Ok - the pieces to the initial puzzle.....

  1. Sql Server Side
    1. Sql Server and its additional Sql Mobile replication bits - download from here.
    2. IIS to expose a replication 'end point' where the remote devices will connect to and replication will take place through. IIS can be separate out onto a different machine.
    3. As in my case, somewhere that the 'snapshot' DB information will live to merge down to the devices. Mine was a UNC share - SQL created this after I completed the Publication wizard.
    4. Installation -You want the SQL Server Compact 3.5 Server Tools installed on BOTH the IIS AND SQL Machines (if these are one and the same, then you only need it once)
      The server tools has two main components - one being the bits that drive IIS and the other being a wizard that configures the exposed virtual directory and sets security onto it.
      If IIS and SQL are on separate machines, the easiest way to go is:
      get SQL to publish the snapshot to a UNC share e.g. \\sqlserver\data
      - On the IIS box, run the Configure Web and Synchronization Wizard (installed with the above server tools) and a later screen will ask you where this data is coming from - simply point to the UNC share.
  2. Mobile Device Side
    1. The equivalent SQL Mobile Replication tools need to be installed (above and beyond just normal SqlCe database components install) - SQL Server Compact 3.5 for Windows Mobile
      *** NOTE: make sure that the bits on both the Mobile + Servers all match ***
  3. Server Side Security - For this let's work backwards, from the publication through to the exposed endpoint.
    1. Publication Security - this is set through the Publication Access List within SQL Mgmnt Studio
      The group in question is the ExhibitorsGroup

      Create a publication within the SQL Management Studio

      (Publication General Properties)

      (Snapshot Properties - note the file location)

      (FTP Snapshot + Internet - I've just used Internet and no IIS server name as this is configured in the Mobile Wizard)

      (Publication access list - I've blanked out sensitive info, but you can see the BETDEV\ExhibitorGroup being manually added to the list)
      The rest of the publication settings are defaults - for me anyway.
    2. Let's go to the UNC share - C:\Public\Exhibitor.SqlCE.FileShare
      This is the UNC share that IIS repl component will connect to at the back end.
      Note: the BETDEV\ExhibitorsGroup obviously needs r/w access to this folder.
    3. Let's run the 'Configure Web and Synchronization Wizard' to configure the IIS component.
      (you'll find it off the tools menu after you've installed the Mobile Server Tools from the links above)
      Note: one of the interesting things I found here is that after running the wizard, I normally go and tweak a few things in IIS - directory browsing etc. As a rule of thumb, if you want to change something with the Virtual Directory that is created at the end of this wizard, re-run the wizard to do it!!! :)

      Press Next if prompted with the welcome screen. Note my options here - SQL Mobile - and press Next.

      Select the site and Create a Virtual Directory (I'm re-running the wizard so I'm going to select Configure Existing). Press Next.

      I created an alias of SqlCERepl directory and accepted a sub-directory under the SqlMobile dir.
      (you can change this, but looking around the forums it was a source of grief - I could do without :) )

      Here - I selected HTTP and not HTTPS access to the Virtual Directory (and SQL Service agent).
      I did this because, if you remember the diagram at the top of this post, ISA will serve as the HTTPS endpoint and will forward the request via HTTP to our IIS/SQL box.
      IF you do want to change from HTTP to HTTPS or vice versa - re-run this wizard. Saves you about 4 hrs of head banging.
      Click Next when ready.

      On this page - I selected 'Authentication required' and not anonymous. This has something to do with the data that I'm replicating as I'm using a Filter based on 'UserName'. So in my case, the username that the devices connect with will be my differentiator (I looked into using something like 'deviceID' but didn't get too far)
      Click Next.

      Select the type of authentication to be made against IIS - I selected NTLM (basic is fine also - but you need to be mindful that we're using HTTP at this point)
      Quick note on Security: So far, we've got 2 areas that need authentication.
      1) the IIS virtual directory and 2) accessing the actual SQL Publication in the UNC share and SQL Publisher Access List.

      So if the two machines are separated (IIS + SQL), NTLM will not traverse these machines (known as the 'double-hop' problem), so I'm assuming Basic or Kerberos is the safer bet.
      Click Next when ready.

      On the Directory Access screen, note the presence of the ExhibitorsGroup and also that this publication is accessing the UNC share we created earlier.
      Next to continue.

      UNC path specified - here you can see how this could point to the SQL share sitting on another machine, as in the two-machine hosted scenario.
      Click Next and Finish to see something like:

      Your virtual directory is now configured.
      To test your configuration so far go to:
      http://<server>/sqlcerepl/sqlcesa35.dll?diag - the diagnostics screen - to get something like:
      You should be prompted to log in - enter account details that have access.
      This is our fallback screen - next we will configure the ISA component and come back to this test screen to make sure.
      You're done here. :)
  4. Configure ISA Server
    ISA server will be the bridge between our public SSL access and our internal IIS/SQL Server. We would effectively like ISA to simply route the request and pass it through without too much tampering with our good packets.

    ISA Server is on IP address: IP:Y_Internal
    The Internal Server here is :
    The public Interface on the ISA Server is for our purpose known as IP:X_Public
    and its FQDN is: (in other words - this is the public DNS name that will point to the public interface of your ISA box)

    NOTE: Make sure you have your SSL cert ready - I created an inhouse cert from a standalone cert server.
    You need at least a 'Server Authentication' Certificate to apply within ISA.
    (I'll show you a little trick in the mobile app to get round the fact that the certificate is from a non-trusted Cert. Authority by default)
    The friendly name on the cert should be - '' (without the quotes)
    All this keeps SSL happy.

    1. Create a publishing rule in ISA 2006 that will effectively route all requests coming to the public interface to our internal IIS/SQL Server.
    2. Fire up the ISA MMC and create a New Web Server Publishing Rule - I've called this sample rule, "Public to Internal IIS/SQL Repl"

      Click Next when done.
    3. Rule Action - set to Allow
    4. Publishing Type=Single Web
    5. Server Connection Security - SSL. This means that SSL is going to be used over the public network.
    6. On the Internal Publishing Details - I tend to hardcode the IP address in, just to reduce any ambiguity.
      Note the IP address - internally accessible only. 10.x.x.x
    7. Further settings on the Internal Publishing Details
      NOTE: the option of forwarding the original client host headers to the internal IIS/SQL (I found a variety of 'incomplete HTTP header details' errors attempting to sync if I cleared this checkbox)

      We also can restrict the access on this rule by specifying the path of /SqlCeRepl/* (this is obviously the Virtual Directory created earlier)
    8. Fill in your public DNS name - don't worry that the wizard screen is showing and NOT
    9. Create a listener (if you need to ) as follows:
      (I've modified the screen shot slightly - from my listener)
      (Note the port: 8443 that SSL requests come in on. You can use 443 if you prefer - I had other things on that port.)
      Also - I set up NO authentication and replication works. You *could* try setting up Basic Authentication here and using Delegated Authentication (ISA server will log in to the IIS/SQL box on your behalf with the supplied credentials).

      I've also supplied the Certificate here as well (add your cert to the machine store ahead of time)

      A way to test if your auth is going to work - fire up your browser and try http://<server>/sqlcerepl/sqlcesa35.dll?diag

      You should be prompted for login details ONLY ONCE. If you need to supply them twice and then you see the diagnostic page, your mobile application replication will fail :-(. Once and once only.
    10. Authentication Delegation - we want the client to authenticate directly against the backend (routed through ISA of course :) )
    11. User Sets - because we don't have authentication here, ISA can't determine users, so All Users is our only option.
    12. What a glorious sight....almost done......
      Click Finish to complete the wizard.
    13. Right click on the rule just created and select Properties - we need to change the Link Translation to OFF
      This was the major source of my grief - I kept getting 'HTTP Headers malformed...' ERROR:28035 when trying to sync from the Device - yay!

      I was fortunate to be able to contact a friend of mine, Darren Shaffer (Mobile MVP), who explained what needed to be sent back and forth in the headers during the conversation - big thanks there Darren!
    14. You should be able to browse to https://<yourserver>/sqlcerepl/sqlcesa35.dll?diag - it should WORK :)
      If not - resolve before moving on. (you may get IE grumbling about the Certificate being invalid if it's an inhouse cert)
  5. Configure the MOBILE replication piece!!!
    1. Make sure you have installed the SQL CE 3.5 Core + Repl CABs at least.
    2. On the mobile device, I tend to have routines to Add and Remove DB Subscriptions as I found that if any publication changes on SQL Server happened - e.g. a field was modified, or a table added/removed from the Publication, then Merge Repl would fail even though it previously was working.

      It's easier to remove the subscription on the local SQL CE db and then add it again.

      Note: InternetUrl = https://<...>
      Username + pass must be a user that has access to all the bits we configured above. In my case, someone who is a member of the ExhibitorsGroup.

      The code looks like this:
    public void AddReplAndSync()
    {
        // using System.Data.SqlServerCe;
        bool bAddRepl = false;
        try
        {
            // Only add the subscription if it isn't already recorded in the local DB
            if (DoDBLookup("SELECT count(*) as cRow FROM __sysMergeSubscriptions WHERE Subscriber='ExhibitorSubscription'", "cRow") != "1")
            {
                bAddRepl = true;
            }
        }
        catch
        {
            // The system table doesn't exist yet - no subscription has ever been added
            bAddRepl = true;
        }

        SqlCeReplication repl = new SqlCeReplication();
        repl.InternetUrl = AppSettings.Settings.ReplServer + "sqlcesa35.dll";
        repl.InternetLogin = AppSettings.Settings.ReplUser;
        repl.InternetPassword = "XXXXXX";

        repl.Publisher = AppSettings.Settings.ReplPublisher;
        repl.PublisherDatabase = AppSettings.Settings.ReplPubDB;
        repl.PublisherSecurityMode = SecurityType.NTAuthentication;
        repl.Publication = AppSettings.Settings.ReplPubName;
        repl.Subscriber = AppSettings.Settings.ReplSubName;
        repl.SubscriberConnectionString = string.Format("DATA SOURCE='{0}'", ESDAL.GetDBPath());

        try
        {
            if (bAddRepl)
                repl.AddSubscription(AddOption.ExistingDatabase);
            CloseAllDBConnections();
            repl.Synchronize();
        }
        catch (SqlCeException e)
        {
            MessageBox.Show(e.ToString() + e.NativeError.ToString());
        }
    }

    public void ReplRemove()
    {
        CloseAllDBConnections();
        SqlCeReplication repl = new SqlCeReplication();
        repl.SubscriberConnectionString = string.Format("DATA SOURCE='{0}'", ESDAL.GetDBPath());
        repl.InternetUrl = AppSettings.Settings.ReplServer + "sqlcesa35.dll";
        repl.InternetLogin = AppSettings.Settings.ReplUser;
        repl.InternetPassword = "XXXXXX";
        repl.Publisher = AppSettings.Settings.ReplPublisher;
        repl.PublisherDatabase = AppSettings.Settings.ReplPubDB;
        repl.PublisherSecurityMode = SecurityType.NTAuthentication;
        repl.Publication = AppSettings.Settings.ReplPubName;
        repl.Subscriber = AppSettings.Settings.ReplSubName;
        try
        {
            CloseAllDBConnections();
            repl.DropSubscription(DropOption.LeaveDatabase);
        }
        catch (SqlCeException e)
        {
            MessageBox.Show(e.ToString() + e.NativeError.ToString());
        }
    }

    private void CloseAllDBConnections()
    {
        // Merge replication needs exclusive access to the local .sdf file
        if ((_dbCon != null) && (_dbCon.State != ConnectionState.Closed))
        {
            _dbCon.Dispose();
            _dbCon = null;
            GC.Collect();
        }
    }

Trick to deal with Inhouse generated certificates -
Within your mobile app we create a class that essentially returns True when asked 'Is this Cert. valid?'

Somewhere upon starting up your app - e.g. Form_Load - insert the first line below.

The rest (from 'using System;' onwards) describes the class 'MyCustomSSLPolicy':

   System.Net.ServicePointManager.CertificatePolicy = new MyCustomSSLPolicy();
   ......
   using System;
   using System.Collections.Generic;
   using System.Text;
   using System.Net;
   using System.Security.Cryptography.X509Certificates;

   namespace MicksDemos.Utilities
   {
       // Dev/test convenience only - this accepts EVERY certificate presented
       public class MyCustomSSLPolicy : ICertificatePolicy
       {
           public bool CheckValidationResult(ServicePoint srvPoint,
               X509Certificate certificate, WebRequest request, int certificateProblem)
           {
               return true;
           }
       }
   }

Closing note:

Hope you find this useful - I've done this a few times now and am amazed at the lack of info around this, especially through ISA.

If you get any errors around "Can't contact SQL Reconciler..." etc. - GENERALLY try rebuilding the snapshot server side, then try syncing again.
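If you're wondering how to force that rebuild - you can start the publication's Snapshot Agent from Management Studio, or via T-SQL on the publisher. A hedged sketch (the publication name below is a placeholder, not necessarily the one from above):

```sql
-- Run on the publisher database: kicks off the Snapshot Agent job
-- so a fresh snapshot is generated for the merge publication.
EXEC sp_startpublication_snapshot @publication = N'ExhibitorPublication';
```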

Nighty night!

Thursday, 07 August 2008 00:37:05 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | RFID | Tips
# Wednesday, 06 August 2008

Folks - fellow MVP Richard Seroter has written a VERY comprehensive series around this very topic including the new BizTalk Adapter Pack V1.0 (V2.0 is in Beta at the moment).

Over 20+ thousand words + 178 screen shots - all for the love of BizTalk/WCF.

Complete with Source Code!!!

What a champion series - I'm looking forward to tucking into some of his great material!

The BizTalk community is in debt to you Richard - well done!!!


Wednesday, 06 August 2008 11:06:02 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [2] -
.NET Developer | BizTalk | Insights | Training
# Monday, 21 July 2008

This one came from Paolo Salvatori (a senior PM within the MS Connected Systems Division team... I know, a bit of a mouthful) who has gotten in touch with his creative side and drawn a picture for all us common folk :) - well done Paolo.

The scenario is - a Request/Response Port is published at the 'front end', goes through BizTalk and the work is done by a backend system that operates via a One-Way Send and BTS gets the response via another One-Way Receive.

The thing I like about Paolo's piece of work is that he shows all the Message Context Properties required to be set by BizTalk for message correlation.
Which makes this a Messaging Only Solution and NO Orchestrations required!!!! (how cool)


BizTalk Request Response Port


Click on the image - one day I'll get the Silverlight Zoom Composer control running for these.... :)

Monday, 21 July 2008 19:04:00 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [2] -
.NET Developer | BizTalk | Insights
# Sunday, 13 July 2008

After the more than normal pain in getting this done for my previous post, I decided to post the fruits of my labour (not labor - my wife tells me about watermelons and that men wouldn't know the first thing about birth.... I'm not about to do the Pepsi challenge on that :))

- this is a stock standard Web.Config for a MOSS install, NOT a plain WSS install (there are about 3 lines of difference between a plain WSS install and a MOSS install - mainly anything that references SharePoint.Publishing....)

Grab this and these are the changes that WORK!




Sunday, 13 July 2008 22:38:41 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | MOSS | Silverlight

"Could not load file or assembly 'System.Web.Silverlight, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies."

You're starting to roll your sleeves up and get dirty with Silverlight 2 Beta 2, load up some of the Silverlight Blueprint for SharePoint samples, run the installer (it's great that most of these examples have an installer) and Boom! you get the above error!!!

Here's a list I've compiled to get Silverlight working - I'm currently running this on Win2008 IIS7

(1) Install WSS SP1/MOSS SP1 on your SharePoint box if you haven't already. You need the SP1 to support .NET 3.5 calls through SharePoint - my guess is that these tell SharePoint not to intercept the calls and let them go to their rightful owners.

(2) Create a 'dummy' site collection on a test Web Application - e.g. http://localhost:81 - This is so you can see all the changes to the web.config that are made through the installation process, in isolation. By keeping this separate to your usual web.config, you'll be able to merge changes at a later date.

(3) Install the Silverlight 2 Beta 2 runtime and other developer bits - From - VS2008 Developer Bits and just the runtime if you want from here

(4) Do one installation of a Silverlight Blueprint for SharePoint sample - the installer creates a 'virtual directory' under your Web Site called ClientBin where the various Silverlight 2 files go (*.js, *.XAP). This is a handy install so you can see what directory execution settings are required to make this work through SharePoint, i.e. Execute permissions only. Take note of this directory.

(5) Add an IIS MIME type - with Silverlight 2 Beta 2 there is a new file type, the *.XAP file. IIS by default doesn't know how to encode/translate or send these files down over the wire.
Add a MIME type of: Extension: .xap, MIME Type: application/x-silverlight-app to your IIS test Web Site
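On IIS7 (I'm on Win2008) the same mapping can alternatively be added declaratively in the site's web.config rather than through the IIS Manager UI; a minimal sketch (this sits inside the site's existing <configuration> element):

```xml
<system.webServer>
  <staticContent>
    <!-- Let IIS7 serve Silverlight 2 application packages -->
    <mimeMap fileExtension=".xap" mimeType="application/x-silverlight-app" />
  </staticContent>
</system.webServer>
```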

(6) Make Web.Config changes - there's a whole series of Web.Config changes to be made to your SharePoint Web Application to support AJAX/.NET 3.5 and now Silverlight.... fortunately other hard working folks have done this for you!!!! :) Bless their cotton socks! - grab the Feature that makes the modifications from here (** NB: you want the 3.5 config feature)

You're almost done........ :)

(7) EXCEPT for the error above!!! After much inspection of your system, you'll realise that you *don't* have that DLL (on a clean install). The Silverlight Ninja will know that this is from Silverlight 2 Beta 1 and not found in the Beta 2 kits!! Yay team!

The System.Web.Silverlight.dll is found in the Silverlight 2 Beta 1 SDK - so download that puppy, extract out the DLL and either GAC it, or add it to your BIN directory on your SharePoint site. (I added it to my BIN directory - as I reckon when SL2 is released, this problem will have been resolved)
(**UPDATED: Due to how painful that was, I decided to package up the DLL for you - HERE**)

Here are the Compiled Files - FOR SL2 BETA 2 - they WORK!! :-)

(I grabbed the Blueprint Hello World Web Part and updated to work)

1. Silverlight Web Part DLL

2. Silverlight *.XAP updated for Beta 2 - copy straight to the *sub-directory* under your ClientBin

3. Sample SharePoint Web.Config with all the changes! :-)

Sunday, 13 July 2008 21:08:09 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [4] -
.NET Developer | MOSS | Silverlight
# Thursday, 10 July 2008

One of the handiest tools I've used in the last year -

If you're presenting, or even just showing your code/screen to colleagues, then this is superb

Thursday, 10 July 2008 20:50:48 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Other | Tips
# Thursday, 03 July 2008

You'll get this error when using WCF/IIS and host headers.....fortunately a fellow colleague Paul Glavich figured it out!!! Well done Paul! (It involves an IIS reshuffle, you may be able to do something within a custom WCF Binding.)

Remember: There is a limit on the number of IIS Websites you can have on a single machine.

Thursday, 03 July 2008 07:51:38 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk
# Tuesday, 01 July 2008

You might be wondering what all these guys have in common....good question.... :-)

We're currently building an RFID enabled System where complex processes are handled by BizTalk Server, and data being pushed down to Silverlight V2.0 clients via a WCF Silverlight 'Eventing System' (which really is polling under the hood, but to us in developer land - it's cool and it's Events)

Scotty has the full write up of some of his learning experiences through this - well done Scotty, he's been in that place where there are no manuals, no documentation, no previous code, just a gut feel and a compass to sail the seas.

We demoed the system at our last user group (or moreover used them as guinea pigs :)

Token Screen shot: (we've associated tags with people information and this is what is displayed when TagReadEvents are captured. We need a little work to avoid being underneath or on top of a previous animation)


Artists impression!
Tuesday, 01 July 2008 09:55:17 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | Insights | RFID | Silverlight
# Saturday, 07 June 2008

Some pretty cool features as I've previously posted

From my perspective I'm particularly interested in the supported 'WCF Dual HTTP Binding'.......more on that later :)


SILVERLIGHT 2.0 BETA 2 SDK is now available!!!!

Grab it here from the Getting Started section

Also get the videos, hands on labs, training material from HERE

Saturday, 07 June 2008 10:50:21 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | General | Silverlight

Jesse has written a great little article on creating a popup control in Silverlight 2.0.

Nice way of doing it - I was also thinking that, quite simply, you could set the ZIndex of the element to a positive value.
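For the ZIndex route, a minimal sketch - the element name here is a placeholder of mine; Canvas.ZIndex is the Silverlight attached property and Canvas.SetZIndex its setter:

```csharp
// Bring the hypothetical 'popupPanel' element above its siblings;
// within the same parent, higher ZIndex values render on top.
Canvas.SetZIndex(popupPanel, 10);
```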


Saturday, 07 June 2008 10:40:30 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Silverlight
# Friday, 06 June 2008

I've started developing a RFID Mobile application on a Windows Mobile device - which is pretty cool.

I decided to use VS2008 and all worked well until deployment.
"Unable to load System.Data.SqlServerCe.dll ....Version=3.0.xxxxx " as the version OOTB with VS2008 is v3.5

So I grabbed SQL CE Mobile 3.0 and copied over and installed the sqlce30.wce5.armv4i.CAB and all is good!

Friday, 06 June 2008 13:31:18 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | RFID
# Sunday, 01 June 2008

I was cracking into getting my machine setup for a Silverlight project that I'm working on and came up with the above error.

Now....I admit....running x64 Windows 2008 on my Fujitsu laptop mightn't be the best combination, given the 'huge' support that I have for my laptop drivers.

I installed all the new(er) Silverlight 2.0 Beta bits from (VS2008 Silverlight 2.0 Beta 1 Bits) and opened up my VS2008 seeing all the new Silverlight project types - cool! (I thought)

Each time I either created or opened an existing project - boom! up came the error.

So I figured the installation didn't complete properly.........after running/re-running/uninstalling/installing countless times the error was still there!!!!

My one solace and saving grace was running the following command line:

devenv /setup

......"I'm on my way, on my way to happiness today.....ah huh ah huh ah huh"........


Sunday, 01 June 2008 21:25:53 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [2] -
.NET Developer | Silverlight | Tips
# Tuesday, 06 May 2008

Hi folks,

While freezing in NZ (this week) I came across this great MSDN article discussing some of the lower level implementation details around the .NET 3.5 Framework.

The part that interests me is the Presence information (right at the end of the article) where once a connection is setup, you can get presence information about the other party - right from the .NET 3.5 framework.

If you've ever had to try and develop for that in other ways - i.e. by talking straight to Communicator, or Messenger, etc.

You'll realise that they each have a slightly different API set (some accept SIP, some don't; some require it, some don't...) and it's opening up trouble, because on the target deployment machine...can you imagine the production guys when you say "hang on, I've just got to go and download Messenger (from Live)...."

Anyway - here's the article.

Enjoy -

Tuesday, 06 May 2008 17:05:16 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | General | Tips
# Monday, 25 February 2008
Fellow MVP Thomas has provided a sample that delves into the dark depths of BizTalk Server and its Adapter configurations.

Check out his Custom Prop Page Sample

Thanks Thomas!

Monday, 25 February 2008 16:40:29 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | BizTalk | Insights | Tips
The new 'Adapter Framework' (in BizTalk speak - this would be the Adapter V2.0 Framework) is now available.

This new WCF based Adapter Framework allows developers to build, deploy and execute standalone Adapters - whether BizTalk Server is present or not.

The framework is designed very much for standalone applications (& can be 'plugged' into BizTalk R2 if desired), and as a .NET developer you can consider it an additional .NET library that gives you the ability to build standalone adapters for your .NET applications (e.g. a console app, or Word)

p.s. The BizTalk Adapter Pack is built on top of this framework. :-)

Grab SP1 below:

Monday, 25 February 2008 15:01:22 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | BizTalk | Insights
Kirk Allen Evans' recent blog post caught my 'Silverlight eye'.


Shows some interesting effects that can be done with Silverlight and importantly has the src code there for you to learn from.

Well done Kirk!

Monday, 25 February 2008 14:35:18 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | General | Other | Tips
# Wednesday, 13 February 2008

Went and caught a session by Chris Johnson around the Visual Studio extensions for SharePoint, an area that needs to catch up with the pace of adoption of the product :-)

This Visual Studio Integrated toolset goes a long long way in bridging the gap and streamlining development in Sharepoint.
Features, manifest explorer etc etc.

One of the biggest features in this release is the ability to control the SharePoint Solution deployment process - e.g. Chris originally created a solution with 3 Features in it, and then dragged and dropped the 3 Feature elements into 1 Feature.


Wednesday, 13 February 2008 12:51:50 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | MOSS
# Friday, 01 February 2008
I've been meaning to post this stuff for a while and have now finally gotten round to it :)

Venkatesh (RFID lead architect) ran through a first teach of our RFID Course in Singapore + China before Christmas.

One piece of feedback that came out of the course was that the existing MS RFID DLP Provider had a few problems with quick consecutive tag reads and would cause the Provider to stop (the other important thing being that the DLP Reader didn't beep when reading a tag).

So whether Venkatesh could sleep one night or not, he's come up with a new and improved DLP Provider that relies on a COM interface.

Grab the updated files here.... (59.79 KB) (69.19 KB) (13.81 KB) - BizTalk RFID LogFile Viewer - can't leave home without this one :-)


Friday, 01 February 2008 16:24:41 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [1] -
.NET Developer | BizTalk | RFID
# Friday, 07 December 2007

Rahul (MS BTS TS in our parts - great guy, hugely knowledgeable in both Java integration technologies and MS) has been hard at work again weaving his magic... I just couldn't go past his post without sharing it with you guys.

Great step by step instructions Rahul!

Well done.

Friday, 07 December 2007 21:14:28 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | BizTalk | Insights | Tips

The word on the street is finally out, with Volta being announced (cool name).

What is it? What can it do for me? (lately :-)

Here's an example scenario:
- you write a classic .NET Winform/Client App.
- put your 'Volta' hat on and nominate sections, routines etc. of your app and which tier/layer you would like the components/classes/sections to run on.
- You then nominate Web Layers or classic CLR client layers etc.
- Volta crunches your design and boom!!! You've got your SENSATIONAL multitier app from your original single whole app.

In fact - check out this great Walkthrough for the 'Hello World app'

You don't need to worry about app splitting yourself - the Volta 'directives' do the work.

When I was at Uni this sort of thing was in an area of my studies (simplified and more specific though - nominating code sections to run concurrently across many distributed CPUs....yeah I know - I'll get back to some English).

Where is this going?
Did I tell you about the next version of BizTalk codenamed 'Oslo'....

My take (and this is purely just me kicking some tyres with you guys) is that BizTalk vNext is all about Modelling - having a central repository that holds all forms of 'models' describing not only the process, design, test....and Volta is a preview of the 'deploy' aspect of these Models.

The important point in BTS vNext is that *it is the Model that is executed* not some result of a process that you've run a week ago on that model, otherwise these models get out of date quite quickly.

Here's the 'official Volta blurb'-

On Wednesday, December 5th, Live Labs will announce Volta, an experimental developer toolset that enables developers to build multi-tier web applications by applying the familiar techniques and patterns of developing .NET applications.  In effect, Volta extends the .NET platform to further enable the development of software+services applications, using existing and familiar tools and techniques.  Similar to other technology previews from Live Labs, the purpose of releasing Volta as an experiment, allows for testing of the model with customers and partners in order to gather early feedback and continually influence the direction of Live Labs technologies and concepts.  In addition, where and how Volta will fit into a product roadmap is not the end goal, but rather to experiment with new alternative models to enable Microsoft to continue to be innovative in this new generation of software+services.

Volta Key Messages:

  • Volta is an experimental developer toolset that enables developers to build multi-tier web applications by applying the familiar techniques and patterns from the development of .NET applications.
  • Developers can use C#, VB, or other .NET languages utilizing the familiar .NET libraries and tools.
  • Volta offers a best effort experience in multiple environments without requiring tailoring of the application.
  • Volta furthers Microsoft's software+services efforts by making it easier to write and build multi-tier applications.
  • Volta automates certain low-level aspects of distributing applications across multiple tiers, allowing programmers to devote their creative energy to the distinguishing features of their applications.
  • Via declarative tier splitting, Volta lets developers postpone irreversible design decisions until the last responsible moment, making it faster and cheaper to change the architecture to accommodate evolving needs.
  • Through MSIL rewriting, Volta follows developer's declarations to turn a single-tiered application into a multi-tiered application, generating boilerplate code for communication and serialization.
  • Volta, like other technology previews from Microsoft Live Labs, is an example of the rapid innovation of web-centric technologies happening at Microsoft.
  • The purpose of the technology previews, such as Volta, is to test new technologies and product concepts with customers and partners and to gather early feedback to influence the direction of Live Labs projects.
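
Just to make the 'declarative tier splitting' idea concrete, here's a rough sketch of the style of thing Volta enables. The attribute name and namespace are my own illustration of the concept, not necessarily the actual Volta API - see the walkthrough above for the real thing:

```csharp
// You write one single-tier app; an attribute declares which tier a class
// belongs to, and Volta's MSIL rewriter splits the app for you.
public class Calculator   // runs on the client tier by default
{
    private readonly PrimeService _primes = new PrimeService();

    public bool Check(int n)
    {
        // After tier splitting, this local call becomes a cross-tier call -
        // the communication and serialization boilerplate is generated.
        return _primes.IsPrime(n);
    }
}

[RunAtOrigin]   // hypothetical directive: pin this class to the server tier
public class PrimeService
{
    public bool IsPrime(int n)
    {
        if (n < 2) return false;
        for (int i = 2; i * i <= n; i++)
            if (n % i == 0) return false;
        return true;
    }
}
```

The point being: moving `PrimeService` between tiers is a one-attribute change, not a rewrite - that's the 'last responsible moment' bullet above.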
Friday, 07 December 2007 11:48:51 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | BizTalk | General | Tips
# Thursday, 22 November 2007

Live and very available.

One of the biggest benefits I see is the WorkflowServiceHost class - a class that provides the glue between the WF world and the WCF world....very nice!


Thursday, 22 November 2007 16:40:50 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | BizTalk | General
# Wednesday, 14 November 2007

Move over Thierry Henry(shame he's gone to Barca :), Kylie and U2..... make room for .NET 3.5 up on your wall.

The folks at MS have been super busy - while talking about what will be in .NET 4+, they've released the posters.

Stay tuned for more!

Grab the .NET 3.5 Common Types and Classes


Wednesday, 14 November 2007 15:03:55 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | BizTalk | General | MOSS | TechTalk | Tips
# Thursday, 01 November 2007

A friend of mine Sezai spent a great deal of time putting together a great list of WSS/MOSS articles. Thought I'd share them with you - thanks Sezai!!!

SharePoint Products and Technologies 2007 Customization Policy

Deploying and Supporting Enterprise Search with MOSS 2007 at Microsoft - White Paper

2007 Office System Document: Bringing Web 2.0 to the Enterprise with the 2007 Office System

Microsoft SharePoint Products and Technologies Document: Using Blogs and Wikis in Business

Microsoft SharePoint Products and Technologies Document: Microsoft Office Programs and SharePoint Products and Technologies Integration – Fair, Good, Better, Best

2007 Office System Document: Understanding Workflow in Microsoft Windows SharePoint Services and the 2007 Microsoft Office System


Microsoft SharePoint Products and Technologies Document: Transform Your Business With SharePoint Products and Technologies


Microsoft Best Practices Analyzer for Windows SharePoint Services 3.0 and the 2007 Microsoft Office System


Information architecture in Office SharePoint Server (case study for Fabrikam Industries)


Windows SharePoint Services Document: Application Templates for Windows SharePoint Services 3.0 – Under the Hood


SharePoint Server 2007 Document: Role-Based Templates for SharePoint My Sites — Under the Hood


Plan for building multilingual solutions by using SharePoint Products and Technologies


Planning and architecture for Office SharePoint Server 2007, part 1


Planning and architecture for Office SharePoint Server 2007, part 2


SharePoint Server 2007 Sample: Creating a Custom User Site Provisioning Solution with Office SharePoint Server 2007


SharePoint Products and Technologies Templates: Web Part Templates for Visual Studio .NET


Best Practices: Writing SQL Syntax Queries for Relevant Results in Enterprise Search


Managing enterprise metadata with content types


Evaluating and Customizing Search Relevance in SharePoint Server 2007


Chapter 3: Customizing and Extending the Microsoft Office SharePoint 2007 Search (Part 1 of 2)


Chapter 3: Customizing and Extending the Microsoft Office SharePoint 2007 Search (Part 2 of 2)


Best Practices: Common Coding Issues When Using the SharePoint Object Model


Upgrading an MCMS 2002 Application to SharePoint Server 2007 (Part 1 of 2)


Upgrading an MCMS 2002 Application to SharePoint Server 2007 (Part 2 of 2)


Development Tools and Techniques for Working with Code in Windows SharePoint Services 3.0 (Part 1 of 2)


Development Tools and Techniques for Working with Code in Windows SharePoint Services 3.0 (Part 2 of 2)


Using the Business Data Catalog and Smart Tags with the 2007 Microsoft Office System


Sample governance plan


“Creating Shared Hosting Solutions on Windows SharePoint Services 3.0” whitepaper


Windows SharePoint Services 3.0 Document: Microsoft Windows SharePoint Services Quick Reference Guide


Windows SharePoint Services 3.0 Sample: Example Master Pages


Windows SharePoint Services 3.0 Document: Tips and Tricks - Using Wikis in Windows SharePoint Services 3.0


Windows SharePoint Services 3.0


Windows SharePoint Services 3.0 Application Template: Application Template Core


Windows SharePoint Services 3.0 Application Templates: All Server Admin Templates


Enterprise Content Management From Microsoft  docx  doc


Microsoft Office SharePoint Server 2007 on HP ProLiant Servers - Performance Summary


Working with Large Lists in Office SharePoint Server 2007


SharePoint Server 2007 SDK: Software Development Kit


White paper: Using database mirroring with Office SharePoint Server


White paper: Chaos no more: Steps for building governance into Microsoft Office SharePoint Server 2007


White paper: Integrating Office SharePoint Server 2007 and SAP


White paper: Configure Office SharePoint Server for SAP iView Web Parts


White paper: Data protection and recovery for Office SharePoint Server in small to medium deployments


White paper: Managing social networking with Microsoft Office SharePoint Server 2007


White paper: Upgrading Large Microsoft Office SharePoint Portal Server 2003 Intranet Portals to Microsoft Office SharePoint Server 2007


White paper: SharePoint Products and Technologies governance checklist guide


2007 Microsoft Office System Business Intelligence Integration


Microsoft SQL Server Reporting Services (SSRS) Installation/Configuration Guide for SharePoint Integration Mode


White paper: Guide for Office SharePoint Server Management Pack


White papers: Role-based My Site template setup guides


White papers: Excel Services step-by-step guides


Sample project plan: Office SharePoint Server 2007 deployment (Office Project 2007 format)

Sample project plan: Office SharePoint Server 2007 deployment (Office Project 2003 format)


Downloadable book: Web Publishing and Planning Guide with Microsoft Office SharePoint Server

Thursday, 01 November 2007 17:30:57 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | General | MOSS | Tips | Training
# Tuesday, 25 September 2007

The good thing is - this should take you days and not weeks or months! Brilliant.....absolutely brilliant.

It's not all about reading and writing tags and watching the tracking - to me, RFID Services is all about what you do with it next.

From a BizTalk perspective, RFID services is another msmq/wcf endpoint that provides rich tag data.

From here you can then process the tag read through BizTalk - and, as was the case in my demo, send it out to SharePoint to be viewed via InfoPath.

One of the most exciting things around this is that we can get BAM involved to see how we're tracking - tag fulfillment, reading, processing - from when orders arrive to when they leave the warehouse floor.

I'll be posting the demo bits that my colleague Scott Scovell & I stayed up till 2am building on 'Demo Day' (hey - it wouldn't be a demo without those nights/days :) - soon.

To get started you really want a physical reader to get cracking with - DLP make a good RFID reader for developers, and one of the folks at MS has written a 'provider' (this is the key with RFID Services) to use it within RFID Services.

Grab them both from here -


Tuesday, 25 September 2007 18:51:42 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | RFID
# Saturday, 25 August 2007

The other day I came across a great MSDN article that explains how to do this across all the BTS Services from SSO through to the rules engine.

Saturday, 25 August 2007 19:37:48 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | General | Tips
# Thursday, 23 August 2007
What a great release of a new WSS/MOSS SDK (couldn't have come at a better time for me :-).

In this latest release we have some goodies packed into the SDK - the BDC Tool, which allows you to visually create a BDC Application Definition file (.xml) based on both OLEDB/ODBC and Web Service data sources.

This was one of the major complaints from businesses when looking into the BDC (BDCMan did do a great job)

Sharepoint SDK Refresh (175MB!!)


Brief Summary of What's included (taken from the Sharepoint page above):
-ECM Start kit also included
-updated workflow samples
-SSO provider samples (to use SSO to access your own people stores)
-BDC Tool + further BDC samples (SAP, custom etc)
-Document Inspector custom sample
-Document Converter custom sample
...and the list goes on.....

either which way - this release is BIG!


Thursday, 23 August 2007 13:57:59 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | General | MOSS | Office
# Monday, 20 August 2007

In the previous beta this tool supported a huge range of different blogs and their respective APIs....except Sharepoint.
(There was a tweak we could do, but essentially you had to turn off NTLM authentication and go with Forms.....generally this wasn't going to happen anytime soon)

Enter - LW Beta 2 - pick up your copy HERE


Monday, 20 August 2007 12:32:54 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | General | Other | Tips
# Tuesday, 14 August 2007


Wow, what a session!! Being a Level 400 session, the expectation (from those who run TechEd) was to go reasonably deep. I had a fantastic crowd with standing room only in the theatre - it was my last session of the day (my 3rd) and I was knackered, ready to go out with a bang.

So I decided to jump into Workflow Foundation and discuss *what is actually done behind the scenes* with SharePoint's WF management. This was well received (though I'm sure a few people in the audience were thinking 'I just want to know how to approve something'....we got onto that later) and opened up a few concepts explaining why we do the things we do within our SharePoint Workflows - e.g. Task Correlation Tokens and new Task IDs, and why we need to generate new ones if we handle a task-changed event.
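
To make the Task ID point concrete, here's a minimal sketch against the WSS 3.0 workflow object model - 'createTask' is a CreateTask activity dropped on the design surface, and the handler and account names are illustrative:

```csharp
// Sketch: wiring up a CreateTask activity in a WSS 3.0 sequential workflow.
// Key point: every task needs its own TaskId (and its own CorrelationToken,
// distinct from the workflow's token) or task events won't route correctly.
private void createTask_MethodInvoking(object sender, EventArgs e)
{
    // Generate a fresh id for this task - never reuse one
    createTask.TaskId = Guid.NewGuid();

    SPWorkflowTaskProperties props = new SPWorkflowTaskProperties();
    props.Title = "Please approve this document";
    props.AssignedTo = @"DOMAIN\approver";   // hypothetical account
    createTask.TaskProperties = props;
}
```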

I then got onto some of the Sharepoint Workflow Implementations and wanted to highlight the use of a State based workflow as opposed to the usual SequentialWorkflow.

*** DEMO CODE WILL BE POSTED SOON FOLKS *** (don't have my vpc with me to extract out my projects for you right now)
Slide Deck:OFC409_Mick_Badran_Workflow_Deep_Dive.v1.2.pdf (977.98 KB)

Tuesday, 14 August 2007 23:01:42 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | Events | TechEd | MOSS
# Monday, 13 August 2007

Having 3 sessions in 1 day at the conference, this was session number 2.
CON309 - Mick_Badran_Advanced BizTalk R2 Concepts 

We had a great session here and all my demos came off again!!! Except for the screen size and the projector this particular 'room' used.
I was presenting at 800x600 - talk about feeling technologically claustrophobic. I felt like I was in quicksand, gasping for air...but you use what you have.

I was hoping to do an RFID demo but 'last minute technical difficulties' forced that one on the back burner - I had more than enough demos for this session.

Thanks to all the folks that attended this - I had fun as I hope you did. This session made the top ten sessions at TechEd! Whooo hooo

The demos went something like:

  1. Publishing and Consuming WCF Services from R2 - published a couple of Orchestrations and consumed the published WCF WS Service from a basic client app.
    I then moved the published IIS WCF WS Service into the BTS Instance host by using a custom WCF Adapter and configuring it accordingly.
    Next I exposed the same service as a socket address - all called from the same client with no code recompile, which is what we want to highlight when using WCF Services.

    I then fired up a WCF WF Webservice and consumed it from BizTalk - all pretty simple, but good to highlight.
  2. For the second major demo I created a WF workflow and using the BizTalk Extensions for Workflow, hosted this within BizTalk.

Slide Deck: CON309 - Mick_Badran_Advanced BizTalk R2 Concepts.pdf (595.48 KB)
Demos: BizTalk TechEd2007

Monday, 13 August 2007 02:26:34 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | RFID | Events | TechEd
# Wednesday, 01 August 2007

Hey folks, I've been flat chat lately burning the candle at both ends getting ready for my sessions at TechEd/Deep Dives.

In a *spare* moment - I was asked to do a Channel9 GeekSpeak session, talking about Workflow and BizTalk integration.
(I've had my xbox360 confiscated for 1 month to help me get through this month - that's a story for another time :)

This session is on tomorrow morning - so for those of you feeding babies, can't sleep etc. I'd love to have you along for moral support as this will be my very first session in Geek Speak.

So if you're up for a 5am - 6am start Thursday morning I'll see you there. Put something Aussie in your nick name!

Update: The Results have come in.......well this was done at 4am my time in the morning and all I can say is thanks to the folks that attended for being understanding :)

Customer satisfaction scores are based on a scale from 0 to 9 points.  
The average score for a webcast is 7.8.

Usefulness of Information:
Speaker Presentation Skills:
Effectiveness of Demonstration:
Overall Presentation Rating: 7.8

Link to see the webcast over and over again
Looks like I got around the average at 4am.....I'm amazed I even made sense!!! :)

Wednesday, 01 August 2007 21:19:50 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | Insights | Events | WinWF
# Friday, 29 June 2007

What about the pinnacle of TLAs at the height of a great technology field.
Can you imagine being at work/meetings etc.
and say "Hold on, I've got to grab the WCF LOB Adapter SDK for my BTS Messaging Hub"
(at this point I'm sure it would clear the floor if you were at a party, with people looking at each other thinking someone hasn't taken their vitamin B12 this morning)

So we really do need to come up with a sexier name than this (when I was 4 my parents read me a great book about a kid called "Tikki-tikki-tembo-no-sarembo" and he fell into the well - you could say I was scared off long names as a kid)

What does this thing do for you? It will change the way you develop adapters for use with/without BTS. Sensational!!!



p.s. you don't necessarily need BizTalk to build adapters with this framework. There are BTS06 R2 'extensions' to this framework - the BTS 'strand' of this SDK is currently called the BizTalk .NET Adapter SDK

There's some very cool things ahead.....stay tuned......

Friday, 29 June 2007 22:16:29 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | Insights | Tips
# Wednesday, 23 May 2007

As you know - MOSS is extensible in pretty much all directions. The product team has done a great job on this latest version!!!
There are two elements you need to create to get this to work:

  1. an XML file - tells SharePoint about the characteristics/metadata of your new field.
  2. a .NET class (or classes) to control the field's behaviour during rendering for view, edit and new modes (so you can precisely control how the control appears).
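
For (1), the XML file is a 'fldtypes' definition deployed into the 12 hive. A minimal sketch (type names, assembly and token are all placeholders - adjust to your own field):

```xml
<?xml version="1.0" encoding="utf-8" ?>
<!-- e.g. 12\TEMPLATE\XML\fldtypes_MyCustomField.xml -->
<FieldTypes>
  <FieldType>
    <Field Name="TypeName">MyCustomField</Field>
    <Field Name="ParentType">Text</Field>
    <Field Name="TypeDisplayName">My Custom Field</Field>
    <Field Name="UserCreatable">TRUE</Field>
    <!-- (2) points at your .NET class controlling render behaviour -->
    <Field Name="FieldTypeClass">MyNamespace.MyCustomField, MyAssembly, Version=1.0.0.0, Culture=neutral, PublicKeyToken=abc123abc123abc1</Field>
  </FieldType>
</FieldTypes>
```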

My buddy Clayton is up all hours and has nutted out a simple example - check it out here

Wednesday, 23 May 2007 09:35:55 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | MOSS
# Tuesday, 08 May 2007

There I was preparing some demos for this upcoming conference and BAM, the old API quirks are lurking.

Basically I wanted to simply set the Title or Name fields on a File in a Document Library. All sounds simple enough.

Here's the code:

It's a pretty simple sample: grab the file you want - in my case I'm just picking the first one.
Lines 48/49 compile and run, but do not set the Title of the file element.
*** Lines 50-52 are the winners.

So even though in lines 48/49 there is actually a 'Title' property and it's settable, setting it falls on deaf ears as far as WSS is concerned.

I was hoping the File and its corresponding list item were kept in sync......never assume.
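
For the record, a minimal sketch of the working pattern (server URL, library name and title are placeholders) - go via the file's backing list item, not the SPFile itself:

```csharp
// WSS 3.0 object model sketch: set the Title of a file in a document library.
using (SPSite site = new SPSite("http://server"))
using (SPWeb web = site.OpenWeb())
{
    SPList library = web.Lists["Shared Documents"];
    SPFile file = library.RootFolder.Files[0];   // just pick the first file

    // Setting a Title on the file directly falls on deaf ears...
    // ...the winner is the file's corresponding list item:
    SPListItem item = file.Item;
    item["Title"] = "My new title";
    item.Update();                               // persists the change
}
```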


Tuesday, 08 May 2007 15:08:39 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | MOSS | Tips
# Tuesday, 24 April 2007

With BizTalk 2006 R2 rounding the corner into the home leg, the folks in the CSD (Connected Systems Division - Biztalk, .NET, WCF, WF) have been very busy!! :)

We're looking down the telescope at BizTalk 'vNext'.

Its current name is 'BizTalk Internet Services' which offers a bunch of services for the 'cloud' (aka internet-based WCF services). My personal favourite is the relay service.

A lot of functionality that previously existed in BizTalk has been pushed down into the .NET framework 3.x/4.x (WCF Services etc.)....allowing the BizTalk team to focus on some really cool new BizTalk features.

Here's an email I got earlier from Marjan from MS.


As you may have heard, for the past year the Connected Systems Division has been incubating a set of building-block services that will shape the next generation of application development.  On Tuesday the 24th, we will open the incubation of BizTalk Services to the public by inviting developers from all over the world to use the services and provide feedback. 

BizTalk Services will be available via  (UPDATE: OPENS 24th Tues - US TIME)

What are BizTalk Services?

These services, which have been in internal incubation for the past year, represent hosted versions of some technologies developed in the Connected Systems Division.  Included in this set of services are:

o Message routing – think of this as firewall friendly B2B messaging (Available now)

o Simple publish/subscribe event brokering – Pub/Sub at Internet scale (Coming soon)

o Simple federated identity and access control  (Available now)

o Workflow processes – Simple templates for cross-organization integration and the orchestration of business processes interacting with multiple services (Coming soon)

(Just had to stick 'The cloud' image in :)


Tuesday, 24 April 2007 07:31:49 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | Insights
# Sunday, 28 January 2007

I was struggling to get all the links together and the correct versions of all the accessories for .NET 3.0 RC1 (as this is what BTS requires).

I wanted to develop Workflow based solutions as well and noticed my friend Paul Andrew had read my mind in his post HERE

Here's a snip:

I have a Windows Vista RC machine and a Windows XP machine which I've installed these on. It's also the same install on Windows Server 2003 if you use that as your development environment.

Sunday, 28 January 2007 21:23:27 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [2] -
.NET Developer | BizTalk | Tips | WinWF
# Tuesday, 23 January 2007

While working on a BTS 2006 solution - I decided to use the SQL Adapter to call a stored Proc to update data.

While the SQL Adapter wizard is OK, there's no real reason to use it. I usually delete everything it creates apart from the schema, for namespace samples. The namespaces you specify through the wizard are there mainly for the SQL Adapter to figure out where the bits are for it to process, and where return results should be inserted......

In a lot of solutions I build, I usually have a single generic SQL Update Orchestration, not an Orch + Schema + Port for each type of SQL action required.

The trick to all this is how the SQL Adapter handles the messages sent to it. More details are found in the SQLXML documentation.

The paperback version:

Let's say I have two tables and a Stored Proc that I want to use within the SAME DB. (If I wanted to talk to different DBs, we'd need a separate message for each DB, because the SQL port - whether 'dynamic' or physical - is tied to a single address, e.g. SQL://ServerName/DB.)

Table A: PacMan Players
Fields -
Name, email

Table B: PacMan Scores
Fields - email, score

Stored Proc: UpdateScores
email, score, gametime

If these three were in the same DB here's the message(s) that you'd need to send to the SQL Adapter (could even be via CBR and not ALWAYS an Orch).

e.g. a sample message for stored procs.
<sqlRequest xmlns='http://micksdemos.sql'>
            <UpdateScores email='' score='54' gametime='1200' />
            <AnotherStoredProc p1='2' p2='aaa' p3='....' />
            <!-- **** set to be an ANY element here, with 'skip' processing set via the schema **** -->
</sqlRequest>

e.g. a sample message for tables (further details on this message structure can be obtained from SQLXML Documents)
<sqlRequest xmlns='http://micksdemos.sql'>
                  <PacManScores email='' score='22000' />
                  <PacManPlayers Name='mick' email='' />
</sqlRequest>

Now for the interesting thing about the results returned for the called stored procs.
We sent down batches of 400 updates to be performed via the stored proc method, and the results were surprising!!!!

We got a message back via one of the several Two-Way SQL Ports defined (each talking to a different database, being activated via CBR)

The return result was a multi-part message with 400 parts!!!!!! In this case I was waiting for the return message within an Orchestration before carrying on (mainly for BAM purposes - to capture timings, average call times etc.)

Do you know how hard it was to find an appropriate message type?? If I made a multi-part message type with 5 parts, it's not 400. If I made one with 400 parts (each part of type ANY) then I'm sure we'd get a batch in the future with 401 updates...boom! It blows up.

So my challenge was to find the appropriate message type for this return message.....needless to say, "I'm still looking".
I tried:
(1) XLANGMessage - not serializable and BTS won't compile it in the IDE. This was the most logical choice, as then I could just go through the parts grabbing each result message.
(2) XLANGPart - a long shot; it's an individual part of a message, and also the .NET type that, behind the scenes, represents a message declared as ANY.
(3) ANY - compiled and ran, but errored when the results message was returned, as the ANY type still expects a single-part message.
(4) XmlDocument - yeah right! The one where you cover your eyes, run the test and peep through your fingers at the screen to see if it worked....or more like *hoped* it worked :)

Solution: Create a simple Custom Pipeline Component to Consolidate the Return parts
The Orchestration is then free to carry on processing.
The thing that stumped me is that I sent in the batch within a single XML document, so why don't I get that back as a single response??

I'd imagine when sending a single update this problem never occurs (and it hasn't in the past).

Here's the custom pipeline component - this one's in VB.NET as per the client's coding standards on this.
(I use the VirtualStream found in the SDK)
- this is not production ready code. Further stress testing needed.

Here's a snippet showing the execute method (BTSHelper.VirtualStream - is the VirtualStream class from the BTS 2006 SDK)

#Region "IComponent Members"
Public Function Execute(ByVal pContext As IPipelineContext, ByVal pInMsg As IBaseMessage) As _
IBaseMessage Implements IComponent.Execute
        Try
                Return InternalMyExecute(pContext, pInMsg)
        Catch ex As Exception
                Throw
        End Try
End Function

Private Function InternalMyExecute(ByVal pc As IPipelineContext, ByVal inMsg As IBaseMessage) As IBaseMessage
        Dim outMsg As IBaseMessage = Nothing
        Dim outPt As IBaseMessagePart = Nothing
        Dim outStream As BTSHelper.VirtualStream = Nothing
        Dim sw As StreamWriter = Nothing
        Try
                If (inMsg.PartCount > 1) Then 'combine all the parts into one - painful return results from SQL.
                        outMsg = pc.GetMessageFactory().CreateMessage()
                        outMsg.Context = inMsg.Context
                        outPt = pc.GetMessageFactory().CreateMessagePart()
                        outStream = New BTSHelper.VirtualStream()
                        sw = New StreamWriter(outStream)
                        sw.Write("<{0}>", _documentRootElement)
                        For i As Integer = 0 To inMsg.PartCount - 1
                                Dim sptName As String = String.Empty
                                sw.Write(GetMessagePartAsString(inMsg.GetPartByIndex(i, sptName)))
                        Next
                        sw.Write("</{0}>", _documentRootElement)
                        sw.Flush() ' flush, but we DON'T want to close the stream i.e. sw.Close()
                        outStream.Seek(0, SeekOrigin.Begin)
                        outPt.Data = outStream
                        outMsg.AddPart("Body", outPt, True)
                        Return (outMsg)
                Else 'single part
                        Return (inMsg)
                End If
        Catch ex As Exception
                inMsg.SetErrorInfo(ex) ' the inMsg is the one that gets reported on in BizTalk within the pipeline
                EventLog.WriteEntry(_EVENTLOG_SOURCE, "SQL Combiner Exception Internal Execute - " _
                        & ControlChars.CrLf & ControlChars.CrLf & ex.Message, EventLogEntryType.Error)
                Throw
        End Try
End Function

Private Function GetMessagePartAsString(ByVal pt As IBaseMessagePart) As String
        Dim xdoc As XmlDocument = Nothing
        Try
                xdoc = New XmlDocument()
                xdoc.Load(pt.GetOriginalDataStream())
                Return (xdoc.DocumentElement.OuterXml)
        Catch ex As Exception
                Throw
        Finally
                xdoc = Nothing
        End Try
End Function

Public Sub CopyStream(ByVal src As Stream, ByVal dst As Stream)
        Try
                If (src.CanSeek) Then
                        src.Seek(0, SeekOrigin.Begin)
                End If
                Dim DATA_BLOCK As Integer = 4096
                Dim bytesRead As Integer = 0
                Dim buff(DATA_BLOCK - 1) As Byte

                bytesRead = src.Read(buff, 0, DATA_BLOCK)
                While (bytesRead > 0)
                        dst.Write(buff, 0, bytesRead)
                        bytesRead = src.Read(buff, 0, DATA_BLOCK)
                End While
        Catch ex As Exception
                Throw
        End Try
End Sub

Private Sub CopyMessageParts(ByVal sourceMessage As IBaseMessage, ByVal destinationMessage As IBaseMessage, ByVal newBodyPart As IBaseMessagePart)
        Dim bodyPartName As String = sourceMessage.BodyPartName

        For i As Integer = 0 To sourceMessage.PartCount - 1
                Dim partName As String = Nothing
                Dim messagePart As IBaseMessagePart = sourceMessage.GetPartByIndex(i, partName)
                If (partName <> bodyPartName) Then
                        destinationMessage.AddPart(partName, messagePart, False)
                Else
                        ' the body part gets swapped for the new consolidated one
                        destinationMessage.AddPart(bodyPartName, newBodyPart, True)
                End If
        Next
End Sub
#End Region

Grab the code from below - this sample is aimed at being something to look at and learn from rather than a 'ready-made installable package' (9.63 KB)
Tuesday, 23 January 2007 23:57:07 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [2] -
.NET Developer | BizTalk | Tips
# Wednesday, 20 December 2006
I was coming across some issues with my WCF Service Clients and not shutting down properly.
They were throwing exceptions for various reasons and while trawling the ether I came across a great helper class (and this is where I saw the c# where clause) from Erwyn van der Meer

The problem centres around calling proxy.Abort() or proxy.Close() at different stages in the client proxy's lifecycle.

Microsoft explain why we have arrived where we have on this - a great, candid discussion from the internal crew.

He discusses the problem and provides a great WCF client proxy helper class.

Here's a snippet from the Microsoft Discussion

Why does ClientBase Dispose need to throw on faulted state? (Or, what's the difference between close and abort?)

ICommunicationObject (from which ServiceHost, ClientBase, IChannel, IChannelFactory, and IChannelListener ultimately derive) has always had two methods for shutting down the object: (a) Close, and (b) Abort.  The semantics are that if you want to shutdown gracefully, call Close otherwise to shutdown ungracefully you call Abort. 


As a consequence, Close() takes a Timeout and has an async version (since it can block), and also Close() can throw Exceptions. Documented Exceptions out of Close are CommunicationException (of which CommunicationObjectFaultedException is a subclass), and TimeoutException.


Abort() conversely is not supposed to block (or throw any expected exceptions), and therefore doesn’t have a timeout or an async version.


These two concepts have held from the inception of Indigo through today. So far, so good.


In its original incarnation, ICommunicationObject : IDisposable.  As a marker interface, we thought it would be useful to notify users that they should eagerly release this object if possible. This is where the problems begin. 


Until Beta 1, we had Dispose() == Abort().  Part of the reasoning was that Dispose() should do the minimum necessary to clean up.  This was possibly our #1 complaint in Beta 1. Users would put their channel in a using() block, and any cached messages waiting to be flushed would get dropped on the floor. Transactions wouldn’t get committed, sessions would get ACKed, etc.


Because of this feedback, in Beta 2 we changed our behavior to have Dispose() ~= Close(). We knew that throwing causes issues (some of which are noted on this thread), so we made Dispose try to be “smart”. That is, if we were not in the Opened state, we would under the covers call Abort(). This has its own set of issues, the topmost being that you can’t reason about the system from a reliability perspective. Dispose can still throw, but it won’t _always_ notify you that something went wrong.  Ultimately we made the decision that we needed to remove IDisposable from ICommunicationObject.  After much debate, IDisposable was left on ServiceHost and ClientBase, the theory being that for many users, it’s ok if Dispose throws, they still prefer the convenience of using(), and the marker that it should be eagerly cleaned up.  You can argue (and some of us did) that we should have removed it from those two classes as well, but for good or for ill we have landed where we have. It’s an area where you will never get full agreement, so we need to espouse best practices in our SDK samples, which is the try{Close}/catch{Abort} paradigm.


Brian McNamara [MSFT]
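
The try{Close}/catch{Abort} paradigm Brian mentions looks like this in practice (a sketch only - MyServiceClient is a hypothetical generated proxy, not from the post above):

```csharp
// MyServiceClient is a hypothetical ClientBase<T>-derived proxy.
MyServiceClient client = new MyServiceClient();
try
{
    client.DoWork();
    client.Close();    // may throw CommunicationException or TimeoutException
}
catch (CommunicationException)
{
    client.Abort();    // Abort never blocks and throws no expected exceptions
}
catch (TimeoutException)
{
    client.Abort();
}
```

The point is that Close() does the graceful shutdown (flushing messages, ACKing sessions) and can fail, while Abort() is the guaranteed-cheap cleanup path when it does.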

Wednesday, 20 December 2006 01:23:20 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | BizTalk
Came across some interesting code the other day and I must admit I hadn't seen the Where clause used like this before:
using System;

class MyClassy<T, U>
    where T : class   // T must be a reference type
    where U : struct  // U must be a value type
{
}
And according to the Microsoft Definition found here

where (C# Reference) 

The where clause is used to specify constraints on the types that can be used as arguments for a type parameter defined in a generic declaration. For example, you can declare a generic class, MyGenericClass, such that the type parameter T implements the IComparable<T> interface:

public class MyGenericClass<T> where T : IComparable<T> { }
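
To make the constraint concrete, here's a minimal sketch (Max is a made-up helper for illustration, not from the docs):

```csharp
using System;

static class Demo
{
    // The IComparable<T> constraint is what lets us call CompareTo inside the method.
    public static T Max<T>(T a, T b) where T : IComparable<T>
    {
        return a.CompareTo(b) >= 0 ? a : b;
    }
}

// Demo.Max(3, 7) returns 7; Demo.Max("apple", "pear") returns "pear"
```

Without the constraint, the compiler has no way of knowing T has a CompareTo method and the call won't compile.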

Wednesday, 20 December 2006 01:10:17 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer
# Sunday, 03 December 2006

Lots of cool things about R2 in the pipeline and one of them is BizTalk's deep integration with WCF.

What does that mean for you? If you haven't already, start looking at WCF, as going forward I believe the majority of BizTalk solutions will incorporate WCF in a big way.

I'm currently writing a course for R2/WCF/WF stuff and I thought I'd jot down a table so WCF things stick in my mind.

WCF Endpoint = Address + Contract + Binding
WCF Binding = Transport + Message Encoding
WCF Contract = Message/Data details
WCF Address = <moniker>://<server>:<Port>/<endpoint URI>

Binding Types

  • BasicHttpBinding - Maximum interoperability through conformity to the WS-I Basic Profile 1.1
  • WSHttpBinding - HTTP communication in conformity to the WS-* protocols.
  • WSDualHttpBinding - Duplex HTTP communication, by which the receiver of an initial message will not reply directly to the initial sender, but may transmit any number of responses via HTTP in conformity to WS-* protocols.
  • WSFederationHttpBinding - HTTP communication, in which access to the resources of a service can be controlled based on credentials issued by an explicitly-identified credential provider.
  • NetTcpBinding - Secure, reliable, high-performance communication between WCF software components across the network.
  • NetNamedPipeBinding - Secure, reliable, high-performance communication between WCF software components on the same machine.
  • NetMsmqBinding - WCF-to-WCF communication over MSMQ
  • MsmqIntegrationBinding - Communication between a WCF endpoint and an existing (non-WCF) MSMQ application
  • NetPeerTcpBinding - WCF entities communicating via Windows Peer-To-Peer services

(ones that are missing at the moment - 'Interprocess Communication Binding' IPC, was in the early betas.....and SQL Server Service Broker)
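
Putting the Endpoint = Address + Contract + Binding formula into code - a rough sketch of wiring up a client endpoint by hand (ICalculator is a hypothetical contract and the address is made up):

```csharp
using System.ServiceModel;

// Hypothetical service contract
[ServiceContract]
public interface ICalculator
{
    [OperationContract]
    int Add(int a, int b);
}

class EndpointDemo
{
    static void Main()
    {
        // Endpoint = Address + Contract + Binding
        BasicHttpBinding binding = new BasicHttpBinding();     // Binding: HTTP transport + text encoding
        EndpointAddress address =
            new EndpointAddress("http://localhost:8080/calc"); // Address: moniker://server:port/URI
        ChannelFactory<ICalculator> factory =
            new ChannelFactory<ICalculator>(binding, address); // Contract: ICalculator
        ICalculator proxy = factory.CreateChannel();
        // proxy.Add(1, 2) would now go over the wire
    }
}
```

Swap the binding (say, to NetTcpBinding) and the same contract and code shape works over a completely different transport - that's the whole idea.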

Sunday, 03 December 2006 23:29:30 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [3] -
.NET Developer | BizTalk
# Thursday, 09 November 2006
Finally some of the pain of the different products and their versions has gone away. :-)
Thursday, 09 November 2006 10:54:59 (AUS Eastern Daylight Time, UTC+11:00)  #    Comments [0] -
.NET Developer | WinWF
# Sunday, 22 October 2006

Hi folks - as you all know it's about Connected Systems - not necessarily about one technology on its own.

I'm a firm believer that we're always trying to solve a customer's problem/solution which will involve more than just BizTalk.

In our 'BizTalk' space now (with R2 TAP on the way), we have technologies such as:

  1. BizTalk 2006
  2. RFID
  3. WCF
  4. WinWF
  5. SSB
  6. SSIS
  7. All the LOB adapters from BizTalk 2006
  8. MOSS 2007
  9. MSMQ/MQSeries etc.

So as 'integration specialists' we need to know not only how these work and the benefits of each in certain environments, but also how to create an effective solution with these technologies. (not something like - "I believe you can do that in .....I just need to watch some webcasts on it first" :)

The Sydney BizTalk User Group has launched a Connected Systems Mailing list.

How to JOIN:
1. send an email to with
in the BODY of the message (you can put anything for the SUBJECT, or leave it blank)

So come and join my one other friend to kick this off. :)

How to UNJOIN:
1. send an email to with


Sunday, 22 October 2006 15:03:28 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk | RFID | Events | Office | Tips | WinWF
# Friday, 06 October 2006

Just got an email from Andrew (great all round good guy) about a cool Adapter example he came across.....very nice.

Enabling Faxing of messages from BTS using the Win2K3 FaxServices API and the Office2003 Document Imaging Library.

Enough said - BizTalk Fax Adapter Project

Nice work!

------------- snippet from the Project Page --------------
What the BTS Fax Adapter Does

When a FaxMessage arrives in the Incoming Archive, the Fax Adapter copies the Tiff image (FaxMessage) to a temporary folder, runs OCR on it, extracts the text and submits it to BizTalk as a message - or takes messages from BizTalk Server and sends them to the FaxConsole. It provides code to build either a dynamic or a static adapter; however, the following procedure only outlines the static adapter. A static adapter is an adapter with a static set of schemas and no custom user interface. A dynamic adapter has a custom user interface and potentially a dynamic set of schemas. Both static and dynamic adapters use the Add Adapter Wizard to add their schemas to a BizTalk project.

Friday, 06 October 2006 12:04:00 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [0] -
.NET Developer | BizTalk
# Wednesday, 09 August 2006
While on a current project, I needed to go off to a WebServer and grab a page, parse it to make sure all was alive and kicking (a system verification page).

Connected and ended up getting -

The adapter failed to transmit message going to send port "http://<server>/<site>/". It will be retransmitted after the retry interval specified for this Send Port. Details:"The remote server returned an error: (505) Http Version Not Supported.".


The weird thing is, that when BTS retries (in 5 mins) - it works!!! No good for production!

After doing some digging, the problem lies around the fact that .NET 1.1 (in my case) clients use HTTP 1.1 by default.
There is a HTTP Header "Expect: 100-Continue" that is inserted into the outgoing webrequests.

The JBOSS WebServer grumbles about this - more info.

Now from BTS we can get access to the HTTP/SOAP Headers Collection - msg(HTTP.InputHeaders)=....

Add this setting to the BizTalk Config file (BTSNTSvc.exe.config) - note the servicePointManager element lives under system.net/settings:

    <configuration>
      <system.net>
        <settings>
          <servicePointManager checkCertificateName="true" checkCertificateRevocationList="false" expect100Continue="false" />
        </settings>
      </system.net>
    </configuration>

NOTE: this affects ALL outgoing HTTP requests (including SOAP).
To pick and choose which addresses it applies to, you'd most probably have to write
a webRequestModule for those addresses. That's a bit of code, but it lets you either toggle this setting per request or pass the original request through
to the default HttpWebRequestModule (check out machine.config for more).

The config approach is by far the most elegant - there are other ways to do this, but they all seem offensive!
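
If you'd rather flip the switch in code than in config, ServicePointManager exposes the same knob. A sketch (the URI is made up; the per-ServicePoint property is a .NET 2.0 thing, while the static one covers the whole process):

```csharp
using System;
using System.Net;

class HeaderFix
{
    static void Main()
    {
        // Process-wide: stop sending "Expect: 100-Continue" on all requests.
        ServicePointManager.Expect100Continue = false;

        // Per-URI (hypothetical address; requires .NET 2.0+):
        ServicePoint sp = ServicePointManager.FindServicePoint(
            new Uri("http://jboss-server/verify"));
        sp.Expect100Continue = false;
    }
}
```

Handy when you only want to appease the one grumpy JBOSS box rather than change behaviour for every send port.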



Wednesday, 09 August 2006 11:40:02 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [1] -
.NET Developer | BizTalk
# Friday, 28 July 2006

Here's a list that I've been putting together through the whole beta/RC cycle of the various bits for
Microsoft Office SharePoint Server (or Windows SharePoint Services v3).

Note: Individual components have progressed in their own realm but until the next release of
Office 2007 BXXX (or RC 1), then these are the compatible respective components.

In the end - all the products and components will be aligned with each other's current version.
Until then we have to crawl along a little at a time. Try doing 14 months of this.....

Time to get cracking......:)

WinFX Runtime Beta 2

Contains APIs for working with Office Open XML file format.

Windows SDK for WinFX Beta 2

Microsoft Visual Studio Code Name “Orcas” Community Technology Preview – Development Tools for WinFX

Windows Workflow Foundation Beta 2.2

Windows Workflow Foundation Runtime.

Visual Studio 2005 Extensions for Workflow Foundation Beta 2.2

Tools to enable you to develop workflows.  Note you also need “2007 Office System Starter Kit: ECM (Beta 2)” in order to develop workflows for WSS/MOSS 2007.  You need to run Visual Studio on the SharePoint box itself.

MOSS 2007 Beta 2 SDK

WSS v3 Beta 2 SDK

MOSS 2007 Beta 2 Developer Samples

2007 Office System Starter Kit: ECM (Beta 2)

This Starter Kit for 2007 Office System (Beta 2) contains ECM feature extension code samples, supplemental developer white papers, and Visual Studio project templates for workflow in Office SharePoint Server 2007 (Beta 2).

VSTO 'v3' June CTP for Office 2007 Beta 2

Preview of the Visual Studio Tools for Office ‘v3’ that will ship with ‘Orcas’

Friday, 28 July 2006 10:36:39 (AUS Eastern Standard Time, UTC+10:00)  #    Comments [3] -
.NET Developer | Office