# Tuesday, January 25, 2011

I am finding that the development storage emulator has a few “undocumented features”. A few days ago, I was happily working through the Windows Azure Training Kit and things were going well. Today I was putting together a PoC using pieces learnt from the labs. I kept hitting a problem when trying to insert an entity into the newly created table storage (running on the local Storage Emulator). I was getting the generic error message when querying the collection:

“one of the request inputs is not valid”

var match = (from c in this.context.Clients
              where c.Name == name
              select c).FirstOrDefault();


Some things I tried that didn’t help:

  • Restarting the Storage Emulator a few times.
  • Restarting the machine (always worth a shot!)
  • Deleting the entries in the TableContainer and TableRow tables in the development storage DB.
  • Recreating the development storage DB using DSINIT /forceCreate.
  • Running around the office naked.

After hunting around for quite some time (including running the lab code again and getting the same result), I tracked it down to the table storage schema not being created after issuing the table creation call.


Note: This all worked happily when I was working through the labs a few days ago. Now neither my code nor the lab code was working. Annoying.

Looking at the underlying DB (using SQLEXPRESS on my VM), I found no schema populated.


After some frustrating searching, I came across this post that suggested an ugly workaround > azure-table-storage-what-a-pain-in-the-ass. It suggests that on the local storage emulator you need to “convince” the table service provider that you know what you are doing by inserting some dummy entities. This only appears to be necessary when you have no data in your tables. So I added the following code to my data source constructor so it gets called by my service before performing any CRUD operations.

// [WORK AROUND] See http://deeperdesign.wordpress.com/2010/03/10/azure-table-storage-what-a-pain-in-the-ass/
//  Generate some inserts to populate the empty tables
var client = new Client("dummy", "dummy");
this.context.AddObject("Clients", client);
var post = new Post("dummy", "dummy");
this.context.AddObject("Posts", post);
this.context.SaveChanges(); // push the dummy inserts so the schema gets generated

Ugly, but it jumps the hurdle and lets me get back to building out the rest of the solution. Just remember to comment it back out after you verify the schema has been populated successfully and your CRUD operations are going through.
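If you would rather not rely on remembering to comment it back out, one option is to guard the workaround so it only ever runs against the local storage emulator and tidies up after itself. A rough sketch under my own assumptions (the EnsureTableSchema name and the this.account field are illustrative, not part of the training kit code):

```csharp
// Sketch only: run the dummy-insert workaround against the development
// storage emulator and remove the dummy row again afterwards.
// "this.account", "this.context" and the entity/table names are assumptions.
private void EnsureTableSchema()
{
    // Only the local storage emulator needs convincing
    if (!this.account.TableEndpoint.Equals(
            CloudStorageAccount.DevelopmentStorageAccount.TableEndpoint))
    {
        return;
    }

    var client = new Client("dummy", "dummy");
    this.context.AddObject("Clients", client);
    this.context.SaveChanges();   // forces the table schema to be generated

    this.context.DeleteObject(client);
    this.context.SaveChanges();   // tidy up the dummy row
}
```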

Tuesday, January 25, 2011 1:53:33 PM (AUS Eastern Daylight Time, UTC+11:00)
Windows Azure
# Tuesday, November 30, 2010
HTTP Error 401.1 - Unauthorized: Access is denied due to invalid credentials.

"If I had a dollar for every time I’ve seen this…”

And yet the solution appears to be different each time, at least for me when it comes to issues with Integrated Windows Authentication and Kerberos. Today the solution lay in forcing IIS to use NTLM authentication, as suggested by the following KB article:


To work around this behaviour if you have multiple application pools that run under different domain user accounts, you must force IIS to use NTLM as your authentication mechanism if you want to use Integrated Windows authentication only. To do this, follow these steps on the server that is running IIS:

  1. Start a command prompt.
  2. Locate and then change to the directory that contains the Adsutil.vbs file. By default, this directory is C:\Inetpub\Adminscripts.
  3. Type the following command, and then press ENTER:

    cscript adsutil.vbs set w3svc/NTAuthenticationProviders "NTLM"

  4. To verify that the NtAuthenticationProviders metabase property is set to NTLM, type the following command, and then press ENTER:

    cscript adsutil.vbs get w3svc/NTAuthenticationProviders

    The following text should be returned:

    NTAuthenticationProviders       : (STRING) "NTLM"
Tuesday, November 30, 2010 8:49:29 PM (AUS Eastern Daylight Time, UTC+11:00)
BizTalk General
# Monday, October 18, 2010

On a recent project I needed to resolve the identity of clients calling an orchestration exposed as a WCF service. Clients would use an X.509 certificate to sign the message. Configuring the WCF service was easy enough, but I was not getting the party resolution piece working correctly. The WCF adapter (I was using WCF-CustomIsolated) was not populating the context property (BTS.SignatureCertificate) that the party resolution component uses to look up the party, even though the client certificate was being validated. The WCF adapter was dumping the SOAP headers into the context, so I was left either to parse the headers manually and find a way to grab details of the signing certificate, or to somehow get the WCF adapter to do this work for me (as it was already validating the client certificate and checking we had the corresponding public key in the certificate store). Fortunately, I found a way to get the adapter to help out.

The solution was to create a WCF service behavior extension to intercept message processing by the adapter (note this takes place before the message is presented to the receive pipeline). The custom behavior looks for a client certificate and, if found, writes the thumbprint into a custom SOAP header. The WCF adapter would then write my custom header into the message context and I could grab it in a custom pipeline component. I chose to write a component to execute before the OOTB party resolution component and populate the BTS.SignatureCertificate context property with the value of the certificate thumbprint. I could have done this all in one component and performed custom party resolution, but thought this might be a bit cleaner.

So looking at the WCF service behavior

using System;
using System.ServiceModel;
using System.ServiceModel.Dispatcher;
using System.ServiceModel.Channels;
using System.ServiceModel.Description;
using System.IdentityModel.Claims;
using System.ServiceModel.Configuration;

namespace Breeze.WCF.ClientCertificateContext
{
    public class MessageInspector : IDispatchMessageInspector, IServiceBehavior
    {
        #region IDispatchMessageInspector Members

        object IDispatchMessageInspector.AfterReceiveRequest(ref Message request, IClientChannel channel, InstanceContext instanceContext)
        {
            object correlationState = null;
            string thumbprint = "";

            try
            {
                // Gather thumbprint of signing certificate used by the client
                foreach (ClaimSet set in request.Properties.Security.ServiceSecurityContext.AuthorizationContext.ClaimSets)
                {
                    foreach (Claim claim in set.FindClaims(ClaimTypes.Thumbprint, Rights.Identity))
                    {
                        thumbprint = BitConverter.ToString((byte[])claim.Resource);
                        thumbprint = thumbprint.Replace("-", "");
                    }
                }

                // Write this away as a custom message header
                if (!String.IsNullOrEmpty(thumbprint))
                {
                    MessageHeader header = MessageHeader.CreateHeader("ClientCertificate", "http://schemas.breeze.net/BizTalk/WCF-properties", thumbprint);
                    request.Headers.Add(header);
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.EventLog.WriteEntry("WCF MessageInspector", String.Format("Exception caught: {0}", ex.ToString()));
            }

            return correlationState;
        }

        void IDispatchMessageInspector.BeforeSendReply(ref Message reply, object correlationState)
        {
        }

        #endregion

        #region IServiceBehavior Members

        public void AddBindingParameters(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase,
                                         System.Collections.ObjectModel.Collection<ServiceEndpoint> endpoints, BindingParameterCollection bindingParameters)
        {
        }

        public void ApplyDispatchBehavior(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase)
        {
            foreach (ChannelDispatcher channelDispatcher in serviceHostBase.ChannelDispatchers)
            {
                foreach (EndpointDispatcher endpointDispatcher in channelDispatcher.Endpoints)
                {
                    endpointDispatcher.DispatchRuntime.MessageInspectors.Add(this);
                }
            }
        }

        public void Validate(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase)
        {
        }

        #endregion
    }

    public class MessageInspectorElement : BehaviorExtensionElement
    {
        public override Type BehaviorType
        {
            get { return typeof(MessageInspector); }
        }

        protected override object CreateBehavior()
        {
            return new MessageInspector();
        }
    }
}


Tip: Don't forget to implement the BehaviorExtensionElement. You’ll need it to apply the service behavior via configuration (in the receive location) rather than having to do it programmatically. You will also need to sign, GAC and register the service behavior extension element in the machine.config (or the service’s web.config in IIS).

With the WCF service behavior bits done, we need to add it to our receive location:
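The screenshot of that step did not survive, but as a rough sketch the registration looks something like the fragment below. The extension name, behavior name, Version and PublicKeyToken are illustrative placeholders; the assembly-qualified type name must match your own signed, GAC'd assembly:

```xml
<system.serviceModel>
  <extensions>
    <behaviorExtensions>
      <!-- Registers MessageInspectorElement; Version/PublicKeyToken below are placeholders -->
      <add name="clientCertificateContext"
           type="Breeze.WCF.ClientCertificateContext.MessageInspectorElement, Breeze.WCF.ClientCertificateContext, Version=1.0.0.0, Culture=neutral, PublicKeyToken=YourTokenHere" />
    </behaviorExtensions>
  </extensions>
  <behaviors>
    <serviceBehaviors>
      <behavior name="ClientCertificateContextBehavior">
        <!-- The custom behavior from this post -->
        <clientCertificateContext />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>
```

The receive location's WCF-CustomIsolated configuration then points at this service behavior.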


If you were to test the solution now, you’d get the thumbprint of the client certificate written to your custom context property (http://schemas.breeze.net/BizTalk/WCF-properties#ClientCertificate), and it will look something like this:

<ClientCertificate xmlns="http://schemas.breeze.net/BizTalk/WCF-properties">11C3E164C41ADC8DBA0EA6558784B9FAE19E398D</ClientCertificate>


I had thought I might be able to get away with writing this directly into the BTS.SignatureCertificate context property, but the format is clearly different: the BTS.SignatureCertificate property needs just the certificate thumbprint string, and obviously we have the XML wrapper. So we must create a simple pipeline component to sit somewhere before the party resolution component, grab the certificate thumbprint out of our custom context property, and write it into the context property the party resolver component is looking for.
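The original listing for that component is gone, but the core of its Execute method would look roughly like this (a sketch only: the IComponent plumbing is omitted, and the XML-unwrapping approach is my assumption about how to get from the wrapped header value to the bare thumbprint):

```csharp
// Sketch: copy the thumbprint from the custom context property into
// BTS.SignatureCertificate so the party resolution component can use it.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    object raw = pInMsg.Context.Read(
        "ClientCertificate", "http://schemas.breeze.net/BizTalk/WCF-properties");

    if (raw != null)
    {
        // The adapter stores the full header XML, so pull out the inner text
        XmlDocument doc = new XmlDocument();
        doc.LoadXml(raw.ToString());
        string thumbprint = doc.DocumentElement.InnerText;

        pInMsg.Context.Write(
            "SignatureCertificate",
            "http://schemas.microsoft.com/BizTalk/2003/system-properties",
            thumbprint);
    }

    return pInMsg;
}
```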


After deploying and setting the receive pipeline to use the custom one above, I got party resolution working like a bought one, with the BTS.SignatureCertificate, BTS.SourcePartyID and MessageTracking.PartyName context properties populated.

I guess I was a little surprised that all this was needed. WCF does a great job of abstracting out all the transport and security bits and moving them to configuration time (no additional code in our service or client). In the HTTP and SOAP adapter days, the MIME/SMIME pipeline component was used to decrypt and validate the signing certificate as well as populate the required context properties. Why doesn’t the WCF adapter perform this part in the same way? I mean, it’s already doing the decoding, decrypting and certificate validation, so why not the populating of these context properties? Perhaps there is a secret squirrel checkbox somewhere I missed. I’d love to hear comments if anyone has done this differently.

[Updated: 19-10-2010]

Thanks to Thiago (see comments section) we have been able to simplify this further. The WCF adapter provides some “special” namespaces that allow us to instruct the adapter to write context properties in a more controlled way. Specifically, we can instruct the adapter to write directly into defined property schema elements (e.g. OOTB BizTalk property schemas or deployed custom property schemas). This allows us to write the certificate thumbprint directly into the BTS.SignatureCertificate context property and avoid the need for the custom pipeline component to move the value from the custom header property into the BTS.SignatureCertificate property as described above.

To do this we simply change the IDispatchMessageInspector.AfterReceiveRequest to make use of these special namespaces.

                // Write the thumbprint directly to the BTS.SignatureCertificate context property
                //  Thanks to Thiago http://connectedthoughts.wordpress.com
                //  (requires using System.Xml and System.Collections.Generic)
                if (!String.IsNullOrEmpty(thumbprint))
                {
                    // Create a collection of context properties we want the adapter to write/promote for us
                    XmlQualifiedName clientCertificateProp =
                        new XmlQualifiedName("SignatureCertificate", "http://schemas.microsoft.com/BizTalk/2003/system-properties"); // Maps to BTS.SignatureCertificate
                    List<KeyValuePair<XmlQualifiedName, object>> promoteProps = new List<KeyValuePair<XmlQualifiedName, object>>();
                    promoteProps.Add(new KeyValuePair<XmlQualifiedName, object>(clientCertificateProp, thumbprint));

                    // Add the property collection to the request.
                    //  Use the http://..../Promote namespace to have the adapter promote the context property,
                    //  or http://..../WriteToContext to have the property written but not promoted.
                    request.Properties.Add("http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties/Promote", promoteProps);
                }


Now we can do away with the custom pipeline component bits and use the OOTB XMLReceive pipeline (as it contains the party resolver component already). The certificate thumbprint will be written directly into the BTS.SignatureCertificate context property (and promoted), ready for the party resolver component to use.

Nice work, Thiago!

Monday, October 18, 2010 11:50:22 AM (AUS Eastern Standard Time, UTC+10:00)
BizTalk General | WCF
# Friday, July 30, 2010


Hold the phone! Didn't I just pull that folder up in Explorer?


Ah…good old File System Redirector. Turns out that on x64 machines this folder is located here:


We can browse to this location when adding references to the Windows Server AppFabric assemblies to our VS 2010 projects.


All is well again in the universe

Friday, July 30, 2010 4:41:06 PM (AUS Eastern Standard Time, UTC+10:00)
VS 2010 | Windows Server AppFabric
# Friday, July 09, 2010


Of course the official site can be found here > http://australia.msteched.com/

Friday, July 09, 2010 3:42:52 PM (AUS Eastern Standard Time, UTC+10:00)

I have been avoiding this for some time now. That is, adding new activity items to the current BAM deployment in production. Production has been running for months now, and in this high-volume system we partition the BAM activities every week and archive each month (giving the client a rolling month’s worth of activity data). I was concerned that during the update of the BAM definition this data was going to be blown away (an experience that has caused much embarrassment in the past).

So the procedure I used this time did the trick…well almost

  • Took a “backup” of the current BAM definition using BM.exe

    bm.exe get-config -FileName:MyConfig.xml

  • Added the new activity items using Excel and edited the views
  • Exported the new BAM definition to xml
  • Removed the existing views using BM.exe

    bm.exe remove-view -Name:MyView

  • Deployed the new definition using BM.exe and the update-all command – FAILED

    bm.exe update-all -DefinitionFile:MyNewDef.xml
    The error message in the command window was:
    All queries combined using a UNION, INTERSECT or EXCEPT operator must have an 
    equal number of expressions in their target lists.
    Upon investigation, I found that the partition tables did not get updated with the new activity items. As the view spans both the current activity tables and all the partition tables the view creation failed. Interestingly, the BAM Archive tables did get updated.

  • “Upgraded” the partition tables using the script from this blog post

    I did need to make a slight change to avoid some errors that cropped up with partition tables that had already been archived and so were no longer present in the BAMPrimaryImport database (although the original script works).

    I changed the CURSOR definition to filter out those tables already archived:

    DECLARE partition_cursor CURSOR LOCAL FOR
    SELECT InstancesTable
    FROM [dbo].[bam_Metadata_Partitions]
    WHERE ActivityName = @activityName
    AND ArchivedTime Is Null -- Added additional filter
    ORDER BY CreationTime ASC
  • Deployed the new definition again using BM.exe and the update-all command – SUCCEEDED
  • Re-applied security to the Views using BM.exe

    bm.exe add-account -AccountName:TheStig -View:MyView

Unfortunately, all my BAM alerts got blown away. This makes sense, as the alerts reference the view that was removed. Luckily, taking the backup in step one allowed me to pull out the original alert definitions and paste them into my new definition file. I re-deployed that using the update-all command and the alerts are back to normal.

I did come across this KB 969558 article for BTS 2006 R2 that appeared to address the partition tables issue. It looks as though this did not make it into BTS 2009.

Friday, July 09, 2010 3:12:01 PM (AUS Eastern Standard Time, UTC+10:00)
BizTalk General
# Friday, July 02, 2010

Just lost a couple of hours I will never get back…

I am in the process of upgrading one of our RFID applications to 2010 beta. The upgrade went through without a hitch and I found myself the proud owner (or at least caretaker) of a brand-spanking new BizTalk RFID Server 2010.

rfid manager 2010

(No visible differences to BizTalk RFID Server 2009 – except for the icon)

I then proceeded to upgrade my Visual Studio 2008 projects to VS 2010. WOW!!! Build succeeded first time…I am on a roll baby!

So I imported my Process into RFID Manager and fired her up…

rfid 2010 error starting process

The WCF service was not responding. So I opened up IIS Manager (my VM is Windows Server 2008, so I’m running IIS 7) and tried to browse to the hosting.svc under my RFID process VD, and I got an HTTP 401.3 error. Where do I start to hunt this down? I have installed .NET Framework 4.0, VS 2010, BizTalk RFID Server 2010 (plus Silverlight 4, Windows Azure platform AppFabric 1.0, …).

So I check:

  • Application pool identities are correct – Tick
  • Application pool framework version – v4.0 Classic mode
  • Authentication set to Anonymous – Tick
  • Anonymous user set to application pool identity - Tick
  • NTFS permissions on the VD - Tick
  • Test Pass-through authentication – Green Lights
  • IISRESET (last desperate attempts of a beaten Sco)

Still no go. I get on the “bat-phone” to Ninja Mick and explain what's happening. He just laughs: ”What! Another 401 error. That’s about the sixth this week”. He points me off to a KB article about disabling the loopback check (I am using host headers on my RFID Services site). Still no go, but it hinted at the problem: host headers.

Back into RFID Manager and look at the Advanced Server Properties

rfid manager advanced props

Even though these settings are technically correct, and BizTalk RFID Server creates the provider and process services without error, I needed to set the Host IP address to the host header name I am using. This appears to have changed somewhere between BizTalk RFID Server 2009 and 2010, or with a change to IIS, a Windows update, .NET 4.0… your guess is as good as mine.

After setting this to the host header name, all worked fine and I’m back to sending my tag read events into BizTalk RFID Server 2010.

Friday, July 02, 2010 11:21:47 PM (AUS Eastern Standard Time, UTC+10:00)
BizTalk RFID

During the CTP(s) one way we could provide client credentials to authenticate with the Service Bus was to use username and password. We were writing code that looked something like this:

// create the credentials object for the endpoint
TransportClientEndpointBehavior userNamePasswordServiceBusCredential = new TransportClientEndpointBehavior();
userNamePasswordServiceBusCredential.CredentialType = TransportClientCredentialType.UserNamePassword;
userNamePasswordServiceBusCredential.Credentials.UserName.UserName = this.solution;
userNamePasswordServiceBusCredential.Credentials.UserName.Password = this.password;

The UserNamePassword credential type is no longer supported in the v1.0 release. The supported client credential types are now:

  • Saml: This option specifies that the client credential is provided in the Security Assertion Markup Language (SAML) format, over the Secure Sockets Layer protocol. This option requires that you write your own SSL credential server.
  • SharedSecret: This option specifies that the client credential is provided as a self-issued shared secret that is registered with AppFabric Access Control through the AppFabric portal. This option requires no additional settings on the Credentials property.
  • SimpleWebToken: This option specifies that the client credential is provided as a self-issued shared secret that is registered with AppFabric Access Control through the AppFabric portal, and presented in the emerging industry-standard format called simple Web token (SWT). Similar to the shared secret option, this option requires no additional settings on the Credentials property.
  • Unauthenticated: This option specifies that there is no client credential provided. This option avoids acquiring and sending a token. It is used by clients that are not required to authenticate, based on the policy of their service binding. Note that this setting might leave data nonsecure if not used together with another security measure.

[See Choosing Authentication for an AppFabric Service Bus Application for more details]

So where we used to use UserNamePassword, the corresponding credential type is now SharedSecret (although you should check out the other authentication options as well). Here is what your code now looks like:

// create the credentials object for the endpoint
TransportClientEndpointBehavior sharedSecretServiceBusCredential = new TransportClientEndpointBehavior();
sharedSecretServiceBusCredential.CredentialType = TransportClientCredentialType.SharedSecret;
sharedSecretServiceBusCredential.Credentials.SharedSecret.IssuerName = this.issuer;
sharedSecretServiceBusCredential.Credentials.SharedSecret.IssuerSecret = this.secret;

A default issuer name and secret are generated for you when you activate your AppFabric account. Create additional ones using the Windows Azure platform AppFabric Access Control Management Tool (Acm.exe), which is installed as part of the SDK.
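For completeness, the behavior then gets attached to the client endpoint much as it did during the CTPs. A sketch only: the service namespace, path and IEchoChannel contract below are placeholders from a hypothetical test harness, not anything the SDK mandates:

```csharp
// Sketch: wire the shared-secret credential behavior onto a relay endpoint.
// "yourServiceNamespace", "EchoService" and IEchoChannel are placeholders.
Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", "yourServiceNamespace", "EchoService");

ChannelFactory<IEchoChannel> channelFactory = new ChannelFactory<IEchoChannel>(
    new NetTcpRelayBinding(), new EndpointAddress(serviceUri));

// Attach the TransportClientEndpointBehavior created above
channelFactory.Endpoint.Behaviors.Add(sharedSecretServiceBusCredential);

IEchoChannel channel = channelFactory.CreateChannel();
channel.Open();
```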

Friday, July 02, 2010 2:39:57 PM (AUS Eastern Standard Time, UTC+10:00)

# Thursday, July 01, 2010

I am getting a few reports that after a recent windows update (or installing .NET Framework 4.0) the ESSO service fails to restart. Microsoft have released a hotfix to address this (http://support.microsoft.com/kb/2252691)

Microsoft Reports:

This issue occurs after installing .NET Framework 4.0. The registration of the assembly used by ENTSSO to access SQL Server does not specify the correct version of the .NET Framework. When .NET Framework 4.0 is installed, the assembly will try to use the newer framework and then fail to load.

To resolve this manually:

32-bit Server

  1. Open a command window.
  2. Go to C:\Windows\Microsoft.NET\Framework\v2.0.50727.
  3. Type: regasm "C:\Program Files\Common Files\Enterprise Single Sign-On\ssosql.dll"

64-bit Server

  1. Open a command window.
  2. Go to C:\Windows\Microsoft.NET\Framework64\v2.0.50727.
  3. Type each of the following and press ENTER:

    32-bit: regasm "C:\Program Files\Common Files\Enterprise Single Sign-On\win32\ssosql.dll"
    64-bit: regasm "C:\Program Files\Common Files\Enterprise Single Sign-On\ssosql.dll"

On a 64-bit server, regasm will need to be run for both the 32-bit and 64-bit versions of ssosql.dll.

Hope this helps!

Thursday, July 01, 2010 3:05:59 PM (AUS Eastern Standard Time, UTC+10:00)
.NET Framework | BizTalk General
# Tuesday, June 29, 2010

Setting up a new VM and hit this hurdle. Luckily for me, Microsoft just released v1.2 of the SDK, which supports VS 2010 and .NET 4.0.

Windows Azure SDK 1.2

Also, if you are using Windows Azure AppFabric and .NET 4.0, make sure the relay bindings are installed into the .NET 4.0 machine.config (this is not done by the installer). Check out Wade’s post on how to go about it easily.

Keep those heads in the cloud

Tuesday, June 29, 2010 10:33:41 PM (AUS Eastern Standard Time, UTC+10:00)
Windows Azure

In my last post, I was exploring a different approach to implementing a simple windows service. In that solution, I used Windows Workflow to implement a folder monitor service. Part of the solution required me to gather target folder details from configuration. I needed more than the out-of-the-box appSettings collection of key-value pairs. Normally I would use a second xml configuration file and basically deserialise that into my custom collection class. However, I was in exploration mode so I decided to give the System.Configuration classes another go.

I wanted the client to easily manage the target folders to monitor and customise the alert that gets logged in BAM (see the last post). I wanted something like this:

<folderMonitor targetFolder="E:\Data\BizTalk\Monitor1" fileMask="*.xml" timerInterval="1" processingInterval="2">
  <alert alertInterval="5" sender="BIZTALK" destination="LOB1" messageType="Order" errorType="Folder Monitor" errorCode="OFFLINE"/>
</folderMonitor>
<folderMonitor targetFolder="E:\Data\BizTalk\Monitor2" fileMask="*.xml" timerInterval="1" processingInterval="2">
  <alert alertInterval="5" sender="BIZTALK" destination="LOB2" messageType="Invoice" errorType="Folder Monitor" errorCode="OFFLINE"/>
</folderMonitor>

I was surprised by the amount of code I needed to write. Basically you have to:

  1. Define a ConfigurationSection
  2. Define the ConfigurationElementCollection and implement the indexers and Add/Remove/Clear methods
  3. Define the actual ConfigurationElement (the folder monitor structure you see above)
  4. Declare the configuration section in the app.config

Here is the source code   

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Configuration;

namespace Breeze.BizTalk.WorkflowLibrary
{
    public class FolderMonitorSection : ConfigurationSection
    {
        #region Static Accessors

        /// <summary>
        /// Gets the configuration section using the default element name.
        /// </summary>
        public static FolderMonitorSection GetSection()
        {
            return GetSection("folderMonitorConfig");
        }

        /// <summary>
        /// Gets the configuration section using the specified element name.
        /// </summary>
        public static FolderMonitorSection GetSection(string sectionName)
        {
            FolderMonitorSection section = ConfigurationManager.GetSection(sectionName) as FolderMonitorSection;
            if (section == null)
            {
                string message = string.Format("The specified configuration section (<{0}>) was not found.", sectionName);
                throw new ConfigurationErrorsException(message);
            }
            return section;
        }

        #endregion

        #region Configuration Properties

        [ConfigurationProperty("folderMonitors", IsDefaultCollection = true)]
        public FolderMonitorConfigElementCollection FolderMonitors
        {
            get { return (FolderMonitorConfigElementCollection)this["folderMonitors"]; }
            set { this["folderMonitors"] = value; }
        }

        #endregion

        public override bool IsReadOnly()
        {
            return false;
        }
    }

    [ConfigurationCollection(typeof(FolderMonitorConfigElement), CollectionType = ConfigurationElementCollectionType.BasicMap)]
    public class FolderMonitorConfigElementCollection : ConfigurationElementCollection
    {
        protected override ConfigurationElement CreateNewElement()
        {
            return new FolderMonitorConfigElement();
        }

        protected override string ElementName
        {
            get { return "folderMonitor"; }
        }

        public override ConfigurationElementCollectionType CollectionType
        {
            get { return ConfigurationElementCollectionType.BasicMap; }
        }

        public override bool IsReadOnly()
        {
            return false;
        }

        #region Indexers

        public FolderMonitorConfigElement this[int index]
        {
            get { return BaseGet(index) as FolderMonitorConfigElement; }
            set
            {
                if (BaseGet(index) != null)
                {
                    BaseRemoveAt(index);
                }
                BaseAdd(index, value);
            }
        }

        public new FolderMonitorConfigElement this[string name]
        {
            get { return BaseGet(name) as FolderMonitorConfigElement; }
        }

        #endregion

        #region Lookup Methods

        protected override object GetElementKey(ConfigurationElement element)
        {
            FolderMonitorConfigElement cfg = element as FolderMonitorConfigElement;
            return cfg.TargetFolder;
        }

        public string GetKey(int index)
        {
            return (string)BaseGetKey(index);
        }

        #endregion

        #region Add/Remove/Clear Methods

        // Standard ConfigurationElementCollection plumbing delegating to the Base* methods
        public void Add(FolderMonitorConfigElement item)
        {
            BaseAdd(item);
        }

        public void Remove(string name)
        {
            BaseRemove(name);
        }

        public void Remove(FolderMonitorConfigElement item)
        {
            BaseRemove(GetElementKey(item));
        }

        public void RemoveAt(int index)
        {
            BaseRemoveAt(index);
        }

        public void Clear()
        {
            BaseClear();
        }

        #endregion
    }

    public class FolderMonitorConfigElement : ConfigurationElement
    {
        #region Constructors

        public FolderMonitorConfigElement()
        {
        }

        #endregion

        #region Configuration Properties

        [ConfigurationProperty("targetFolder", IsRequired = true)]
        public string TargetFolder
        {
            get { return (string)this["targetFolder"]; }
            set { this["targetFolder"] = value; }
        }

        [ConfigurationProperty("fileMask", IsRequired = true, DefaultValue = "*.*")]
        public string FileMask
        {
            get { return (string)this["fileMask"]; }
            set { this["fileMask"] = value; }
        }

        [ConfigurationProperty("timerInterval", IsRequired = true, DefaultValue = "3")]
        [IntegerValidator(ExcludeRange = false, MaxValue = 1440, MinValue = 1)]
        public int TimerInterval
        {
            get { return (int)this["timerInterval"]; }
            set { this["timerInterval"] = value; }
        }

        [ConfigurationProperty("processingInterval", IsRequired = true, DefaultValue = "5")]
        [IntegerValidator(ExcludeRange = false, MaxValue = 1440, MinValue = 1)]
        public int ProcessingInterval
        {
            get { return (int)this["processingInterval"]; }
            set { this["processingInterval"] = value; }
        }

        [ConfigurationProperty("alert")]
        public AlertConfigElement AlertConfig
        {
            get { return (AlertConfigElement)this["alert"]; }
            set { this["alert"] = value; }
        }

        #endregion

        public class AlertConfigElement : ConfigurationElement
        {
            public AlertConfigElement()
            {
            }

            [ConfigurationProperty("alertInterval", IsRequired = true, DefaultValue = "20")]
            [IntegerValidator(ExcludeRange = false, MaxValue = 1440, MinValue = 1)]
            public int AlertInterval
            {
                get { return (int)this["alertInterval"]; }
                set { this["alertInterval"] = value; }
            }

            [ConfigurationProperty("sender", IsRequired = true)]
            public string Sender
            {
                get { return (string)this["sender"]; }
                set { this["sender"] = value; }
            }

            [ConfigurationProperty("destination", IsRequired = true)]
            public string Destination
            {
                get { return (string)this["destination"]; }
                set { this["destination"] = value; }
            }

            [ConfigurationProperty("messageType", IsRequired = true)]
            public string MessageType
            {
                get { return (string)this["messageType"]; }
                set { this["messageType"] = value; }
            }

            [ConfigurationProperty("errorType", IsRequired = true)]
            public string ErrorType
            {
                get { return (string)this["errorType"]; }
                set { this["errorType"] = value; }
            }

            [ConfigurationProperty("errorCode", IsRequired = true)]
            public string ErrorCode
            {
                get { return (string)this["errorCode"]; }
                set { this["errorCode"] = value; }
            }
        }
    }
}

And my app.config file looks like this

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <!-- Custom config section declaration for Folder Monitor -->
    <section name="folderMonitorConfig"
      type="Breeze.BizTalk.WorkflowLibrary.FolderMonitorSection, Breeze.BizTalk.WorkflowLibrary, Version=, Culture=neutral, PublicKeyToken=null" />
  </configSections>

  <!-- Custom config section for Folder Monitor -->
  <folderMonitorConfig>
    <folderMonitors>
      <folderMonitor targetFolder="E:\Data\BizTalk\Monitor1" fileMask="*.xml" timerInterval="1" processingInterval="2">
        <alert alertInterval="5" sender="BIZTALK" destination="LOB1" messageType="Order" errorType="Folder Monitor" errorCode="OFFLINE"/>
      </folderMonitor>
      <folderMonitor targetFolder="E:\Data\BizTalk\Monitor2" fileMask="*.xml" timerInterval="1" processingInterval="2">
        <alert alertInterval="5" sender="BIZTALK" destination="LOB2" messageType="Invoice" errorType="Folder Monitor" errorCode="OFFLINE"/>
      </folderMonitor>
    </folderMonitors>
  </folderMonitorConfig>
</configuration>

Tip: If you do end up GAC’ing the assembly your configuration classes live in, don’t forget to update the type attribute in <configSections> to use the new PublicKeyToken value.

Once all that was set up (birthdays and Christmas sorted out along the way) I was happy with the end result. In my WF I simply used the static accessor to get my configuration collection and bound that to the Repeater Activity in my workflow. Because of the amount of work involved, I think it would have been much better to stick with my usual approach: a second custom configuration file deserialised into my custom collection class. The real test will be what I choose next time I am asked to implement this type of thing…custom config file or custom config section?
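For reference, consuming the section collapses to a few lines once the classes above are in place. A quick sketch, assuming the folderMonitorConfig section is present in the app.config:

```csharp
// Load the custom section and enumerate the configured folder monitors
FolderMonitorSection config = FolderMonitorSection.GetSection();

foreach (FolderMonitorConfigElement monitor in config.FolderMonitors)
{
    Console.WriteLine("Monitoring {0} ({1}) every {2} minute(s)",
        monitor.TargetFolder, monitor.FileMask, monitor.TimerInterval);
}
```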

Tuesday, June 29, 2010 10:13:26 PM (AUS Eastern Standard Time, UTC+10:00)
.NET Framework | Windows Workflow
About the author/Disclaimer

The opinions expressed herein are my own personal opinions and do not represent my employer's view in anyway.

© Copyright 2016
All Content © 2016, Breeze
DasBlog theme 'Business' created by Christoph De Baene (delarou)