BizTalk LOB book complete in April 2011

I have been working as a technical reviewer for a book from Packt Publishing, BizTalk 2010 Line of Business Systems Integration, that will be complete in April 2011. The print edition is available for pre-order, and some of the individual chapters are available now. Please check out this link for more information about the book:

It provides useful documentation of integration with Dynamics AX and CRM as well as various cloud-based services. The SAP chapters are particularly valuable and extend the currently available MSDN documentation for BizTalk integration with SAP. This book would be an excellent supplement to your technical library. Thanks!

Passed 70-583 on Windows Azure

I just found out that I passed the Windows Azure beta exam! I took it back in November and have been waiting for such a long time to find out the results. I did hear that this exam will require re-certification every 2 years due to the constant rate of change of Azure.

For other people who have been waiting on their results: I checked the Prometric Candidate History and the status changed from “Tested” to “Passed”. I have not yet received the “Congratulations on your new certification” email that usually comes. Apparently the results have been delayed for some reason.

— Update 2/18/2011 —
The exam is now showing on my Microsoft Transcript so the scoring process should be back to normal now.


Integrating BizTalk 2010 with CRM 2011 Online Organization Service


After making the post earlier today about how to interact with the CRM 2011 Online Discovery Service, I remembered that this was only part of the story. After getting the security information back from the Discovery Service, it is necessary to call the Organization Service to work with the entities. Some of the Organization Service functionality is actually new to CRM 2011.

The Discovery Service already existed in CRM 4, but some aspects of the Organization Service are different. I also found the generation of the schemas for the Organization Service to be more difficult.


The service URI for the Organization Service only gives you part of the WSDL. If you point Visual Studio at the WSDL from this address, it will not generate the BizTalk schemas successfully. In fact, if you try to add a service reference using the WSDL from this address, you will get an error, and the app.config will contain comments mentioning that svcutil did not understand the policy assertions.

I looked at the WSDL retrieved by the service reference above and noticed that it imported a second WSDL file. Requesting that imported file gives you the full body of the WSDL you actually need to generate the schemas. I copied the much longer WSDL file to my SkyDrive so you can generate your schemas from it, along with an updated download containing my generated artifacts. Note that the Organization Service generates quite a few port types.

When I tried compiling after generating the schemas for the organization service, I received a large number of errors. This also occurs when adding a service reference to the organization service. I have a feeling that you may need to reference a CRM assembly to reuse the types appropriately when generating the schemas, but I do not really know how this works at this point.
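As a stopgap while the schema compilation issues get sorted out, the same Organization Service call can be made in code against the CRM 2011 SDK assemblies. This is only a sketch under assumptions: the organization URI, the empty credentials, and the contact entity below are all hypothetical placeholders, and in practice you would plug in the security information returned by the Discovery Service.

```csharp
// Sketch only: assumes the CRM 2011 SDK assemblies Microsoft.Xrm.Sdk and
// Microsoft.Xrm.Sdk.Client are referenced; the URI and entity values are hypothetical.
using System;
using System.ServiceModel.Description;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

class OrganizationServiceSketch
{
    static void Main()
    {
        // Hypothetical organization service address for a CRM Online tenant.
        Uri orgUri = new Uri("https://yourorg.crm.dynamics.com/XRMServices/2011/Organization.svc");

        // The credentials here would come from the Discovery Service handshake.
        ClientCredentials credentials = new ClientCredentials();

        using (var proxy = new OrganizationServiceProxy(orgUri, null, credentials, null))
        {
            IOrganizationService service = proxy;

            // Create a contact using the late-bound Entity type.
            Entity contact = new Entity("contact");
            contact["lastname"] = "Test";
            Guid contactId = service.Create(contact);
        }
    }
}
```

This bypasses the generated BizTalk schemas entirely, so it is only useful for verifying connectivity, not as a replacement for the WCF adapter approach discussed here.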


So now you have the schemas for how to call the Discovery and the Organization Services from BizTalk. It looks like all that is left is just an orchestration and a few ports once you can get the organization service schemas to compile. Thanks!

Integrating BizTalk with CRM 2011 Online Discovery Service


On the MSDN forums there has been quite a bit of traffic and many questions about how to integrate BizTalk with Dynamics CRM 2011. Since I did not have a local CRM 2011 environment set up, I thought it would be a good time to look into whether a trial of CRM 2011 existed. Yesterday I set up a trial of BPOS to test out Forms Services in the cloud, and I found out there was a free trial offer running for Dynamics CRM Online, so I joined up. The trial was nearly effortless because I was already signed up as an Azure customer through my MVP/MSDN member benefits. Here is a link to the trial offer:

In this post I will document some of the integration experience I have had from working with BizTalk and CRM 2011 for basically just a few minutes. I have been very active in MSDN forum discussions about how BizTalk relates to CRM 2011, so I was aware of some of the current challenges. A few people had mentioned they were having trouble generating the BizTalk schemas, so this is something I tried out and was successful doing. The bigger discussion about BizTalk and CRM 2011 revolves around the adapter used for integration. CRM 4 had a specialized adapter, but so far none has appeared for CRM 2011. As far as I can tell, the intended approach is to use the WCF adapters.

My hope is that the use of the WCF adapters will enable CRM implementations to achieve high availability. This has been a perplexing and difficult problem for one of my customers due to known limitations in the CRM 4 adapter.


After signing up for the CRM Online 2011 trial, I was brought to the main form for working with my CRM data at https://<organization>. This form reminds me quite a bit of working with SalesForce contacts and accounts, but the overall feel is much more responsive and a much richer experience. The SalesForce contact and account management features were mind-numbing when I worked with them. In contrast, the CRM Online forms are invigorating and stunning.

From my previous investigation into CRM 2011 and BizTalk integration I had found a source code example that shows how to communicate with the CRM Discovery Service, which then provides credentials to use when contacting the Organization Service to work with the CRM entities. The following line of code is the most important for pulling out some useful artifacts for BizTalk integration:

// From the SDK sample: point the proxy at the hosted discovery endpoint
// (the "Passport" path segment selects Windows Live ID authentication for CRM Online).
CrmDiscoveryService discoveryService = new CrmDiscoveryService();
discoveryService.Url = String.Format("https://{0}/MSCRMServices/2007/{1}/CrmDiscoveryService.asmx",
                        _hostname, "Passport");

The value of the _hostname variable refers to the organization URI <organization>. I built up the full address by hand and entered it into the browser so that I could see the service description page, where I could get the WSDL for the discovery service. I copied this file to my SkyDrive so that others could use it. This file is critical for generating the BizTalk schemas. Apparently this is not a new technique for CRM – it is fairly well documented.

Next I opened up my VM with BizTalk 2010 and created a new solution for the integration. I copied the WSDL file mentioned above to a location where I could use it on my VM. First I tried creating the schemas by adding a service reference to my BizTalk project. Unfortunately this did not create any schemas for me, although it did add the service reference successfully. So the next attempt was to right-click on my BizTalk project and go to Add Generated Items…\Consume WCF Service. In the wizard I specified the WSDL file from above, and this generated the schemas successfully.

The schemas that were generated looked quite different from any I had seen before. Maybe the syntax will not be new to you, but I thought it looked unusual. Many of the elements in the BizTalk schema designer have brackets like you would see for an xs:any element because they are complex types. Here is a picture of the generated schema:

The schema given above actually gives you quite a bit more detail than the service description page which just mentions the Execute method.


Interacting with the CRM 2011 Online Discovery Service is actually very easy and is quite similar to the way it was done in CRM 4. I zipped up the generated schemas and placed them on my SkyDrive; if you are having trouble generating the schemas you can just use mine. What has been shown here is only part of the overall solution for integrating BizTalk with CRM 2011 Online, but it gets you started nicely.


Using SharePoint Online for InfoPath Forms Services


Today I thought it would be fun to do more research on what InfoPath looks like in the cloud. There had been a few hints about Forms Services running on SharePoint Online, and some people were suggesting that it would only work once Office365 came out. Today I started a BPOS trial and created a forms library to test out my strategies for InfoPath.

To give away the spoiler: I got it to work successfully. Apparently this is currently an unsupported feature (I think until Office365 comes out), but it does work. In a recent post I talked about how the licensing changes in MOSS 2010 no longer provide Forms Services as a feature with the standard CAL. BPOS is relatively inexpensive, so I think it provides a cost-effective replacement for that MOSS functionality. I found the documentation a little lacking on how to do this, so I thought I would put together a walkthrough of the steps I used to get it to work.


1. So the first thing I did was make a sample form in InfoPath:

2. Then I saved it and went to Tools\Form Options to enable it to work in a browser:

3. Then I went through a couple of pages of the wizard to publish to a Microsoft Online site.

You have to authenticate with SharePoint Online at this point. On the next page, choose the forms library from your SharePoint Online site that you want to publish the form to. In the following screenshot I show a couple of forms libraries; I actually used a different form library, called OnlineFormsLibrary, for the later pages of this wizard.


Then you get a page that says the form was published successfully:

4. Then I wanted to show the form in my browser even though I had InfoPath installed. In the form library settings you can specify that you want the form to display as a web page, which uses the Forms Services rendering that does not require InfoPath. Here is the form for specifying this:

5. Then you can view the form by making a new item in the forms library. Here is an example of my form being rendered through Forms Services:


So Forms Services does work with SharePoint Online, even with just a BPOS trial. This is exciting news. As far as I know, BizTalk cannot currently integrate with SharePoint Online for document library functionality because the WSS adapter web service does not exist in the cloud. The next thing I will try is posting the results from InfoPath back into a custom WCF service in the cloud. From there the data could be sent back to an on-premise BizTalk Server via the Service Bus. This whole integration feels like sending something from outer space back to Earth, but I feel good just knowing that it is possible. 🙂
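To make that last idea a little more concrete, here is a minimal sketch of the kind of custom WCF contract the form results could be posted to. Everything here is an assumption for illustration – the contract and operation names are hypothetical, and the Service Bus relay binding would be configured separately on the endpoint.

```csharp
// Hypothetical receiver contract for InfoPath form submissions; the service
// would be hosted in the cloud and relayed to on-premise BizTalk via the Service Bus.
using System.ServiceModel;

[ServiceContract]
public interface IFormSubmission
{
    // The raw form XML is passed as a string; BizTalk would pick it up
    // from the relayed endpoint through a WCF receive location.
    [OperationContract]
    void Submit(string formXml);
}
```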

Azure: How to Check your Bill


I have been playing with the Azure toolset lately in preparation for taking beta exam 71-583, which is the MCPD (Pro) exam on Windows Azure technologies. I had worked with these technologies on and off over the past 2 years during the early releases. One thing I have heard from many people is that they will see a bill come in somewhat unexpectedly. During the early pre-production releases everything was free, so I did not worry about the charges. I signed up under the cover of my MVP MSDN subscription, but I did need to enter my credit card information in case of overages. It is like a hotel – you still have to give your credit card for the mini-bar, long-distance room calls, and the movies on the TV.

One thing that seems to be relatively poorly documented is how to check your bill, or tab. I gave this feedback in a recent MPRP study, but the message about how many people in the community are stunned by their charges apparently has not gotten through. So I thought it would be good to do a quick post on how to check your bill in the current Azure product. While some of the screenshots seem relatively self-explanatory, you need to click through five links to drill down deep enough into the account details to get to the meaningful charge information. This really should be easier.


  1. First, go to one of the Azure configuration portals. If you have used the service, you should see a screen similar to the one shown for me below:
  2. Click on the “Billing” link in the upper right hand portion of the screen under your Windows Live Id.
  3. The next page displayed is the billing portal, which you could alternatively go to directly. You will need to authenticate again with your Windows Live ID, after which the site will look similar to the picture below:
  4. So the next thing to do is to click on “View My Bills”. Again, this seems obvious, but it is the only entry point I have found into this important report. A pop-up window will open and load a list of your bills, as shown for me below:
  5. Next, click on View Online Bill/Invoice. This brings back an itemized list of your current charges. You can see my bill below; I have been using some of the Azure services, but it is covered up to a point by my MSDN benefits:
  6. Finally, you have to click on links like AppFabric Usage Charges, Data Transfer Usage Charges, or SQL Azure Usage Charges to get the real statistical information about how much you owe. For MSDN subscribers, this is where you check how many minutes you have left in your plan :). Below I show my charges for data transfer:
  7. I tried copying the URL for this report into a different browser session but was unsuccessful. I wish it were easier to see the expected charges, or maybe get a text if I were about to be charged actual money. I wonder if there is a cloudapp for that.

Good luck managing your cloud accounts! Thanks,

WebSphere on Windows (WOW)

Microsoft recently conducted some interesting studies on running WebSphere on Windows. The research is published on a dedicated website, which has an intro splash that looks more like a Flash-based Valentine’s card than anything else. I recommend clicking the "Skip Animation" link and digging into the research information. The study makes a compelling case for running WebSphere applications on Windows rather than on IBM operating systems and Unix variants.
I look at this information as another step forward, similar to announcements like integrating PHP with Windows Azure data resources.
The Stock Trader sample download used in the study’s benchmarks includes some very interesting code from a WCF perspective. I recommend everyone explore this sample for more information about the study.

Updated Cloud SDKs Out

I just got an email from the Windows Azure team that there are new SDK versions for the Windows Azure offerings. Unfortunately, SQL Express is still listed as the database for the fabric storage client. One cool improvement is the addition of a Silverlight role in the platform.
Here are links to the updated (January 2009 CTP) SDKs:

There is a rainbow in the cloud! 🙂

Cloud Development Quickstart

I have been learning about working with Windows Azure for the past two weeks and have run into my share of challenges. Getting up to speed with the tools has taken quite a bit of work, and there have been many pitfalls in getting an environment running to work with the cloud. In this post I am going to mention some things I have done to get a cloud environment rolling so that others can use it as a guide.
First, you need to get all of the relevant software together to use Windows Azure. Here is the install list I went through to get everything needed to start. This is helpful because the .NET Services SDK that was released after PDC has updated functionality. Some of what I have done for my environment is a little outside of the setup directions, but it was the most useful for me:

Then there is some additional configuration to do to get the Azure SDK to use a SQL Server instance other than SQL Express, which is the default. A few posters had mentioned this technique, but there were a few missing steps. Follow these steps to get the Azure SDK to use your local SQL 2008 instance (or a different one):

  • Open Windows Explorer to C:\Program Files\Windows Azure SDK\v1.0\bin and find the DevelopmentStorage.exe.config file.
  • Modify this file so that it refers to your local SQL 2008 instance, such as:

<add name="DevelopmentStorageDbConnectionString"
     connectionString="Data Source=benc-vistabase;Initial Catalog=DevelopmentStorageDb;Integrated Security=True"
     providerName="System.Data.SqlClient" />

<add key="ClientSettingsProvider.ServiceUri" value="" />

<service name="Blob" url="" />
<service name="Queue" url="" />
<service name="Table" url="" dbName="developmentstoragedb" />

  • Then open Development Storage by going to Start Menu -> All Programs -> Windows Azure SDK -> Development Storage. This starts the storage services, and you will see a gray box icon in the tray. Right-click on it and choose to open the Storage UI. The first time you do this it will ask to run some administrative tasks to create the database specified in the config file above.
  • This will get the Blob and Queue services running but the Table will start and then stop. You will need to specify a different database for the Table storage. One that works is the ReportServer database installed with SQL 2008. In the Development Storage you can click Tools -> Table service properties and then choose the ReportServer database. This can be changed later, but to get the Table service running this is a temporary workaround.
  • Then check the Table service again, stop and restart the service, and it will no longer stop after a few seconds.
Then you will need to get your Azure codes so that the two administration sites work for testing in the cloud. After you get the codes and validate them, you can associate your accounts with Windows Live IDs, and finally you will be able to access the cloud administration sites. Unfortunately, there are two different cloud administration sites – one for Windows Live and one for .NET Services – so be prepared to spend some time getting used to the user interfaces. Understanding that there are two different sites can be hard at first. The Azure MMC provides a simpler interface for working with the .NET Services administration, so I strongly recommend it as well.
Next you should start learning about the cloud using the brief MSDN documentation, such as the Quick Lap around Windows Azure Tools for Visual Studio. This will get you started on the cloud development samples as well as actually deploying services to the Microsoft cloud.
I also wanted to mention one site I have been following for cloud news. It provides some really helpful information across the new cloud industry, especially coverage of Amazon’s cloud offerings and recent Windows Azure updates. Once you get your hands dirty with the cloud setup, I have found that a broader awareness of the cloud industry is pretty helpful.
The capabilities provided by the cloud platform are enormous, but unfortunately the ramp-up to using them at this point is steep. This blog post provided a checklist of things to do to get rolling on an environment for working with cloud services. Let me know if you have any questions about this information. Thanks!

BizTalk and the Cloud

Today I was at the pre-conference part of TechEd in Orlando. I attended the WCF/SOA overview by Juval Lowy and picked up a few interesting details. In my personal learning I have been working through the Learning WCF book by “that Indigo girl” on my train ride into Chicago. Lowy has another book in the O’Reilly series, and I found the content of his talk today to be roughly parallel to the content of the Learning WCF book. The Learning WCF book targets a lower audience level than Lowy’s Programming WCF Services book, but I wanted to build a better foundation in WCF. OK, enough of the rambling.
Here are a few topics on WCF that I thought were most interesting. Lowy talked about the DurableService attribute, which can be added to a WCF service implementation to provide some of the persistence typically associated with WF in .NET 3.5. This is closely related to BizTalk’s concept of orchestration dehydration. (Most of the DurableService material I have found dates back to the Orcas betas, so no guarantees.) DurableService enables a WCF service to function as a long-running process, similar to BizTalk long-running transactions. It is very interesting to see a couple of different options available for service persistence and the flexibility to use this via WCF, WF, or both. In BizTalk design you might typically use a tier of servers for processing messages received through adapters. The ability to separate service persistence into either WCF or WF means it will be possible to split a service across more than one physical server, with WCF on a separate physical tier, while maintaining process persistence.
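A minimal sketch of what the DurableService attribute looks like in code is below, assuming the .NET 3.5 System.WorkflowServices assembly. The contract and the counter state are made up for illustration, and a persistence provider (for example the SQL persistence provider) still has to be configured in app.config for state to actually be saved.

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Description;

[ServiceContract]
public interface ICounter
{
    [OperationContract]
    int Increment();
}

// DurableService persists the instance state (_count) to the configured
// persistence store between calls, much like BizTalk orchestration dehydration.
[Serializable]
[DurableService]
public class CounterService : ICounter
{
    private int _count;

    // CanCreateInstance lets this operation start a new durable instance.
    [DurableOperation(CanCreateInstance = true)]
    public int Increment()
    {
        return ++_count;
    }
}
```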
Here at TechEd there are many hands-on labs (HOLs) running concurrently with the other conference sessions, so if you want to dive into something you can jump right in. I was looking at a HOL on developing Workflow Services with VS 2008. It showcased further .NET 3.5 technology that goes a long way towards replacing the business process functionality of BizTalk. I was amazed that it is possible to expose a WF process as a WCF service, and it was very interesting to hear that when a WF sequential process calls a WF state machine, you can use correlation to coordinate messages between the processes. If you are experienced with BizTalk, you can slowly see Microsoft introducing technologies that will eventually replace BizTalk functionality, and it is interesting to determine which ones match which BizTalk functions. One area I had been wondering about until today was which WF or WCF technology would handle message correlation for the various message exchange patterns that require it. This HOL shows how to handle service correlation, which should match the BizTalk functionality as long as the integration partner exposes a WCF endpoint.
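Hosting a WF process as a WCF service, as the lab showed, centers on WorkflowServiceHost from the same .NET 3.5 wave. Here is a sketch under assumptions: OrderWorkflow is a hypothetical workflow type, and the endpoints and bindings would live in app.config.

```csharp
using System;
using System.ServiceModel; // WorkflowServiceHost ships in System.WorkflowServices.dll

class WorkflowHostSketch
{
    static void Main()
    {
        // Exposes the workflow type as a WCF service; endpoints come from config.
        WorkflowServiceHost host = new WorkflowServiceHost(typeof(OrderWorkflow));
        host.Open();
        Console.WriteLine("Workflow service is listening; press Enter to stop.");
        Console.ReadLine();
        host.Close();
    }
}
```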
Overall, it has been very interesting today. I will continue to post throughout my time here, so check back later! Now it’s time for me to get some food. Bye!
