WCF Client Performance Using ObjectDataSource


So everyone knows that WCF clients incur more overhead than their predecessors, the old ASMX proxy clients.  The penalty comes from establishing a connection to the WCF service, and its size depends partly on the binding you have configured.  Some bindings are more expensive than others, but there is extra overhead regardless.  You are also now responsible for closing your proxy client; previously you could just instantiate a proxy instance, call your method(s), and let the garbage collector do the rest.  A logical and easy way to provide a speedup is to instantiate one client object and save it as a private member variable of your page, user control, etc.  This prevents you from having to create and close more than one client.
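A minimal sketch of that pattern on a page. The service and proxy names here (`ProductServiceClient`, `GetProducts`, `GetSpecials`) are hypothetical placeholders for whatever your generated proxy exposes; the close/abort handling follows the usual WCF rule that a faulted channel must be aborted, not closed:

```csharp
using System;
using System.ServiceModel;
using System.Web.UI;

public partial class ProductPage : Page
{
    // One client for the whole page lifetime instead of one per call.
    private ProductServiceClient _client;

    private ProductServiceClient Client
    {
        get { return _client ?? (_client = new ProductServiceClient()); }
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        var products = Client.GetProducts();   // first call opens the channel
        var specials = Client.GetSpecials();   // reuses the same open channel
    }

    protected override void OnUnload(EventArgs e)
    {
        base.OnUnload(e);
        if (_client == null) return;
        try
        {
            if (_client.State == CommunicationState.Faulted)
                _client.Abort();   // faulted channels cannot be closed cleanly
            else
                _client.Close();
        }
        catch (CommunicationException)
        {
            _client.Abort();
        }
    }
}
```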


If you use an ObjectDataSource with a TypeName pointing at a WCF proxy client, then it will be instantiating a WCF client and incurring that same overhead.  It is pretty common to find more than one ObjectDataSource on a page; in fact, I have seen pages with a dozen or more.  It is also common to find multiple ObjectDataSources referencing the same service.  So it is quite easy to see a performance gain by making your ObjectDataSources share one client instance.  How?  The ObjectDataSource exposes two critical events that make this speedup possible, and even easy: ObjectCreating and ObjectDisposing.
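Here is one way the handlers could look in a page's code-behind. `ProductServiceClient` is again a hypothetical proxy name; the key moves are handing the ObjectDataSource our shared instance in ObjectCreating and cancelling its disposal in ObjectDisposing so the same client survives across all the data sources on the page:

```csharp
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public partial class CatalogPage : Page
{
    // One proxy client shared by every ObjectDataSource on the page.
    private ProductServiceClient _sharedClient;

    protected void Ods_ObjectCreating(object sender, ObjectDataSourceEventArgs e)
    {
        if (_sharedClient == null)
            _sharedClient = new ProductServiceClient();
        e.ObjectInstance = _sharedClient;   // hand the ODS our instance
    }

    protected void Ods_ObjectDisposing(object sender,
                                       ObjectDataSourceDisposingEventArgs e)
    {
        // Keep the shared client alive; close it once (e.g. in OnUnload)
        // instead of letting each ODS dispose it after every operation.
        e.Cancel = true;
    }
}
```

Each `<asp:ObjectDataSource>` in the markup then just wires both handlers, e.g. `OnObjectCreating="Ods_ObjectCreating" OnObjectDisposing="Ods_ObjectDisposing"`.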

Posted in ASP.NET, Technology, WCF

HTTP Proxy Server

So I came across a question on Stack Overflow regarding implementing an HTTP proxy server.  This was a project I had to do while studying computer science at the University of Kentucky several years ago.  It was a project I enjoyed, and I thought it would be fun to re-implement a proxy server in C#.  So I spent a Sunday afternoon and built a nice little multi-threaded proxy server, complete with caching!

HTTPS – To tunnel or not to tunnel

Normally, when a browser needs to request an HTTPS URI through a proxy, it issues a CONNECT request to the proxy server.  This request tells the proxy server to establish a tunnel from the client to the destination server and relay traffic across the respective connections.  Since the proxy server only deals with TCP-level traffic to handle this tunnel, the communication remains secure between the client and the server.  Well, this was all too easy, and I wanted the challenge of creating an HTTPS debugging proxy like Fiddler, which is one of the most useful tools there is for a web software developer like myself.
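The plain CONNECT path can be sketched in a few lines: acknowledge the request, then blindly pump bytes in both directions. This is a minimal sketch, not the article's actual code; error handling and connection shutdown are omitted:

```csharp
using System.Net.Sockets;
using System.Text;
using System.Threading;

static class ConnectTunnel
{
    // Relay an established CONNECT tunnel between browser and destination.
    static void Tunnel(TcpClient browser, string host, int port)
    {
        var server = new TcpClient(host, port);
        var browserStream = browser.GetStream();
        var serverStream = server.GetStream();

        // Tell the browser the tunnel is ready; everything after this is
        // opaque (typically TLS) traffic the proxy never inspects.
        var ack = Encoding.ASCII.GetBytes(
            "HTTP/1.1 200 Connection Established\r\n\r\n");
        browserStream.Write(ack, 0, ack.Length);

        // Pump each direction; the proxy only ever sees encrypted bytes.
        new Thread(() => browserStream.CopyTo(serverStream)).Start();
        serverStream.CopyTo(browserStream);
    }
}
```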


To do the heavy lifting of the SSL protocol (in one line of code) I used the System.Net.Security.SslStream class, which made the project super simple.  I’ll spare you the implementation details because I posted an article on CodeProject.com.
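The debugging twist on the tunnel is to terminate SSL on both sides so the proxy can read the traffic in between. A rough sketch of that handshake pair, assuming `proxyCert` is a certificate the proxy presents to the browser (generating it and getting the browser to trust it is a separate step not shown here):

```csharp
using System.Net.Security;
using System.Net.Sockets;
using System.Security.Cryptography.X509Certificates;

static class SslInterception
{
    // Decrypt both ends: act as the "server" to the browser and as an
    // ordinary client to the real server. Requests read from clientSide
    // can then be logged and relayed to serverSide in plain text.
    static void DecryptBothEnds(TcpClient browser, string host,
                                X509Certificate2 proxyCert,
                                out SslStream clientSide,
                                out SslStream serverSide)
    {
        // Handshake with the browser as if we were the destination server.
        clientSide = new SslStream(browser.GetStream());
        clientSide.AuthenticateAsServer(proxyCert);

        // Handshake with the real server as a normal client.
        serverSide = new SslStream(new TcpClient(host, 443).GetStream());
        serverSide.AuthenticateAsClient(host);
    }
}
```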


So I had fun writing this little program and reminiscing about my networking class in college.  It actually performs pretty well when it isn’t dumping data, and it is fun to play with.  I might even use it for some HTTP debugging instead of Fiddler in the future, since it offers a quick, easy interface from the command line.  With the caching I implemented, just running it locally gives me some pretty speedy pages when caching is allowed.
You can find the source and executable over on the CodeProject article, but I have also included the source here in my SkyDrive.  Let me know what you think!

Posted in General Programming

SharePoint Site Map

We want to provide a standardized site map across our SharePoint publishing enterprise.  We are using SharePoint to provide a content management solution for our partners, migrating from Microsoft Content Management Server 2002.  Part of the planning is defining the set of web parts that we will enable for our partners to use in creating their own sites and pages.  Our partner count is in the hundreds at this point, so we have a wide range of abilities to accommodate.  Some of the users are web design and SharePoint gurus, and some have no HTML experience.  It is for the latter that we need to be especially conscious.


Our first attempt was to use the out-of-the-box TableOfContentsWebPart.  This is a useful part because of its customizability, but the same thing that makes it useful also makes it quite complex.  It doesn’t provide a nice nested <ul> rendering out of the box, and we don’t expect a majority of our users to have SharePoint Designer or the XSL expertise that are both needed to make it do what you want.  So we sought to customize the part to render a nested <ul> tree with some custom XSL and then export the customized web part to the gallery.
When I started on the XSL template I planned on using a recursive template, because I expected the XML coming from the TOC to be hierarchical (like the data it represents), something like:
    <page title="a page" url="/pages/default.aspx">
        <page title="a child page" url="/pages/child.aspx"/>
    </page>
Unfortunately, the XML is one level deep and uses a level attribute on the items (level="1", level="2") for transforming.  Not pretty, but it doesn’t matter, right?  I could still approach this with a fairly simple recursive XSL template.  Afterwards, we had a perfectly good site map that could be exported and reused by anyone.  Even more unfortunately, though, the TableOfContentsWebPart has a limit on the number of levels it will emit in XML.  The limit was 3.  When the team came to me and said, "Hey, we need this to show more than 3 levels," my first instinct was to think that there must be a good reason for the limit.  Obviously you can’t let this thing run infinitely deep, but why 3 levels?  I surmised that 3 levels was probably enough to handle most sites, and that if you have more than 3 levels of structure, a site map wasn’t going to benefit a user anyway.  Even if every node has only 3 children, the tree starts getting big fast: level 4 has 81 items, level 5 has 243, level 6 has 729!  That’s a good argument for not going past 3 levels.  Performance aside, who is going to spend time reading through that many items in tree format to find a page or site?  Isn’t that why we have an awesome search bar on every page?  Well, usability and performance are not always of concern to our partners, some of whom actually do have sites with structure up to 7 levels deep.
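For illustration, the same flat-to-nested transformation sketched in C# with LINQ to XML rather than XSL. The `item`/`level`/`title` names are hypothetical stand-ins for the TOC's actual element shape, and the sketch assumes levels only ever increase one step at a time:

```csharp
using System.Text;
using System.Xml.Linq;

static class TocFlattener
{
    // Turn flat, level-attributed items into the nested <ul>/<li> tree
    // that the recursive XSL template produces.
    static string ToNestedList(XElement toc)
    {
        var sb = new StringBuilder();
        int depth = 0;
        foreach (var item in toc.Elements("item"))
        {
            int level = (int)item.Attribute("level");
            if (level > depth)
            {
                sb.Append("<ul>");          // one step deeper
                depth++;
            }
            else
            {
                sb.Append("</li>");         // close the previous sibling
                while (depth > level)       // climb back up as needed
                {
                    sb.Append("</ul></li>");
                    depth--;
                }
            }
            sb.Append("<li>").Append((string)item.Attribute("title"));
        }
        if (depth > 0)
        {
            sb.Append("</li>");
            while (--depth > 0) sb.Append("</ul></li>");
            sb.Append("</ul>");
        }
        return sb.ToString();
    }
}
```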


My solution was to write a simple web part using the PortalSiteMapDataSource.  This object was meant to do exactly what we are trying to do and is also built for performance.  Read Chris Richard’s blog entry, Increased performance for MOSS apps using the PortalSiteMapProvider, from the Microsoft Enterprise Content Management team.  It provides speedy access to the underlying navigation tree and handles caching for me.  As Chris says, the first call to this provider won’t be as fast as using the OM yourself, because it’s doing a lot of caching heavy lifting in the background, but it will improve the performance of your site map overall and requires very little coding.
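A sketch of what such a web part could look like, going straight at PortalSiteMapProvider's cached navigation tree; this is not our production part (CSS hooks, depth limits, and error handling are omitted):

```csharp
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls.WebParts;
using Microsoft.SharePoint.Publishing.Navigation;

public class SiteMapTreeWebPart : WebPart
{
    protected override void RenderContents(HtmlTextWriter writer)
    {
        // CurrentNavSiteMapProvider exposes the cached navigation tree
        // for the current site.
        var provider = PortalSiteMapProvider.CurrentNavSiteMapProvider;
        RenderNode(writer, provider.RootNode);
    }

    // Recursively emit a nested <ul> for the node and its descendants.
    private void RenderNode(HtmlTextWriter writer, SiteMapNode node)
    {
        writer.Write("<ul>");
        foreach (SiteMapNode child in node.ChildNodes)
        {
            writer.Write("<li><a href=\"{0}\">{1}</a>",
                HttpUtility.HtmlAttributeEncode(child.Url),
                HttpUtility.HtmlEncode(child.Title));
            if (child.HasChildNodes)
                RenderNode(writer, child);
            writer.Write("</li>");
        }
        writer.Write("</ul>");
    }
}
```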

So, a couple of hours and some frustrating SharePoint development later, we have a standard site map web part in our gallery.  All our users have to do is add the web part to a page and, if they want to, write some custom CSS to style it.

Posted in SharePoint

Mobile Device Browser File

We are using the Mobile Device Browser File from CodePlex on our SharePoint publishing portal to detect mobile devices, in combination with a simple HTTP module that redirects them to our mobile site.  This works well, and as long as we are diligent about getting the latest version of the MDBF, we should stay on top of new devices and browsers.  The MDBF currently defines over 60 distinct capabilities for about 400 different devices.  Of those capabilities, we are really only interested in the isMobileDevice value.  However, many of the other capabilities directly affect the behavior of ASP.NET when a particular browser is detected.  The preferredRenderingMime capability tells ASP.NET what content type should be sent as a response header.  So in the case of a browser that prefers XHTML, and is not a mobile device and therefore not redirected, ASP.NET obliges by setting the response content type.  Unfortunately, the markup coming from our SharePoint portal does not meet XHTML standards and causes those browsers to break.  We noticed this when we added web slices to the portal and saw that the Windows-RSS-Platform user agent was being sent the Content-Type: application/xhtml+xml header.  To fix the issue, we added a new browser entry that matched the Windows-RSS-Platform user agent and set preferredRenderingMime to text/html, and at the same time defined some other user agents that we expected to have the same problem.  Unfortunately, this approach makes staying current with the MDBF team’s latest file more difficult.  With so many devices, we may easily miss some that prefer the XHTML markup but are not mobile devices.
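For reference, the kind of browser definition entry involved, sketched from the standard .browser file schema; the `id` and match pattern here are illustrative, not our exact production entry:

```xml
<browsers>
  <!-- Illustrative entry: match the Windows-RSS-Platform agent and
       override its preferred MIME type so ASP.NET sends text/html
       instead of application/xhtml+xml. -->
  <browser id="WindowsRssPlatform" parentID="Default">
    <identification>
      <userAgent match="Windows-RSS-Platform" />
    </identification>
    <capabilities>
      <capability name="preferredRenderingMime" value="text/html" />
      <capability name="isMobileDevice" value="false" />
    </capabilities>
  </browser>
</browsers>
```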

To solve the problem, I created a very simple (4-line) HttpModule that handles the HttpApplication.PostRequestHandlerExecute event and, when Response.ContentType == "application/xhtml+xml", sets the ContentType to "text/html".  This will make it much easier to maintain the MDBF and take care of any other unknown agents.
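The module described above can be sketched like this (the class name is arbitrary; it still needs to be registered in web.config under `<httpModules>`):

```csharp
using System.Web;

public class XhtmlContentTypeModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        // After the page handler runs, rewrite the XHTML content type so
        // standards-strict parsing is never triggered by our markup.
        app.PostRequestHandlerExecute += (sender, e) =>
        {
            var response = ((HttpApplication)sender).Response;
            if (response.ContentType == "application/xhtml+xml")
                response.ContentType = "text/html";
        };
    }

    public void Dispose() { }
}
```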

Posted in ASP.NET