Monthly Archives: March 2008

Dynamic sitemap.xml Files in ASP.Net

I know this is not a new topic. It is not even a new topic for me. I have posted on defining what a sitemap.xml file is for, and on dynamic sitemap.xml files in C#. But my team is finally ready to start implementing this as part of our custom development platform for the external brand sites.

When you search Google for dynamic sitemap.xml creators, you get a plethora of sites back. Some are code, some are online tools. Since we are looking to create our file dynamically from within the site, on demand, that narrows the search. I have found a small number of code sources we can use as a starting point.

There is still the HTTP Handler from my original post. This project, ASP.Net Google Sitemap Provider by Bruce Chapman, is available on CodeProject. You can also read about it in a blog post on his iFinity site. It still looks like the most flexible solution.

There is a great looking solution on the ASP.Net site by Bertrand Le Roy called Google Sitemaps for ASP.NET 2.0. It has been ported over into the ASP.Net Futures July 2007 package. This solution is an HTTP Handler that uses the Web.sitemap file to generate a sitemap.xml file on the fly.
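Whichever handler you pick, the output it has to produce is the same urlset XML. Here is a rough sketch of that step, in JavaScript rather than the C# the handlers above use, just to show the shape of the output; the page list and lastmod values are made up:

```javascript
// Build a sitemap.xml document from a list of pages.
// Page entries ({ loc, lastmod }) are illustrative placeholders; a real
// ASP.Net handler would pull them from Web.sitemap or a database and
// write the result out with a text/xml content type.
function buildSitemap(pages) {
  const entries = pages.map(p =>
    "  <url>\n" +
    "    <loc>" + p.loc + "</loc>\n" +
    (p.lastmod ? "    <lastmod>" + p.lastmod + "</lastmod>\n" : "") +
    "  </url>"
  ).join("\n");
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries + "\n</urlset>";
}
```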

Another interesting idea I found in my searches was some code that shows a site map when a user gets a 404 error. This solution is also implemented as an HTTP Handler, but is only for 404 Page Not Found server errors. This code is also available on CodeProject in an article called Generate a Google Site Map Using the HTTP 404 Handler.

Here are some other sites of note to look at. They have similar solutions to the ones above, and it is always a good idea to see what other people have come up with.

If anyone has any additional resources, ideas, or suggestions, please leave me a comment and let me know what you think.

Mix08 – Session 10 – Application = Designers + Developers

This session is based on a big selling point that Microsoft has been driving home for Silverlight and WPF.  Designers and Developers who share the same source code can work on different aspects of the same project seamlessly without stepping on each other’s toes.  The session walked through two different development scenarios to demonstrate this point. 

The first demo was of a furniture design web site.  The developer built the back end that integrated with the database, hooking up simple tabs and list boxes for dynamic content.  The designer then picked up the XAML for the site and styled each of the page elements into a slick-looking web site. 

The second demonstration was of a Silverlight application called the Deep Zoom Composer.  This is an application that helps users nest images inside of images inside of images, like the Hard Rock Cafe demo from the keynote.  In the same fashion, the developer hooked up the interface and did all the heavy lifting, and the designer modified the XAML to style the application any way he chose. 

This kind of development and design interaction is extremely encouraging, and could cut down a significant amount of time we spend in construction and user acceptance testing of our external brand sites at BMS.  I am hoping that we work with an agency in the near future that is just as excited about trying out this technology.

Mix08 – Session 9 – Silverlight and Web Analytics

This session was a panel discussion regarding Web Analytics.  The panel was composed of members from WebTrends, Omniture, and Microsoft.  I found this session very interesting, since most of the solutions to track analytics within Silverlight applications are very similar to the ones we implemented with our Flash based RIA sites. 


  • Omniture – SiteCatalyst, hosted solution
  • WebTrends – WebTrends Analytics, hosted solution
  • Microsoft – AdCenter Analytics – Beta2 released March 1
  • These products track information through page tags or beacons
  • With Silverlight (and other RIA platforms like Flash, Ajax, etc.), you don’t change pages.
  • You have to create and define pseudo-page views
  • 4 Scenarios:
    • Tracking Silverlight Installation
    • Tracking User Interaction
    • Tracking Media Drop-off
    • Tracking Media Buffering

Tracking Silverlight Installation

  • JavaScript file to put on site
  • Silverlight.isInstalled method identifies if it is available
  • Check for each version, give them an experience for that version
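The per-version check stays simple if you separate the detection from the decision. A minimal sketch, with the detector callback standing in for the `Silverlight.isInstalled` check from Silverlight.js, and hypothetical experience names:

```javascript
// Decide which experience to serve based on detected Silverlight support.
// In the browser, isInstalled would be Silverlight.isInstalled from the
// Silverlight.js helper; here it is injected so the logic is testable.
// The experience names are made up for illustration.
function chooseExperience(isInstalled) {
  if (isInstalled("2.0")) return "silverlight-2";
  if (isInstalled("1.0")) return "silverlight-1";
  // The fallback branch is also where you would fire an analytics
  // beacon recording that the install prompt was shown.
  return "html-fallback";
}
```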

Tracking User Interaction

  • Determine actions in your pipeline, funnel, etc.
  • Add Event handlers for each action
  • Event handlers map to page view equivalents
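Mapping actions to page-view equivalents can be as simple as a lookup table inside the event handlers. A sketch, with made-up funnel step names and an injected tracker standing in for the vendor's page-tag call (SiteCatalyst, WebTrends Analytics, etc.):

```javascript
// Map in-app actions to pseudo-page-view names and hand them to a
// tracker. trackPageView would wrap the analytics vendor's page-tag
// call; the funnel step names below are hypothetical.
function makeInteractionTracker(trackPageView) {
  const funnel = {
    "product-selected": "/funnel/product",
    "added-to-cart": "/funnel/cart",
    "checkout-complete": "/funnel/checkout"
  };
  return function onAction(action) {
    const pseudoPage = funnel[action];
    if (pseudoPage) trackPageView(pseudoPage); // fire a pseudo page view
    return pseudoPage || null;                 // unmapped actions are ignored
  };
}
```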

Tracking Media Drop-off

  • Add invisible media markers every 5 seconds in the video
  • Media Markers trigger events
  • Events trigger page views
  • You can then monitor drop-off in 5 second increments
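The 5-second marker scheme above can be sketched in a few lines; the `/video/...` pseudo-page names are invented for illustration:

```javascript
// Generate invisible marker times every 5 seconds for a video of the
// given length. Each marker that fires becomes a pseudo page view like
// "/video/intro/15s", so drop-off shows up as declining view counts
// from one marker to the next.
function markerTimes(durationSeconds, intervalSeconds = 5) {
  const times = [];
  for (let t = intervalSeconds; t <= durationSeconds; t += intervalSeconds) {
    times.push(t);
  }
  return times;
}

function markerPageView(videoName, seconds) {
  return "/video/" + videoName + "/" + seconds + "s";
}
```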

Tracking Media Buffering

  • Handle the MediaElement.CurrentStateChanged event
  • When State goes to Buffering, trigger MediaBuffering page view
  • Correlate bit rate, content, geography, etc.
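A minimal sketch of the buffering handler, with the state strings modeled on MediaElement's and the tracker injected; in Silverlight this would be wired to MediaElement.CurrentStateChanged:

```javascript
// Translate player state changes into a pseudo page view whenever the
// player newly enters the Buffering state. The "/media/buffering"
// pseudo-page name is illustrative.
function makeBufferingTracker(trackPageView) {
  let lastState = null;
  return function onStateChanged(newState) {
    if (newState === "Buffering" && lastState !== "Buffering") {
      trackPageView("/media/buffering");
    }
    lastState = newState;
  };
}
```

On the reporting side you would then correlate those buffering views against bit rate, content, and geography.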

Analytics can help you optimize toward a single goal

  • This is through A / B Testing
  • Separation of design in XAML and code in JavaScript enables simple A / B design
  • In JavaScript or on server, for X% of visitors – show different XAML
  • Use analytics service to track difference between results in variation
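The X% split can be done deterministically so a returning visitor always sees the same variant. An illustrative sketch; the hash function and XAML file names are made up, not production code:

```javascript
// Deterministically assign a visitor to an A/B bucket: hash the visitor
// id into 0..99 and send percentB percent of visitors the variant XAML.
// The hash is a simple illustrative one, not production grade.
function abVariant(visitorId, percentB) {
  let h = 0;
  for (let i = 0; i < visitorId.length; i++) {
    h = (h * 31 + visitorId.charCodeAt(i)) >>> 0; // keep as unsigned 32-bit
  }
  return (h % 100) < percentB ? "b.xaml" : "a.xaml";
}
```

The analytics service then only needs to record which variant was served alongside each conversion to compare the two designs.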

Wrap Up

  • There is a Silverlight sample available –
  • Track Geolocation with these projects
    • Akamai EdgeScape
    • Windows Live

Mix08 – Session 8 – The Future of Advertising Technology

This session was very interesting.  As a technology professional, I find the business side is sometimes not as transparent as it could be.  This session opened the door to understanding how advertising, in both online and traditional media, works today and could work in the future.  Microsoft is investing heavily in this vertical, and through some of these ideas is looking to become a major player. 

Market Overview – Now and In The Future

  • The advertising market is manual – media is purchased through phone calls & emails
  • The advertising market is opaque – there is no pricing transparency
  • The advertising market is inefficient – loads of remnant inventory drive prices lower
  • Ad networks are the most efficient way to procure advertising
  • Ad networks buy inventory on a CPM basis (cost per thousand impressions) and sell at CPC (cost per click) or CPA (cost per acquisition)
  • Ad Exchanges – they add transparency, increase liquidity by letting advertisers bid & buy across all networks
  • Today – advertiser & agencies come up with marketing goals, the agency will change the mix of ad media manually to match
  • In The Future – advertisers and agencies will define a media plan, translate them to business rules, and through automated experiments, an optimization system can evaluate and adjust the advertising mix in real time
  • Each impression’s value can be set in real time, adjusted, and shifted based on campaign objectives (awareness vs. conversions, etc.)
  • In the future there will be very few analysts, and each of them will be dealing with millions of publishers through automated optimizers and exchanges
  • Agencies and Advertisers (Buy Side) will have to be open
  • Today – Premium sales force manages most ads, remnant sales force has less
  • Future – Automated systems will take business away from the premium sales force and leave only very low-value or valueless markets unserved
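To make the CPM-to-CPC arbitrage mentioned above concrete, here is a small worked example; the numbers are purely illustrative:

```javascript
// Sketch of the ad-network arbitrage: buy inventory on a CPM basis
// (cost per thousand impressions), resell on a CPC basis (cost per
// click). Returns the network's margin.
function networkMargin(impressions, cpm, clickThroughRate, cpc) {
  const cost = (impressions / 1000) * cpm;              // pay per thousand impressions
  const revenue = impressions * clickThroughRate * cpc; // earn per click delivered
  return revenue - cost;
}

// e.g. 1M impressions bought at a $2 CPM ($2,000 cost), resold at a
// $0.50 CPC with a 1% click-through rate ($5,000 revenue)
```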

Optimizer Architecture

  • Seamlessly add advertising into any content, application, or device
  • Seamlessly leverage other Microsoft platforms
  • Create ad-funded businesses all in one platform

The  Brave New World of Advertising

  • Nokia – leveraging the power of nanotechnology
  • Hypertargeting, personalized advertising
  • Personalized product offerings (like Nike – build your own shoes, custom Mini Coopers, Scions)
  • Projectors, OLED, Disposable video
    • Low resolution projectors are really cheap now
    • OLED will become printable
    • Siemens Printable Video Displays – with printable batteries
  • Tracking & Measuring advertising offline like online
    • RFID
    • GPS Phones
    • 2D Bar Codes
    • Bluetooth, etc.
    • Neural Scanning

Mix08 – Session 7 – ASP.Net Model View Controller

This session was conducted by the famed Scott Hanselman. I had been looking forward to this session since I heard he was speaking – one, to meet him, and two, to learn about the MVC framework and design pattern. I had been neglecting my duties as a technologist to follow up on MVC, and this was my time to catch up. Besides, we had already bumped into Scott at the Scavenger Hunt, and he was pretty cool about that.

Here are my notes from the session about ASP.Net MVC and the design pattern:

  • MVC is a new web project type for ASP.Net
  • This type of project is more easily testable
  • It is not a replacement for web forms
  • This is only an option
  • You must be using .Net 3.5 to be able to create ASP.Net MVC Application Solutions
  • Select a testing framework (nUnit, WatiN, etc.)
  • 3 new namespaces – System.Web.Mvc, System.Web.Routing, System.Web.Abstractions (Now part of ASP.Net)
  • The framework plays well with others – NHibernate for Models, Brail for Views, Whatever for Controllers
  • Clean separation of concerns – easy testing, red/green TDD, highly maintainable by default
  • Extensible and pluggable
  • Clean URLs and HTML – SEO and REST friendly URL structures
  • Great integration into ASP.NET
  • MVP (Presenter) vs MVC (Controller)
  • Request into controller, then to the model for data, then send the data to the View for display
  • The HTTP handler does all the interpretation
  • Routing is kind of like URL rewriting – define routes, and incoming URLs are mapped to the controllers that should handle them
  • NHaml, Nvelocity – open source view engines
  • ViewEngineBase – extend it to make all kinds of views: iCal, vCard, RSS, etc.
  • System.Web.Abstractions – testing without firing up IIS
  • RhinoMocks, TypeMock, nMock, Moq – all kinds of mocking frameworks to use with the abstraction namespace
  • TDD – write tests first. You want to see them fail, then write the least possible to make the test pass. Write, rinse, repeat.
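The routing idea from the notes above, a pattern like `{controller}/{action}/{id}` matched against clean URLs, can be sketched in a few lines. This is JavaScript purely to illustrate the matching concept, not the actual System.Web.Routing API:

```javascript
// Match a URL path against an MVC-style route pattern. Segments in
// braces ({controller}, {action}, {id}) capture values; literal
// segments must match exactly. Returns the captured values, or null
// if the route does not match. Pattern and URL below are hypothetical.
function matchRoute(pattern, path) {
  const patParts = pattern.split("/");
  const pathParts = path.split("/");
  if (patParts.length !== pathParts.length) return null;
  const values = {};
  for (let i = 0; i < patParts.length; i++) {
    const pat = patParts[i];
    if (pat.startsWith("{") && pat.endsWith("}")) {
      values[pat.slice(1, -1)] = pathParts[i]; // capture this segment
    } else if (pat !== pathParts[i]) {
      return null; // literal segment mismatch
    }
  }
  return values;
}
```

So `matchRoute("{controller}/{action}/{id}", "products/detail/42")` tells the framework to dispatch to a hypothetical products controller's detail action with id 42.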

I couldn’t believe the amount of information Scott covered. He explained the MVC pattern, the new .Net framework, the new namespaces, gave examples of routing patterns, how to use different view engines, how to mock, and TDD. This was by far one of the best sessions for me in the entire conference.

Mix08 – Session 6 – Social Networking

This was another panel discussion about social networking, with Guy Kawasaki as the moderator. Following the Steve Ballmer keynote, he kept things interesting and asked some of the hard questions. My notes are scattered, and the session was interesting, but one disappointing fact is that the panel did not really cover the use of Web 2.0 and Social Networking inside the corporation. Read on, and download the session recording to see more.

What are the issues?

  • Security
  • Large horizontals vs niche networks
  • Each site behaves as if you have not used any other site before – antisocial
  • “Friend” vs “Family” vs “Colleague” – how do you label people
  • Privacy, how to give users control over their own data
  • Spam issues
  • Signal-to-noise problem – how do you overcome it?

Is stalking as much of a problem as the media makes it out to be?

  • Companies in this space spend the majority of their time on spam
  • Persistent identities help prevent these threats
  • You can validate they are real people by validating against their email addresses


  • OpenID begins to persist your identity across sites
  • OpenID is useful, and easy to use, but there is no real gaping problem that OpenID solves
  • Usage needs to become ubiquitous and under-the-covers to work, like SSL

What about media, photos, events, shopping?

  • Entire ecosystems are designed to tie them all together

What about Second Life and World of Warcraft?

  • not necessarily a real identity
  • there are issues with crime
  • Users create an alternate reality, rather than extending their actual persona

Casual Games

  • These have taken off in Asia
  • Games like Bejeweled, Tetris, and KDice
  • They add a social aspect to simple online games

Mix08 Session 5 – The Open Question

This was a panel session about Open Process, Open Source, Open Development, and Open APIs.  The panelists were Mike Schroepfer from Mozilla, Andi Gutmans from Zend, Miguel de Icaza from Novell, and Rob Conery from Microsoft, with Sam Ramji, also from Microsoft, moderating.  The session was interesting… it provided a lot of perspective on how the Open Source community views itself, how it operates, and how it is expanding.  Here are some of the topics that were covered:

  • The conflict between patent infringement claims and Open Source
  • The idea of Open Data, for example the collection and sharing of personal data for advertising purposes
  • The proposed acquisition of Yahoo – PHP expertise would be injected into Microsoft and accelerate open source ideas; PHP can now run on Windows Server 2008
  • Debate that opening source code should increase security vs keeping it closed and leveraging Security by Obscurity
  • Not a lot of full open source products – DotNetNuke, Drupal – but other Open Source APIs like PHP
  • Criteria for using open source?  All?  None?  Blended!
  • Criteria for making your next project an open source project…?