Thursday, September 30, 2004

Mystery Server ODBC Error Solved!

Several weeks back, I posted a problem with updating MCMS channel properties using a custom web form. Each time I set the Channel.StartDate property (or any of the channel's properties for that matter), it sent back a "Server ODBC error. Please contact the administrator" message. No further details were given.

What was really strange was that we had been using the same code for months without any problems until recently.

So, after talking to Microsoft Support and running a few tests of our own, here's what we found:

1. The "Server ODBC Error" is really a general error message sent back from the server whenever an error occurs while updating or reading the database. It could mean anything - from passing in a bad GUID as an input parameter to an unsuccessful attempt to acquire a lock on the table being updated.

2. We ran SQL Profiler. In our case, the error generated by SQL Server was always #1222, which means that an attempt to acquire a lock has timed out. The error occurred at the point where the channel's properties were being updated in the database.

3. The error does not always occur. And when it does, it disappears when the web service is restarted.

Points 2 and 3 led us to believe that there were locking issues with the database, implying potential problems with the code. But what could be wrong with it? After all, we were using only methods from the MCMS PAPI.

Looking at the code on the web form, we observed a pattern. The form contained two key buttons:
a. Update saves the contents of the form.
b. Publish makes it available online.

When authors work on the web form, they would typically update it before publishing it, clicking Update before Publish.
Update -> Publish

Behind each button is a call that creates a CmsApplicationContext in the required publishing mode. And herein lies the problem. When the CmsApplicationContext created by Update does not release its lock, the CmsApplicationContext of the next transaction, Publish, is not able to acquire it. As a result, the error is raised.

The same error is raised when two authors click on the Publish/Update buttons at the same time.

Okay, so the code was causing this lock contention, but what can be done to prevent it? Looking at the API, there isn't a SignOut/Logout/Close function once a CmsApplicationContext is established. The closest thing we get is the CmsApplicationContext.Dispose() method.

So I added Dispose() for each CmsApplicationContext created at the end of each transaction... and voila! The errors disappeared.
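Here is a minimal sketch of the fix as applied to an update handler. The method and parameter names are placeholders of my own, but the PAPI calls are standard: CmsApplicationContext implements IDisposable, so wrapping it in a using block guarantees that Dispose() runs (and the locks are released) even if the update throws.

```csharp
using System;
using Microsoft.ContentManagement.Publishing;

public static void UpdateChannelStartDate(string channelGuid, DateTime startDate)
{
    // The using block calls Dispose() on exit, releasing database locks
    // even when an exception occurs mid-transaction.
    using (CmsApplicationContext context = new CmsApplicationContext())
    {
        context.AuthenticateAsCurrentUser(PublishingMode.Update);

        Channel channel = context.Searches.GetByGuid(channelGuid) as Channel;
        if (channel != null && channel.CanSetProperties)
        {
            try
            {
                channel.StartDate = startDate;
                context.CommitAll();
            }
            catch
            {
                context.RollbackAll(); // undo partial changes, then rethrow
                throw;
            }
        }
    } // Dispose() runs here
}
```

The same pattern applies to the Publish handler; once each button's context is disposed at the end of its transaction, the next context can acquire the lock cleanly.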

Actually, the whole thing has been documented. Take a look at the documentation for the StartDate property, which says:
This property can only be set if the CanSetProperties property has a value of true for the current User and the PublishingMode is Update. However, even if both of these conditions are satisfied, an attempt to set this property can still fail such as when, for example, the object is being edited concurrently by another user. Therefore, setting this property should be enclosed in appropriate try...catch blocks.

But why oh why doesn't it give a more descriptive error message?

The conclusion is: Calling Dispose() is a *must*, especially when working with the longer-running CmsApplicationContext object.

Thanks to all who have helped solve the problem. Special thanks to Jason Ren, Angus Logan and Stefan Goßner. We just rolled out the latest release of the site today, and it feels good to have the site running smoothly once again :-)

Yuuki the Jack Russell Terrier

A couple of months ago, we adopted a Jack Russell Terrier puppy, named Yuuki. Here's a pic of him at 3 months of age.

Thursday, September 23, 2004

3G Hits Singapore

Finally, the first public test of 3G is available to consumers! SingTel has launched a month-long trial for 150 users.

Thursday, September 16, 2004

Building Websites with Microsoft Content Management Server 2002

Joel Ward, Stefan Goßner and I have written a book that shows you how you can build an MCMS website from the ground up.

A fast-paced and practical tutorial guide for C# developers starting out with MCMS 2002

  • Learn directly from recognized community experts
  • Rapid developer-level tutorials that build logically throughout the book
  • Develops a feature-rich custom site incrementally
  • Tips and tricks from developer newsgroups and online communities

More Details >>

The book is targeted to be available in the October-November time frame. Pre-order now and save up to 30% off the list price.

Friday, September 10, 2004

Using dtSearch and MCMS

So far, every site that we have been asked to build requires a search page. And search is one component that does not ship with MCMS. That's not necessarily a bad thing: it gives us the choice to use something that fits our requirements.

We have looked at some of the solutions that integrate with MCMS. They probably work well. But what we wanted to do was to make use of existing licenses that we have purchased for our other sites. In our case, that is dtSearch.

Our requirements for search were simple:

1. The search engine needs to crawl the entire web site, which contains framed pages, and add meta tags to its index.
2. The results displayed should be filtered according to a user's rights. If he doesn't have rights to a posting, he shouldn't see it in the results.
3. The results can be retrieved from the index using simple queries.
4. It has to be able to index a Windows-authenticated site.

Requirements (1), (3) and (4) could be satisfied by the latest version of dtSearch. The only item that required some customization was (2).

The nice thing about the product is that it is amazingly simple to set up and use. Here's how you can do it too.

Installing dtSearch
First, of course, is to install dtSearch itself. This is fairly simple: run the setup.exe application on the CD and, when it's done installing, apply the latest upgrades downloadable from the web site. If you don't have dtSearch and would like to evaluate the software, you can download a 30-day evaluation copy from the web site.

Indexing Meta Tags
For the search engine to recognize meta tags, we provide dtSearch with a list of meta tags used by the postings.

With dtSearch Desktop opened, select Options : Preferences from the toolbar.

In the Preferences dialog, select Indexing Options : Text fields. Add each meta tag that you need indexed into the list. One important piece of information added at this point is the GUID of each posting. Indexing the GUID allows us to use it to get instances of the posting later when coding the search page.

Later, complex queries that filter postings based on meta tags can be written.

Using Windows Authentication
Our MCMS site uses Windows authentication, so we had to specify a user name and password for the spider to use when crawling the site. We chose an account with subscriber access to all the postings that needed to be indexed. The user name and password were entered into the Indexing Options : Spider dialog (part of the Preferences dialog).

One drawback to this particular dialog is that it does not mask the password: whatever you enter appears as clear text. Fortunately, we had a shared account used solely for crawling, so this didn't bother us.

Defining the Site Map
Not all postings on the site were linked from an index page. In order to have these postings crawled by the spider, we created a single HTML page that contained links to all postings on the site. This was done by a recursive script coded using the MCMS PAPI.
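The recursive script can be sketched as follows. The method name and the use of a StreamWriter are my own choices for illustration; the PAPI members (Channel.Postings, Channel.Channels, Posting.Url) are standard. It simply walks the channel tree and emits one link per posting.

```csharp
using System.IO;
using Microsoft.ContentManagement.Publishing;

public static void WriteSiteMap(Channel channel, StreamWriter writer)
{
    // Write a link for every posting in this channel so the
    // spider has a direct path to each page.
    foreach (Posting posting in channel.Postings)
    {
        writer.WriteLine("<a href=\"" + posting.Url + "\">"
                         + posting.DisplayName + "</a><br>");
    }

    // Recurse into sub-channels to cover the whole site.
    foreach (Channel child in channel.Channels)
    {
        WriteSiteMap(child, writer);
    }
}
```

Starting the call with the root channel (e.g. CmsHttpContext.Current.RootChannel) and wrapping the output in a bare HTML page gives the spider a single entry point to every posting.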

Creating the Search Index
Next, we create a search index. From dtSearch Desktop, select Index : Create Index. Specify the name of the index and its location on disk, and the empty index will be created.

Crawling the entire web site for the first time
Now that the search engine had been configured, we were ready to build the index.
Because postings don't exist as files in folders, the only way for them to be added to the index is to have a spider crawl them.

1. From the desktop, select Index : Update Index.
2. Select Add web...
3. Set the Starting URL for Spider to point to the sitemap created earlier, or to the index page of your site.
4. Set the crawl depth. In our case, a crawl depth of at least 2 was required for the spider to crawl the postings in a framed site.
5. Click OK.

And that completes the configuration. Other options are available, but these are the only ones required.

Click the Start Indexing button on the right of the dialog and watch the spider go!

The time it takes to index an entire web site depends on a variety of factors. We run ours on a relatively low-end server, and the site has roughly 20,000 postings; a full job takes approximately 6 hours to complete. On a higher-end server, indexing the same number of pages takes significantly less time, about 3 hours.

Building the Search web page
Another good thing about dtSearch is the developer's API. To work with it, you must download and install dtSearch Developer, which contains the libraries and documentation.

You can code in ASP.NET (C# or VB.NET) and perform just about any kind of query. We have done keyword matches, date comparisons, category filters and so on. I won't go into details here, but samples can be found online.

Here, requirement (2) was satisfied by checking whether Searches.GetByGuid() returned a null value. If it did, the user did not have access to the posting, and the posting was not added to the search results.
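The rights filter reduces to a few lines. This is a sketch only: the guid variable stands for the GUID meta tag pulled from each dtSearch hit, and results is a hypothetical collection holding the filtered hits. The key PAPI behavior is that GetByGuid(), called under the current user's context, returns null for items the user has no rights to.

```csharp
// guid: the posting GUID stored as a meta tag in the dtSearch index.
// CmsHttpContext.Current runs under the current (subscriber) user,
// so GetByGuid() returns null for postings he cannot see.
Posting posting = CmsHttpContext.Current.Searches.GetByGuid(guid) as Posting;
if (posting != null)
{
    results.Add(posting); // user has rights - include in search results
}
// else: no rights (or posting no longer exists) - silently drop the hit
```

Running this check over every hit returned by the dtSearch index yields a result list that each user is actually entitled to see.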

All in all, the conclusion is: you can use just about any spider-based search engine to crawl MCMS web sites. It's more important to find one that has the features you require at a price that fits your budget.