Fixing the Timer Service when everything breaks down

This is a pure troubleshooting post (that will save you days!) if you experience any of the following problems:

  • You have trouble deploying solutions to your farm (more than one node): nothing happens and the deployment never completes
    • You may be able to complete deployments by executing “stsadm –o execadmsvcjobs” on every server in the farm
  • You experience frequent CPU spikes of 100% every few minutes on your frontend servers and they are all but unresponsive at those times
  • You find one of the following lines in the ULS logs (the SharePoint logs)
    • OWSTimer.exe, w3wp.exe (Id: 888k) : “File system cache monitor encountered error, flushing in memory cache”
    • OWSTimer.exe (Id: 8xqx): “Exception in RefreshCache. Exception message :Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.”
    • OWSTimer.exe (5utx): “The timer service could not initialize its configuration, please check the configuration database. Will retry later.”
  • “Some” of the administrative pages in the central administration occasionally fail with the error
    • “Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information”
  • The timer and topology cache is never updated (see next section…)
  • PSConfig (GUI wizard or not) fails to configure your server and the log file reveals the “LoaderExceptions” error above.

The above problems will probably affect all servers in the farm.

Note: The fusion log troubleshooting will be applicable to all .NET loader errors, not just SharePoint specific stuff.

First shot: Clear the Config Cache

A little known fact is that SharePoint maintains a disk (and memory) based configuration cache on every server that contains the topology information and timer job definitions. Go have a look at “C:\Documents and Settings\All Users\Application Data\Microsoft\SharePoint\Config\<guid>\”.

Sometimes old stuff can get stuck in there and you can kick start the timer by clearing it.

The procedure is simple (do it for every server):

  1. Stop the Administration and Timer services
  2. Delete all XML files in the directory (not the folder itself)
  3. Open “Cache.ini” and write the number 1 instead of the existing number (you might want to make a note of it)
  4. Start the services again
  5. Wait for a minute or two and see if the folder starts to fill up with xml files. It will likely contain fewer than before you cleared it.
  6. Check the cache.ini file. If it’s accessible and the number is considerably greater than 1, your cache has been properly initialized and chances are that your problems are now fixed. It didn’t fix my problem, so you may need to read on… (if you didn’t have the “888k” log entry mentioned above you probably have it now)

The above procedure is grabbed from tomblog’s excellent post (which will also help if you actually did delete the folder).

[Updated] Or you can run this batch file. [/Updated]
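If the batch file isn’t at hand, the core of steps 2–3 can be sketched as a small script. This is only a sketch: the cache path contains a GUID specific to your farm, and the service stop/start steps (steps 1 and 4) are noted as comments because you should run them with net stop / net start on the server itself.

```shell
# clear_config_cache DIR
# Deletes the cached topology xml files and resets Cache.ini to 1 so the
# timer service rebuilds the cache from the configuration database.
# On a real server, first stop the "Windows SharePoint Services Timer"
# and "Windows SharePoint Services Administration" services, and start
# them again afterwards (net stop / net start).
clear_config_cache() {
    local dir="$1"
    rm -f "$dir"/*.xml              # step 2: remove the xml files, keep the folder
    printf '1' > "$dir/Cache.ini"   # step 3: overwrite the counter with 1
}

# Example (the <guid> part differs per farm):
# clear_config_cache "/c/Documents and Settings/All Users/Application Data/Microsoft/SharePoint/Config/<guid>"
```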

If the procedure didn’t fix the problem, you’ll notice that the xml files are updated (timestamp) every few minutes, coinciding with the CPU spikes.

Second shot: Digging in (Using the fusion log)

The core problem in my case was that some .NET class/assembly could not be loaded as the message “Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information” strongly hints. It may be a little surprising how many times an assembly, completely unrelated to the task at hand (e.g. deployment), is actually loaded. To fix the problem all we have to do is identify the assembly and “make it available”.

Enabling Fusion Log

To debug the failed assembly bindings you need to look into the fusion log. Fusion is an ancient (.NET 1.0) codename for the GAC or assembly loader (I think).

To enable the log you need to add/change three keys in the registry:

HKLM\Software\Microsoft\Fusion\LogPath    REG_SZ    (path to local directory)

HKLM\Software\Microsoft\Fusion\LogFailures    DWORD     1

HKLM\Software\Microsoft\Fusion\EnableLog    DWORD    1
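For convenience, the same three values can be captured in a .reg file (a sketch; the LogPath value here, C:\FusionLogs\, is just an example directory, and it must exist before any logging happens):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Fusion]
"LogPath"="C:\\FusionLogs\\"
"LogFailures"=dword:00000001
"EnableLog"=dword:00000001
```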

It is not strictly necessary to restart anything, but I recommend that you now restart your timer service in order for it to log any binding errors that occur. Try starting it a couple of times with 10 minutes in between. That should reveal any binding error patterns.

(Refer to Brad Abrams for more info)

Interpreting the Log

Finally we will use the “fuslogvw.exe” program that is part of the .NET SDK to view the actual logs. Your development machine will have this file; copy the executable to your server.

It is not a very good program. It gets the job done, but it’s hard to figure out: you can’t order errors by date, you can’t resize the window, etc.

Hopefully your window will contain far fewer entries (in my case the offending entry is the highlighted one; it also failed with similar errors from the psconfig wizard and powershell).

Look for patterns using the timestamps. Did you get a group of binding errors a few minutes after you started the timer service? Or are there just some errors that look spurious? I recommend skipping the core internal Microsoft DLLs to begin with (msvcm80 is version 8 of the C runtime library that .NET 2.0 uses).

So what goes into the log? Every assembly binding failure, which is (at least):

  • If an assembly cannot be located (it will show you where it searched and any assembly redirects)
  • If the dll load methods within the assembly throw an exception (e.g. a dependent assembly could not be found), it sadly looks exactly as if the dll file could not be found (other exceptions are possible but unlikely for .NET people). One way to distinguish the two is to look for the line “LOG: GAC Lookup was unsuccessful.”; if it is not there, then the assembly was (probably) found in the GAC and a dependent assembly failed.

The last bullet means that if assembly A depends on B and B cannot be found, then the bindings for both A and B fail (at the same time). To distinguish the two, I recommend that you look at the “Calling Assembly” in the bind log.

In my case:

  1. The Nintex.Workflow dll failed to load and the calling assembly was Microsoft.SharePoint
  2. The Nintex.Workflow.Charting.dll failed to load and the calling assembly was Nintex.Workflow. Aha, so the Nintex.Workflow.dll was actually found but failed to find the dependent Nintex.Workflow.Charting assembly.
  3. Found the assembly on a development machine and copied it to the server. Retried. Added another missing assembly
  4. And everything worked!

I should stress that neither the error type nor the troubleshooting is Nintex workflow specific.

The bind log failure from step 2 above was:

*** Assembly Binder Log Entry (1/13/2009 @ 4:34:15 PM) ***

The operation failed.

Bind result: hr = 0x80070002. The system cannot find the file specified.

Assembly manager loaded from: C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\mscorwks.dll

Running under executable C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\STSADM.EXE

— A detailed error log follows.

=== Pre-bind state information ===

LOG: User = ….

LOG: DisplayName = Nintex.Charting, Version=, Culture=neutral, PublicKeyToken=913f6bae0ca5ae12


LOG: Appbase = file:///C:/Program Files/Common Files/Microsoft Shared/web server extensions/12/BIN/

LOG: Initial PrivatePath = NULL

LOG: Dynamic Base = NULL

LOG: Cache Base = NULL

LOG: AppName = NULL

Calling assembly : Nintex.Workflow, Version=, Culture=neutral, PublicKeyToken=913f6bae0ca5ae12.


LOG: This bind starts in default load context.

LOG: No application configuration file found.

LOG: Using machine configuration file from C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\config\machine.config.

LOG: Post-policy reference: Nintex.Charting, Version=, Culture=neutral, PublicKeyToken=913f6bae0ca5ae12

LOG: GAC Lookup was unsuccessful.

LOG: Attempting download of new URL file:///C:/Program Files/Common Files/Microsoft Shared/web server extensions/12/BIN/Nintex.Charting.DLL.

LOG: Attempting download of new URL file:///C:/Program Files/Common Files/Microsoft Shared/web server extensions/12/BIN/Nintex.Charting/Nintex.Charting.DLL.

LOG: Attempting download of new URL file:///C:/Program Files/Common Files/Microsoft Shared/web server extensions/12/BIN/Nintex.Charting.EXE.

LOG: Attempting download of new URL file:///C:/Program Files/Common Files/Microsoft Shared/web server extensions/12/BIN/Nintex.Charting/Nintex.Charting.EXE.

LOG: All probing URLs attempted and failed.

Note: To grab a dll from the GAC on one machine you need to use a shell (cmd, bash or powershell) to go into the c:\windows\assembly\gac_msil\… folder structure and copy it; Explorer shows a special view of the GAC that hides the actual files.
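As a sketch of that shell trip (bash-style; the GAC root and assembly name below are examples, not from my farm), the layout under GAC_MSIL is <name>\<version>__<publickeytoken>\<name>.dll, so a small helper can locate and copy the file:

```shell
# copy_from_gac NAME GACROOT DEST
# Locates NAME.dll inside the GACROOT/<name>/<version>__<token>/ structure
# and copies the first match to DEST.
copy_from_gac() {
    local name="$1" gacroot="$2" dest="$3"
    local dll
    dll=$(find "$gacroot/$name" -type f -name "$name.dll" 2>/dev/null | head -n 1)
    [ -n "$dll" ] && cp "$dll" "$dest"
}

# Example (hypothetical paths):
# copy_from_gac Nintex.Workflow.Charting "/c/WINDOWS/assembly/GAC_MSIL" /tmp
```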

Why did it go wrong in the First Place?

It’s always a useful exercise to figure out what went wrong and why instead of just fixing the error at hand.

The root cause of the error was that the DLLs in question were distributed through a wsp solution file and referenced by other code. When that particular solution was undeployed the DLLs were removed (from the GAC) and the new updated wsp file only contained the new versions. Suddenly something went missing 😦

It can also happen if you use auto wsp packaging procedures like vbs scripts or the ubiquitous WSPBuilder. What if one of the SharePoint system DLLs is marked as “copy local” in one developer’s local environment and automatically included in the wsp file? It will deploy with no problems, but when it is undeployed everything stops, because the SharePoint system DLL just got removed from the GAC. Oops. Prevention is obviously to use a build server to make clean builds, to educate your people and to have the means to troubleshoot when it didn’t work out.

Automatic Configuration of Asp.NET Ajax Extension 1.0

I’m currently testing the world famous RadEditor for MOSS from Telerik and came across a rather annoying deployment issue. Don’t get me wrong: the RadEditor is a fantastic product; however it relies on the Asp.NET Ajax Extension 1.0 from Microsoft, which really is a royal pain to install and configure.

Installation of the Ajax extension is fairly straightforward; just run through the installer.

Configuring a given website (here it’s a SharePoint site with a lengthy web.config file to begin with) is a rather cumbersome task, described in detail at Mike Ammerlaan’s nice blog post. In total there are about 20 additional tags that need to be inserted into your web.config file in just the right places.

It’s a tedious and error-prone process that I’m unwilling to go through with 10+ websites replicated in 2 different environments. Googling for an hour revealed no real solution; it seems that no one has properly automated the process.

I’ve used a small tool, ConfigMerge, with a bit of powershell to do the magic. I can now just run a script that will deploy the configuration settings to all the web.config files at my SharePoint sites (it has to run on every server) 🙂

Read on…

Web.Config Modifications

I’ve gathered the required modifications in an xml file that resembles the structure of a “normal” web.config file, which is then merged into the real web.config files. The file is “ajax35.config” in the downloadable zip file.

Please note that the following applies to Asp.NET Ajax Extensions 1.0 for .NET 3.5 web sites; slight variations are likely for 2.0 and 3.0 sites.

Auto Configuration

[Updated 23-12-2008]

The key component in the auto configuration is the ConfigMerge program found on CodeProject, which merges the ajax config file with the existing web.config file. If the nodes are already there they will be updated, not duplicated.

ConfigMerge uses a small list of attributes to identify similar nodes which I’ve had to extend just a bit for our needs (look for my comment on the CodeProject page near the bottom). The point is that it is now idempotent – if you run it more than once, no changes are performed.
I’ve included both the new binary and the modified source in the zip package.

The usage of the ConfigMerge utility is:

ConfigMerge.exe ExistingConfigFile ChangesConfigFile OutputFile

However I prefer to use powershell to call it, and perform (necessary) backup of web.config.

Note: If you don’t like powershell, stop reading now and just use the above line in your own bat files, and you’ll be good.


To get started with powershell read my old post, though this time I’m actually not accessing the SharePoint API.

Usage for the EnableAjax.ps1 script is (from within powershell):

EnableAjax.ps1 WebConfigPath [AjaxConfigFile]

Alternatively from a batch file/cmd use:

Powershell -command EnableAjax.ps1 WebConfigPath [AjaxConfigFile]

WebConfigPath is the path where the script should search for web.config files. It’ll go recursively down from the specified directory. If you only want to work on a single web application then use the root dir for that web application; if you want to run on all SharePoint applications you can use something like “c:\inetpub\wwwroot\wss\virtualdirectories”.

AjaxConfigFile is the name of the config file you want to merge in. It defaults to “ajax35.config” in the same dir as EnableAjax.ps1.

When the script is about to fix a web.config file it performs a backup first named with the current time, e.g. “web_ajax_2008_12_21 21_06_09.backup”.

Finally here is the actual script (EnableAjax.ps1):

# Apply Asp.NET Ajax Extensions to a number of web.config files easily.
# 21/12-2008 Søren L. Nielsen
$ajaxconfig = "./ajax35.config"
$path = "."
if( $args.length -eq 2 ){
    $path = $args[0]
    $ajaxconfig = $args[1]
} elseif ( $args.length -eq 1 ){
    $path = $args[0]
} else {
    Write "Usage: ./EnableAjax.ps1 [WebConfigPath [AjaxConfigFile]]"
    Write ""
    Write "WebConfigPath - Path where the script should look for web.config files"
    Write "                Script will search for web.config files recursively and "
    Write "                modify all it finds."
    Write ""
    Write ("AjaxConfigFile - Default is " + $ajaxconfig + " set it to a xml config")
    Write "                 file with the same structure as a 'normal' web.config file"
    Write "                 but only containing the nodes that should be added/updated"
    Write "                 in the web config files."
    Write "                 The script is built for Asp.Net Ajax extensions, it will however"
    Write "                 be useful in many settings, e.g. manipulating dev, QA, Prod "
    Write "                 environment settings."
    exit
}
Write ("Modifying web.configs found in " + $path)
foreach( $f in Get-ChildItem $path -filter web.config -recurse ){
    $realname = $f.FullName
    $backup = $f.FullName.ToLower().Replace(".config", "_ajax_" +
        [DateTime]::Now.ToString("yyyy_MM_dd HH_mm_ss") + ".backup")
    Write ("Updating " + $f.FullName )
    Write ("Backup file " + $backup )
    $f.MoveTo( $backup )
    ./ConfigMerge.exe $backup $ajaxconfig $realname
}

As the web.config is fairly critical for your application, you should be thorough when you test this script. I recommend using WinMerge to compare the before and after config files.

Or you could just trust me and know that I take absolutely no responsibility for any harm that comes your way 😉

[Updated 23-12-2009] Note: You need to run the script for every server; however if you use UNC paths as input to the script you can run it for every server from just one. I recommend that you only install powershell on your backend servers, not your frontends (if you’re on Server 2003).

Note 2: You need to rerun the script if you provision new web applications, or if you add a new server to the farm (on that server). The web.config modifications performed by SharePoint should not interfere with the script.

Download (Source, Scripts and Binaries)

Auto Asp Net Ajax Config

Final Notes

Please note that the files provided here only configure Asp.NET Ajax Extensions 1.0; they do not install or configure the Telerik RadEditor. The RadEditor requires two more keys documented in the install guide (and a number of other small steps), which you can easily add through the SharePoint API (WebConfigModifications).

I should stress again that the version of ConfigMerge used here has been modified a bit. If you use the unmodified version you risk that some of the existing tags are “reused”/changed for the Ajax configuration and then SharePoint might well be in trouble.

“Not enough storage” event log error

[Note: Updated Feb. 22 2008, solution at the bottom] 

I’m responsible for a couple of SharePoint 2007 (MOSS) farms where all SharePoint servers showed a number of annoying errors in the application event log.

Every minute the following three errors show up in the event log:

Event Type: Error
Event Source: Windows SharePoint Services 3
Event Category: Timer
Event ID: 6398
Date: 3/6/2007
Time: 11:47:58 AM
User: N/A
The Execute method of job definition Microsoft.Office.Server.Administration.ApplicationServerAdministrationServiceJob (ID 371548ff-a05e-41f0-90da-6f2d25fbb483) threw an exception. More information is included below.

Not enough storage is available to process this command.

For more information, see Help and Support Center at


Event Type: Error
Event Source: Office SharePoint Server
Event Category: Office Server Shared Services
Event ID: 7076
Date: 3/6/2007
Time: 11:47:58 AM
User: N/A
An exception occurred while executing the Application Server Administration job.

Message: Not enough storage is available to process this command.
Techinal Support Details:
System.Runtime.InteropServices.COMException (0x80070008): Not enough storage is available to process this command.

Server stack trace:
at System.DirectoryServices.DirectoryEntry.Bind(Boolean throwIfFail)
at System.DirectoryServices.DirectoryEntry.Bind()
at System.DirectoryServices.DirectoryEntry.get_IsContainer()
at System.DirectoryServices.DirectoryEntries.CheckIsContainer()
at System.DirectoryServices.DirectoryEntries.Find(String name, String schemaClassName)
at Microsoft.SharePoint.AdministrationOperation.Metabase.MetabaseObjectCollection`1.Find(String name)
at Microsoft.SharePoint.AdministrationOperation.Metabase.MetabaseObjectCollection`1.get_Item(String name)
at Microsoft.SharePoint.AdministrationOperation.SPProvisioningAssistant.ProvisionIisApplicationPool(String name, ApplicationPoolIdentityType identityType, String userName, SecureString password, TimeSpan idleTimeout, TimeSpan periodicRestartTime)
at Microsoft.SharePoint.AdministrationOperation.SPAdministrationOperation.DoProvisionIisApplicationPool(String name, Int32 identityType, String userName, String password, TimeSpan idleTimeout, TimeSpan periodicRestartTime)
at System.Runtime.Remoting.Messaging.StackBuilderSink._PrivateProcessMessage(IntPtr md, Object[] args, Object server, Int32 methodPtr, Boolean fExecuteInContext, Object[]& outArgs)
at System.Runtime.Remoting.Messaging.StackBuilderSink.PrivateProcessMessage(RuntimeMethodHandle md, Object[] args, Object server, Int32 methodPtr, Boolean fExecuteInContext, Object[]& outArgs)
at System.Runtime.Remoting.Messaging.StackBuilderSink.SyncProcessMessage(IMessage msg, Int32 methodPtr, Boolean fExecuteInContext)


Event Type: Error
Event Source: Office SharePoint Server
Event Category: Office Server Shared Services
Event ID: 7076
Date: 3/6/2007
Time: 11:47:58 AM
User: N/A
An exception occurred while executing the Application Server Administration job.

Message: Not enough storage is available to process this command.

Techinal Support Details:
System.Runtime.InteropServices.COMException (0x80070008): Not enough storage is available to process this command.

Server stack trace:

Sometimes they are replaced with another three that have the same event id and source, but with the error message being “Old format or invalid type library” instead. The stack trace will differ a little bit.

Obviously it’s a timer job scheduled every minute that fails. The one in question is the “Application Server Administration Service Timer Job”, which apparently is in charge of ensuring that the IIS application pools are in sync (or something like that). Nothing in the farm seems to be broken by the job failures.

And quite annoyingly: They all disappear when you reboot the server and will not reappear until after “a few days” in my case. I’m sure that the re-surface time will differ between systems.

I’ve read many proposed solutions for this error, including adding RAM, disk etc., often claimed to work. I seriously doubt that any of these solutions actually work. My SharePoint servers are equipped with no less than 6 GB of RAM with default settings for all the application pools (I know that the RAM is hardly utilized with these settings, but in my world it’s sometimes cheaper to go for a “standard server” where you only utilize 70% than one customized for your needs), plenty of disk etc.

As a side note, you can also find references to this error in connection with SQL Server 2005; the fix below may work for that as well.

The solution turns out to be quite easy: the patch you are looking for is KB923028. It is an update for an error in the .NET 2.0 remoting subsystem and actually has nothing to do with SharePoint at all. Reading the description, it is quite hard to glean that it’ll solve your problem. MS support pointed me to it and it seems to work just fine.

Caveat: I’ve seen multiple versions of this file; the one I have working (until proven wrong) is “NDP20-KB923028-X86.exe” (1,936,224 bytes). I’ve tested another one with the same name, with a filesize of 1,963,440 bytes, that didn’t work.

The server has been chugging along for some time now (about a week) without the bug so let’s hope it’s all done.

That didn’t work…

Update: Finally a resolution

You need to look at hotfix KB946517, which will fix the problem. It is a private hotfix, so you’ll need to contact MS to acquire it.

The servers have been running for about a week now, they are still ok and other people are also reporting success.

For the third time I’m confident that the problem has been solved – guess I don’t learn from experience 😉

Propagate Site Content Types to List Content Types

[Aka: Make Content Type Inheritance Work for XML Based Content types]

[Note:Updated Nov 6]

If you are working with XML based content types, you’ll sooner or later fall into a trap well hidden by Microsoft. Simply put, content type inheritance doesn’t work for XML based content types. SharePoint does not check what fields you’ve added/deleted/changed since you last deployed your content type feature, so you don’t have the “luxury” (it bloody well should work!) of choosing whether or not to propagate changes down to inherited content types, as you do in the web interface or object model.

I found MS mention it briefly here.

The Role of List Content Types

I’ll take a small detour here to explain the concept of list content types. Whenever you assign a content type to a list (e.g. the pages document library for publishing webs) a new list content type is created which inherits from the site content type that you are assigning to the list and with the same name.

Actually the inheritance copies the site content type definition to the list content type in its entirety and assigns an “inherited” content type id to the new list content type.

If you later modify the site content type, through the web or object model, you have the option of propagating those changes to inherited content types, which in particular includes all the list content types on the site.

Don’t Disconnect the Content Type

Another small detour before I present what to do about this mess. It should be stressed that any modification of the site content type through the web or object model will disconnect the site content type from the underlying xml metadata file. If that happens you may reestablish the link as described here. This is very bad news. It not only means that you should be careful about what you do to your content types, but also that there is no simple way to propagate the changes in code: we can’t just update the site content type (re-add fields) and have the changes propagate through the object model. 😦

The Code to Propagate

Finally I’m ready to present the code I used to propagate changes to my content types. 🙂

The procedure is:

  1. Locate the site content type in question (run it multiple times if needed)
  2. Start at the root web site
    1. Go through all lists on the web
    2. If the list is associated with the site content type (actually the inherited list content type)
      1. Compare every field on the site content type with the list content type
      2. Add, remove or change the field in question on the list content type
    3. Go recursively through every subweb and continue from step a

I created the following code as a new extension to stsadm, called like this:

stsadm -o cbPropagateContentType -url <site collection url> -contenttype <contenttype name> [-verbose] [-removefields] [-updatefields]

For instance:

stsadm -o cbPropagateContentType -url http://localhost -contenttype "My Article Page" -verbose

The “removefields” switch specifies whether or not fields found in the list content type that are not in the site content type should be removed or not. Default is not. New fields in the site content type will always be added to the list content types.

Note: I’ve not tried to create an “update” field. I’ve not had a use for that yet and it will also require considerably more testing to ensure that it works correctly. You really don’t want to break your list content types in case of errors…

Updated (Nov 6): An update option has now been added; you should consider that option beta.

Note 2: If the job stops prematurely no harm is done. Restart it and it will continue from where it stopped – it will examine and skip the webs/lists that it already has processed.

Finally something tangible:

Save the following as “stsadmcommands.CBPropagateContentType.xml” into the “\Config” folder in the root of the SharePoint install folder (remember to update the assembly reference to whatever you compile the code into):

<?xml version="1.0" encoding="utf-8" ?>
<commands>
  <command name="cbpropagatecontenttype"
           class="Carlsberg.SharePoint.Administration.STSAdm.CBPropagateContentType,
                  Carlsberg.SharePoint.Administration, Version=, Culture=neutral,
                  PublicKeyToken=" />
</commands>

And finally the code you need to compile to an assembly that the xml file should specify:

Updated (Nov 6): Code has been updated a bit. Some mistakes with display name/internal name have been fixed and the update option has been added. I’m not yet satisfied with the testing of the update method so consider it to be beta.

using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Text;
using Microsoft.SharePoint;
using Microsoft.SharePoint.StsAdmin;

namespace Carlsberg.SharePoint.Administration.STSAdm {
    /// A custom STSAdm command for propagating site content types to list
    /// content types.
    /// The code is provided as is, I don't take any responsibilty for
    /// any errors or data loss you might encounter.
    /// Use freely with two conditions:
    /// 1. Keep my name in there
    /// 2. Report any bugs back to
    /// Enjoy
    /// Søren L. Nielsen
    class CBPropagateContentType : ISPStsadmCommand {
        #region Input parameters
        private string providedUrl;
        private string contentTypeName;
        private bool removeFields = false;
        private bool verbose = false;
        private bool updateFields = false;

        private bool UpdateFields {
            get { return updateFields; }
            set { updateFields = value; }
        }

        private bool Verbose {
            get { return verbose; }
            set { verbose = value; }
        }

        private bool RemoveFields {
            get { return removeFields; }
            set { removeFields = value; }
        }

        private string ContentTypeName {
            get { return contentTypeName; }
            set { contentTypeName = value; }
        }

        private string ProvidedUrl {
            get { return providedUrl; }
            set { providedUrl = value; }
        }
        #endregion

        /// Runs the specified command. Called by STSADM.
        /// The command.
        /// The key values.
        /// The output.
        public int Run(string command, StringDictionary keyValues,
                       out string output) {
            //Parse input
            // make sure all settings are valid
            if (!GetSettings(keyValues)) {
                output = "Required parameters not supplied or invalid.";
                return -1;
            }

            SPSite siteCollection = null;
            SPWeb rootWeb = null;

            try {
                // get the site collection specified
                siteCollection = new SPSite(ProvidedUrl);
                rootWeb = siteCollection.RootWeb;

                //Get the source site content type
                SPContentType sourceCT =
                    rootWeb.AvailableContentTypes[ContentTypeName];
                if (sourceCT == null) {
                    throw new ArgumentException("Unable to find "
                        + "contenttype named \"" + ContentTypeName + "\"");
                }

                // process the root website
                ProcessWeb(rootWeb, sourceCT);

                output = "Operation successfully completed.";
                Log( output, false );
                return 0;
            }
            catch (Exception ex) {
                output = "Unhandled error occured: " + ex.Message;
                Log(output, false);
                return -1;
            }
            finally {
                if (rootWeb != null) {
                    rootWeb.Dispose();
                }
                if (siteCollection != null) {
                    siteCollection.Dispose();
                }
            }
        }

        /// Go through a web, all lists and sync with the source content
        /// type.
        /// Go recursively through all sub webs.
        private void ProcessWeb(SPWeb web, SPContentType sourceCT) {
            //Do work on lists on this web
            Log("Processing web: " + web.Url);

            //Grab the list ids first, to avoid messing up an enumeration
            // while looping through it.
            List<Guid> lists = new List<Guid>();
            foreach (SPList list in web.Lists) {
                lists.Add(list.ID);
            }

            foreach (Guid listId in lists) {
                SPList list = web.Lists[listId];

                if (list.ContentTypesEnabled) {
                    Log("Processing list: " + list.ParentWebUrl + "/"
                         + list.Title);

                    //The list content type carries the same name as the
                    // site content type it inherits from
                    SPContentType listCT = list.ContentTypes[sourceCT.Name];
                    if (listCT != null) {
                        Log("Processing content type on list:" + list);

                        if (UpdateFields) {
                            UpdateListFields(list, listCT, sourceCT);
                        }

                        //Find/add the fields to add
                        foreach (SPFieldLink sourceFieldLink in
                                               sourceCT.FieldLinks) {
                            if (!FieldExist(sourceCT, sourceFieldLink)) {
                                Log("Failed to add field "
                                    + sourceFieldLink.DisplayName + " on list "
                                    + list.ParentWebUrl + "/" + list.Title
                                    + " field does not exist (in .Fields[]) on "
                                    + "source content type", false);
                            }
                            else {
                                if (!FieldExist(listCT, sourceFieldLink)) {
                                    //Perform double update, just to be safe
                                    // (but slow)
                                    Log("Adding field \""
                                       + sourceFieldLink.DisplayName
                                       + "\" to contenttype on "
                                       + list.ParentWebUrl + "/" + list.Title,
                                       false);
                                    if (listCT.FieldLinks[sourceFieldLink.Id]
                                                                      != null) {
                                        listCT.FieldLinks.Delete(
                                            sourceFieldLink.Id);
                                        listCT.Update();
                                    }
                                    listCT.FieldLinks.Add(new SPFieldLink(
                                        sourceCT.Fields[sourceFieldLink.Id]));
                                    listCT.Update();
                                }
                            }
                        }

                        if (RemoveFields) {
                            //Find the fields to delete
                            //WARNING: this part of the code has not been
                            // adequately tested (though
                            // what could go wrong? ;-) ... )

                            //Copy collection to avoid modifying enumeration
                            // as we go through it
                            List<SPFieldLink> listFieldLinks =
                                                  new List<SPFieldLink>();
                            foreach (SPFieldLink listFieldLink in
                                                     listCT.FieldLinks) {
                                listFieldLinks.Add(listFieldLink);
                            }

                            foreach (SPFieldLink listFieldLink in
                                                        listFieldLinks) {
                                if (!FieldExist(sourceCT, listFieldLink)) {
                                    Log("Removing field \""
                                       + listFieldLink.DisplayName
                                       + "\" from contenttype on :"
                                       + list.ParentWebUrl + "/"
                                       + list.Title, false);
                                    listCT.FieldLinks.Delete(listFieldLink.Id);
                                    listCT.Update();
                                }
                            }
                        }
                    }
                }
            }

            //Process sub webs
            foreach (SPWeb subWeb in web.Webs) {
                ProcessWeb(subWeb, sourceCT);
                subWeb.Dispose();
            }
        }

        /// Updates the fields of the list content type (listCT) with the
        /// fields found on the source content type (sourceCT).
        private void UpdateListFields(SPList list, SPContentType listCT,
                                      SPContentType sourceCT) {
            Log("Starting to update fields ", false);
            foreach (SPFieldLink sourceFieldLink in sourceCT.FieldLinks) {
                //Has the field changed? If not, continue.
                if (listCT.FieldLinks[sourceFieldLink.Id] != null
                    && listCT.FieldLinks[sourceFieldLink.Id].SchemaXml
                       == sourceFieldLink.SchemaXml) {
                    Log("Doing nothing to field \"" + sourceFieldLink.Name
                        + "\" from contenttype on :" + list.ParentWebUrl + "/"
                        + list.Title, false);
                    continue;
                }
                if (!FieldExist(sourceCT, sourceFieldLink)) {
                    Log("Doing nothing to field: " + sourceFieldLink.DisplayName
                        + " on list " + list.ParentWebUrl
                        + "/" + list.Title + " field does not exist (in .Fields[])"
                        + " on source content type", false);
                    continue;
                }

                if (listCT.FieldLinks[sourceFieldLink.Id] != null) {
                    //Remove the old field link before re-adding it
                    Log("Deleting field \"" + sourceFieldLink.Name
                        + "\" from contenttype on :" + list.ParentWebUrl + "/"
                        + list.Title, false);
                    listCT.FieldLinks.Delete(sourceFieldLink.Id);
                    listCT.Update();
                }

                Log("Adding field \"" + sourceFieldLink.Name
                    + "\" from contenttype on :" + list.ParentWebUrl
                    + "/" + list.Title, false);

                listCT.FieldLinks.Add(new SPFieldLink(
                    sourceCT.Fields[sourceFieldLink.Id]));
                //Set displayname, not set by previous operation
                listCT.FieldLinks[sourceFieldLink.Id].DisplayName
                    = sourceCT.FieldLinks[sourceFieldLink.Id].DisplayName;
                listCT.Update();
            }
            Log("Done updating fields ");
        }

        private static bool FieldExist(SPContentType contentType,
                                       SPFieldLink fieldLink) {
            try {
                //will throw exception on missing fields
                return contentType.Fields[fieldLink.Id] != null;
            }
            catch (Exception) {
                return false;
            }
        }

        private void Log(string str, bool verboseLevel) {
            if (Verbose || !verboseLevel) {
                Console.Out.WriteLine(str);
            }
        }

        private void Log(string str) {
            Log(str, true);
        }
        /// Parse the input settings
        private bool GetSettings(StringDictionary keyValues) {
            try {
                ProvidedUrl = keyValues["url"];
                //test the url
                new Uri(ProvidedUrl);

                ContentTypeName = keyValues["contenttype"];
                if (string.IsNullOrEmpty(ContentTypeName)) {
                    throw new ArgumentException("contenttype missing");
                }

                if (keyValues.ContainsKey("removefields")) {
                    RemoveFields = true;
                }

                if (keyValues.ContainsKey("verbose")) {
                    Verbose = true;
                }

                if (keyValues.ContainsKey("updatefields")) {
                    UpdateFields = true;
                }
                return true;
            }
            catch (Exception ex) {
                Console.Out.WriteLine("An error occurred in retrieving the"
                    + " parameters. \r\n(" + ex + ")\r\n");
                return false;
            }
        }
        /// Output help to console
        public string GetHelpMessage(string command) {
            StringBuilder helpMessage = new StringBuilder();

            // syntax
            helpMessage.AppendFormat("\tstsadm -o {0}{1}{1}", command,
                                     Environment.NewLine);
            helpMessage.Append("\t-url <url of site collection>"
                               + Environment.NewLine);
            helpMessage.Append("\t-contenttype <name of site content type>"
                               + Environment.NewLine);
            helpMessage.Append("\t[-removefields]" + Environment.NewLine);
            helpMessage.Append("\t[-updatefields]" + Environment.NewLine);
            helpMessage.Append("\t[-verbose]" + Environment.NewLine);

            // description
            helpMessage.AppendFormat("{0}This action will propagate a site"
                + " content type to all list content types within the "
                + "site collection.{0}Information propagated is field "
                + "addition/removal.{0}{0}", Environment.NewLine);
            helpMessage.AppendFormat("{0}Søren Nielsen (soerennielsen."
                + "{0}{0}", Environment.NewLine);

            return helpMessage.ToString();
        }
Final Comments

I’ve made zero attempts to optimize the code. It doesn’t really matter how long it takes, does it? Give it anywhere from 10 minutes to a couple of hours for a huge site collection (I’ve tested with about 400 sub sites).

I recommend that you use the verbose flag and pipe the output to a file, so that you can review that it did everything correctly.

The code does not handle site content types on sub sites. I’ll probably add that fairly soon, if I need it or time permits (does it ever?)


Use the above code freely, with two conditions:

  1. Leave my name and link in there 😉
  2. Report bugs and improvements back to me

Convert “virtual” content types to “physical”

What do you do if, in a fit of madness/desperation/stupidity, you created the content types used throughout your site through the web interface, and you now want to do the “right” thing and place them in xml files packaged as a feature?

Well, this is a description of how to convert an existing “virtual” content type to that xml file, while maintaining the integrity of your existing site and content. Warning: I’m modifying the SharePoint content database directly – use at your own risk!

The basic idea:

  1. Create a content type xml file and package it in a feature (don’t deploy it yet) as you would if you started in a blank environment
  2. “Steal or copy” the content type id for the “virtual” content type from the database and use it in your xml files. In other words, the existing content type id that is used throughout your existing SharePoint database in the inheritance hierarchy will remain unchanged
  3. Modify the database so that SharePoint sees your content type as being feature based instead of “database based”
  4. Deploy your new content type feature. You can now update that content type as if you had started it out xml based to begin with

It seems fairly straightforward doesn’t it? It actually is.


Information on creating xml based content types can be found here (and in many other sources); it’s really not that hard. Your deployment will be much easier after this.

Right about now would be a good time to do a backup of your content database 😉

Step 1: Steal the Content Type ID

Your content type will need the very specific ID that SharePoint created for you when you created your content type in the first place (either through the web frontend or the API). It looks like “0x0101……” and will probably be a rather long string. You need to grab this id from the content database:

  1. Connect to the content database in question, probably named wss_content_XXXX (if you didn’t choose a database name the XXXX will be a guid)
  2. Execute the following query to find the right content type:

     select ResourceDir, ContentTypeId, Definition
     from dbo.ContentTypes
     where ResourceDir like ‘%Article Page%’

     Obviously substitute your own content type name. Note that the web interface might have appended some trailing numbers to the name, so you’ll have to do a “like” selection

  3. Copy the ContentTypeId and insert it into the xml file. You might also want to verify that the definition corresponds to that in your xml file (or just copy it over)
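As an aside, the id you just copied has a predictable structure, which is handy when sanity-checking it against your xml file. The sketch below (Python, not part of the original tool) illustrates how SharePoint nests content type ids: the built-in types append two hex digits per inheritance step, while custom types created through the UI or API append “00” plus a 32-digit GUID.

```python
# Sketch: peel the last inheritance step off a SharePoint content type id.
# A custom step is "00" + 32 hex digits (34 chars); a built-in step is 2 hex digits.

def parent_id(ct_id: str) -> str:
    """Return the parent content type id of ct_id (e.g. "0x0101" -> "0x01")."""
    body = ct_id[2:]  # drop the "0x" prefix
    # Check for the "00" + GUID form first, since it is the longer match.
    if len(body) > 34 and body[-34:-32] == "00":
        return "0x" + body[:-34]
    return "0x" + body[:-2]

# "Document" (0x0101) derives from "Item" (0x01):
assert parent_id("0x0101") == "0x01"
# A UI-created child of Document carries a "00" + GUID step:
custom = "0x0101" + "00" + "A" * 32
assert parent_id(custom) == "0x0101"
```

This is also why the id in the database is “a rather long string”: each UI-created type adds a full GUID-sized step to its parent’s id.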

Step 2: Connect the Content Type to the XML File

Now you need to go into the database and modify the ContentType table to make SharePoint see it as a feature based content type as opposed to those solely in the database.

  1. Connect to the content database again (you might just have kept the window open)
  2. Execute begin tran once, just to give you an undo option
  3. Execute the following SQL statement:

     update dbo.ContentTypes
     set Size = 16, Definition = null, IsFromFeature = 1
     where ContentTypeId = 0x010100C5…..

     It should only modify one row

  4. If the name “ResourceDir” has been mangled by the web interface, you might want to take the opportunity to fix that too now
  5. If you are satisfied with the update, execute commit tran; otherwise rollback tran. Do not forget this, as you are locking the table for the duration (btw: isn’t that a neat trick?)


I will not take any responsibility if you lose your databases, however I would like to know if you find flaws with the procedure 😉

If you have many environments, this technique only works if they have the same content type id for the same type across the farms. They will if you did a backup/restore or content deployment from one to the other. They won’t if you created them through the web on both servers. In that case you either choose which one is the master of the content, or you are out of luck.

Note that if you update/change the content type xml files at a later time, the changes will only apply to the site scoped content type, not the actual list content types that the system created for every list where the type is enabled. This is very bad news, but not to worry – I’ll post the fix for that in a few days (give me a bit of time).

If you modify the content type through the web interface after deployment it will once again be disconnected from the xml source, and you’ll have to complete Step 2 (only) to reconnect it.

Tool for Deployment of SSP search settings

I recently had the dubious honor of transferring search settings from one SSP to another. Going through every managed property, content source, search scope etc. just wasn’t something I looked forward to. On top of that, in the near future I will have to do it again when we deploy another SharePoint site to production.

Searching the net I found a tool created by Sahil Malik that could create the managed properties for me (link), provided that you manually merged some xml dumps of crawled and managed properties. Thanks Sahil for that great start – I needed something more, hence this post.

I modified Sahil’s code to suit my additional needs. It took me two full days to complete and test the code, and in the end I guess that about 30% of the code base is Sahil’s original code.

I now have a tool that can import/export content sources, crawled properties, managed properties and (shared) search scopes – and it works!

I designed the import procedures so that they create, or synchronize, the destination SSP search settings from the xml files given, but do not delete anything not in those files, i.e. the import will synchronize/create all the managed properties in the input xml file but not touch existing ones not mentioned in the file.
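The create-or-update-but-never-delete semantics amount to a one-way merge, which can be sketched as a dictionary update (illustrative Python; the names and data model are made up, not the tool’s actual API):

```python
# Sketch of the import semantics: settings from the xml file are created or
# updated in the destination SSP, settings only present in the SSP are kept.

def synchronize(existing: dict, imported: dict) -> dict:
    result = dict(existing)   # start from what the SSP already has
    result.update(imported)   # create or overwrite entries from the xml file
    return result             # nothing is ever deleted

ssp = {"Author": "text", "LegacyProp": "text"}
xml = {"Author": "note", "NewProp": "integer"}
assert synchronize(ssp, xml) == {
    "Author": "note",         # updated from the xml file
    "LegacyProp": "text",     # untouched, not mentioned in the xml file
    "NewProp": "integer",     # created from the xml file
}
```

The design choice is deliberate: an import can never destroy settings that exist only on the destination SSP.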

Ok, here are the details for the various operation types. The order listed here is the order in which they should be imported in a complete SSP import.

Content Sources

Type, name, schedules, start addresses etc. are all handled. As far as I know that is everything; I’ve not been able to test the Exchange and BDC content sources, but they should work.

If you are transferring settings between two servers you probably want to correct the search start addresses, as they are likely wrong. I’ve not tried to do anything fancy with automatic recognition of the local farm address and the like, as the risk of error is too great; I wanted to keep the focus on the SSP settings, not the various sites and their access mappings etc. Sorry, but you can’t have everything.

There is an option to start – and wait – for a full crawl after the import (“-FullCrawl”). This will allow the indices to be built and crawled properties will automatically be added for the crawled content. This is the “normal” way to create crawled properties.

Currently the program will wait a maximum of two hours for the crawl to complete; this will probably be made configurable in the future (if I need it).

Crawled Properties

It is possible to import as well as export these. I should stress that the import operation should be considered experimental.

Why would you want to import crawled properties? They are usually created by the crawler and are available for use in managed properties immediately afterwards. However, if the content in question has not yet been created (e.g. you are deploying a site to a new farm), or if you don’t want to wait for a full crawl before you create the managed properties, you might want to import them.

I’m not really using this feature myself so I don’t consider my testing to be conclusive enough.

Managed Properties

The code to import and export managed properties is originally from Sahil Malik, though considerably redesigned and bug-fixed. It is now possible to dump all managed properties from one site and import them to another – there is no need to separate the standard system managed properties from your own custom ones (you are welcome to if you want); all can be imported with no changes.

The import will fail if one of the managed properties maps to an unknown crawled property; in that case you need to either schedule a full crawl to create the crawled properties or import them too.

The “remove excess mappings” option (“-RemoveExcessMappings”) can be used to delete mappings from existing managed properties to crawled properties, when those managed properties exist in the input xml file with other mappings, i.e. using this option will ensure that the SSP managed properties are exactly the same as those in the xml file after the import.
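In set terms the flag works roughly like this (illustrative Python sketch; the property and mapping names are hypothetical, not taken from the tool):

```python
# Sketch of -RemoveExcessMappings: without the flag, imported mappings are
# merged into the existing ones; with the flag, the managed property ends up
# with exactly the mappings listed in the xml file.

def import_mappings(existing: set, imported: set, remove_excess: bool) -> set:
    if remove_excess:
        return set(imported)              # replace: xml file wins outright
    return set(existing) | set(imported)  # merge: never drop an existing mapping

current = {"ows_Author", "mail:from"}
from_xml = {"ows_Author", "People:Author"}

# Default merge keeps the pre-existing "mail:from" mapping:
assert import_mappings(current, from_xml, remove_excess=False) \
       == {"ows_Author", "mail:from", "People:Author"}
# With the flag, only the xml file's mappings survive:
assert import_mappings(current, from_xml, remove_excess=True) \
       == {"ows_Author", "People:Author"}
```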

Search Scopes

The shared search scopes (those defined in the SSP) are fully supported – settings and rules are all transferred. The import will prune the scope rules to match the import xml file.

The import will fail for scopes that use property rules if the managed properties used have not been defined or marked for use in scopes (the “allow this property to be used in scopes” switch; import of a managed property includes this setting).

The option “-StartCompilation” starts a scope compilation after the import, but does not wait for it to complete (not much point in waiting for that).

The one thing missing from the scope import is scope display groups. They are used on sites to populate the search scope dropdown (and some of my own search webparts as well) and are quite important for the end user search experience. You will have to set those yourself, as I limited the scope (sorry for the pun) of the program to the settings stored in the SSP. It should be fairly easy for a site collection administrator to enter them, however. In a similar vein, any site specific search scopes are not handled; I don’t use that feature at all, so there’s no support. Perhaps a topic for future improvement.

How to use

Usage: sspc.exe -o <operation> <target type> <parameters>

Operation = Export|Import

Target type = ContentSources|CrawledProperties|ManagedProperties|SearchScopes

Parameters = -url <ssp url> -file <input/output file name> [-FullCrawl|-RemoveExcessMappings|-StartCompilation]

Note: all arguments are case insensitive.

This post is quite long enough as is so if you want to see the exact xml format needed download the code and run the export.

Sample Export

SSPC.exe -o export ContentSources -url http://moss:7000/ssp/admin -file output_contentsources.xml

SSPC.exe -o export CrawlProperties -url http://moss:7000/ssp/admin -file output_crawlproperties.xml

SSPC.exe -o export ManagedProperties -url http://moss:7000/ssp/admin -file output_managedproperties.xml

SSPC.exe -o export SearchScopes -url http://moss:7000/ssp/admin -file output_searchscopes.xml

I created a batch file for a full export (excluding crawled properties):

“Export SSP settings.bat” http://moss:7000/ssp/admin

which will create the output files “output_contentsources.xml”, “output_managedproperties.xml” and “output_searchscopes.xml”.

Sample Import

SSPC.exe -o import ContentSources -fullcrawl -url http://moss:7002/ssp/admin -file input_contentsources.xml

SSPC.exe -o import CrawlProperties -url http://moss:7002/ssp/admin -file input_crawlproperties.xml

SSPC.exe -o import ManagedProperties -removeexcessmappings -url http://moss:7002/ssp/admin -file input_managedproperties.xml

SSPC.exe -o import SearchScopes -startcompilation -url http://moss:7002/ssp/admin -file input_searchscopes.xml

The corresponding batch import file:

“Import SSP settings.bat” http://moss:7002/ssp/admin

which assumes the presence of input files “output_contentsources.xml”, “output_managedproperties.xml” and “output_searchscopes.xml” generated above.

Code Design Notes

Sahil Malik named the program SSPC (supposedly short for “Shared Services Provider Property Creation”) and the corresponding project name on the codeplex site is SSSPPC (“Sharepoint Shared Services Search Provider Property Creation”). It’s a mess, and now that I’ve expanded the scope of the program considerably, the name is even more misleading.

Just to avoid further confusion I’ve refrained from renaming the program.

Sahil Malik spent some time doing a proper code design for the initial version. I personally think that he did go a bit over the top (sorry Sahil), but I’ve nevertheless retained most of the basic design.

He split up the code into a number of layers (we all love that) where each layer is a separate class-library project. I kept that design, and therefore the download will contain a number of dll files as well as the actual exe file. Just keep them all in the same directory and all should be well.

Some comments:

  • I did not change the naming of the existing projects (i.e. they are all named “Winsmarts.*” though I did change a lot of the code) but the ones I added are named “Carlsberg.*”
  • I redesigned/recoded the managed property import section, as I simply hate duplicated code, and deleted the duplicated BO classes that were present in the old “MAMC project” (now moved to “Winsmarts.SSPC.ManagedProperies”).
  • The import code is now always present in the same project that performs the export.
  • The managed property import/export is now complete in the sense that it can export and import everything, including the system properties. There is no need to sort through it all and find the ones you are responsible for (though it might still be a good idea to sift through and ensure that old test data is removed)
  • I renamed a number of the classes as some of the BO objects were named as their SharePoint counterparts and the code was quite a bit harder to read than it needed to be.
  • Version number of all (sub) projects has been changed to
  • Error handling is still pretty basic so you’ll get an exception with a stack trace in the console if anything is amiss


My code changes have now been merged into the main code base at the codeplex site. These changes break everything in the original code, so you will need to update xml and script files…

Future Improvements

This is the list of future improvements I’ve noted that might be added if I find the time and need for it.

  • [Updated: Done] The code could be cleaned up somewhat (there shouldn’t be any todo’s in released code)
  • Perhaps site scopes should be added
  • Scope display groups might be added (requires some connection from SSP to the sites)
  • It might make sense to add these commands to the list of operations supported by stsadm, which is fairly easy to do (see Andrew Connell’s excellent post for a sample)
  • [Updated: Done] I’m not too fond of the serialization classes – basically the same piece of code is copied four times with minimal changes. I always consider duplicated code a bug



The code has now been merged with the existing code base at codeplex, so head over there for the latest download.



Sahil Maliks original post

The current Codeplex site

A couple of useful MS articles: Creating Content Sources to Crawl Business Data in SharePoint Server 2007 Enterprise Search and Creating and Exposing Search Scopes in SharePoint Server 2007 Enterprise Search

Fixing those pesky DCOM event log error 10016 in a SharePoint farm environment

I’m responsible for a couple of SharePoint 2007 (MOSS) farms where all SharePoint servers showed the following error in the system event log:

Event Type: Error
Event Source: DCOM
Event Category: None
Event ID: 10016
Date: 1/17/2007
Time: 4:31:48 AM
User: <DOMAIN>\sa_adm
Computer: <SERVER>

The application-specific permission settings do not grant Local Activation permission for the COM Server application with CLSID


to the user <DOMAIN>\sa_adm SID (S-1-5-21-162740987-2502514208-3469184634-1119). This security permission can be modified using the Component Services administrative tool.

For more information, see Help and Support Center at

The error would show up at regular intervals in clusters (4-12 at roughly the same time), and there would be a few more with other usernames and other class ids. I had two fully functional farms with 3 SharePoint servers each, and a number of standalone development machines. They all exhibited similar behavior.

The error listed above means that the user running the Central Administration web application doesn’t have permission to activate (instantiate) the IIS WAMREG admin Service object (search the registry for the CLSID).

Strangely enough I didn’t observe any functional errors in the farms as a result of these errors – nothing seemed amiss (plenty of stuff didn’t work, but none of it directly related to this).

An important note here is that the service users used in the farm are all standard domain accounts and only given additional local rights by the SharePoint installer and Central Administration (The one exception is that “aspnet_regiis -ga IIS_WPG” was executed after SharePoint install and initial configuration).

The following procedure removes the errors from the event log without compromising the entire security setup (yes, assigning administrative rights to the service users would do the trick too) and has been verified by Microsoft consulting services.

On each SharePoint server do the following:

  1. Click Start, click Run, type “dcomcnfg” and click ok
  2. Expand Component Services / Computers / My Computer / DCOM Config
  3. Right click IIS WAMREG admin Service and choose Properties
  4. Click the Security tab
  5. Click Edit under Launch and Activation Permissions
  6. Click Add
  7. In the Select Users, Computers or Groups dialog, type computername\WSS_WPG
  8. Click ok
  9. In the Permissions for UserName list, click to select the Allow check box
  10. Click Ok twice
  11. Go back to the main Component Services window, right click the “netman” node and select Properties
  12. Click the Security tab
  13. Click Edit under Activation Permissions
  14. Click Add in the Launch Permissions dialog
  15. Enter “NETWORK SERVICE” in the edit box
  16. Click Ok
  17. Enable all the checkboxes for the NETWORK SERVICE
  18. Click Ok twice
  19. Finally, run “IISReset”

That should be it!

A few less event log errors to worry about – there are plenty left on a reasonably complex SharePoint farm…

As a side note: the above error shows up for other applications as well – I’ve heard about it for Exchange servers, and more applications are probably affected. In that case you’ll need to search the registry for the actual DCOM application and assign the rights to another local group (or a username as a last resort).