Propagate Site Content Types to List Content Types


[Aka: Make Content Type Inheritance Work for XML Based Content types]

[Note: Updated Nov 6]

If you are working with XML based content types, you’ll sooner or later fall into a trap well hidden by Microsoft. Simply put, content type inheritance doesn’t work for XML based content types. SharePoint does not check which fields you’ve added, deleted or changed since you last deployed your content type feature, so you don’t have the “luxury” (it bloody well should work!) of choosing whether or not to propagate changes down to inherited content types, as you do in the web interface or the object model.

I found Microsoft mentioning it briefly here.

The Role of List Content Types

I’ll take a small detour here to explain the concept of list content types. Whenever you assign a content type to a list (e.g. the Pages document library of a publishing web), a new list content type with the same name is created, inheriting from the site content type you are assigning to the list.

Actually the inheritance copies the site content type definition to the list content type in its entirety and assigns an “inherited” content type id to the new list content type.

If you later modify the site content type, through the web or object model, you have the option of propagating those changes to inherited content types, which in particular includes all the list content types on the site.
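For reference, here is roughly what that propagation looks like through the object model. This is only a sketch – the content type name and the site column "MyField" are hypothetical placeholders – but it shows the Update(true) call that pushes changes down to the inheriting list content types:

using System;
using Microsoft.SharePoint;

class PropagateExample
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://localhost"))
        {
            SPWeb rootWeb = site.RootWeb;

            //Grab the site content type and add a link to an existing site column
            SPContentType ct = rootWeb.ContentTypes["My Article Page"];
            ct.FieldLinks.Add(new SPFieldLink(rootWeb.Fields["MyField"]));

            //Passing true propagates the change to all inheriting (list) content types.
            //Note: doing this to an xml based content type disconnects it from its xml definition.
            ct.Update(true);
        }
    }
}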

Don’t Disconnect the Content Type

Another small detour before I present what to do about this mess. It should be stressed that any modification of the site content type through the web interface or the object model will disconnect the site content type from the underlying xml metadata file. If that happens you may reestablish the link as described here. This is very bad news. It not only means that you should be careful about what you do to your content types, it also means that there is no simple way to propagate the changes in code – we can’t just update the site content type (re-add fields) and have the changes propagate through the object model. 😦

The Code to Propagate

Finally I’m ready to present the code I used to propagate changes to my content types. 🙂

The procedure is:

  1. Locate the site content type in question (run it multiple times if needed)
  2. Start at the root web site
    a. Go through all lists on the web
    b. If the list is associated with the site content type (actually the inherited list content type):
      i. Compare every field on the site content type with the list content type
      ii. Add, remove or change the field in question on the list content type
    c. Go recursively through every subweb and continue from step a

I created the following code as a new extension to stsadm, called like this:

stsadm -o cbPropagateContentType -url <site collection url> -contenttype <contenttype name> [-verbose] [-removefields] [-updatefields]

For instance:

stsadm -o cbPropagateContentType -url http://localhost -contenttype "My Article Page" -verbose

The “removefields” switch specifies whether fields found in the list content type that are not in the site content type should be removed. The default is not to remove them. New fields in the site content type will always be added to the list content types.

Note: I’ve not tried to create an “update” option for fields. I’ve not had a use for it yet, and it would also require considerably more testing to ensure that it works correctly. You really don’t want to break your list content types in case of errors…

Updated (Nov 6): An update option has now been added; you should consider it beta.

Note 2: If the job stops prematurely no harm is done. Restart it and it will continue from where it stopped – it will examine and skip the webs/lists that it already has processed.

Finally something tangible:

Save the following as “stsadmcommands.CBPropagateContentType.xml” into the “\Config” folder in the root of the SharePoint install folder (remember to update the assembly reference to whatever you compile the code into):

<?xml version="1.0" encoding="utf-8" ?>
<commands>
  <command name="cbpropagatecontenttype"
          class="Carlsberg.SharePoint.Administration.STSAdm.CBPropagateContentType,
          Carlsberg.SharePoint.Administration, Version=1.0.0.0, Culture=neutral,  
          PublicKeyToken=55c69d084ac6678f"/>
</commands>
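
If you’re unsure what public key token your assembly ends up with, the strong name tool from the .NET SDK will print it (the file name below is just a placeholder for whatever you compile the code into):

sn.exe -T Carlsberg.SharePoint.Administration.dll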

And finally the code you need to compile to an assembly that the xml file should specify:

Updated (Nov 6): The code has been updated a bit. Some mistakes with display name/internal name have been fixed and the update option has been added. I’m not yet satisfied with the testing of the update method, so consider it beta.

using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Text;
using Microsoft.SharePoint;
using Microsoft.SharePoint.StsAdmin;

namespace Carlsberg.SharePoint.Administration.STSAdm
{
    /// <summary>
    /// A custom STSADM command for propagating site content types to list
    /// content types.
    /// 
    /// The code is provided as is, I don't take any responsibility for 
    /// any errors or data loss you might encounter.
    /// 
    /// Use freely with two conditions:
    /// 1. Keep my name in there
    /// 2. Report any bugs back to https://soerennielsen.wordpress.com
    /// 
    /// Enjoy
    /// Søren L. Nielsen
    /// </summary>
    class CBPropagateContentType : ISPStsadmCommand
    {
        #region Input parameters
        private string providedUrl;
        private string contentTypeName;
        private bool removeFields = false;
        private bool verbose = false;
        private bool updateFields = false;

        private bool UpdateFields
        {
            get { return updateFields; }
            set { updateFields = value; }
        }

        private bool Verbose {
            get { return verbose; }
            set { verbose = value; }
        }

        private bool RemoveFields {
            get { return removeFields; }
            set { removeFields = value; }
        }

        private string ContentTypeName {
            get { return contentTypeName; }
            set { contentTypeName = value; }
        }

        private string ProvidedUrl {
            get { return providedUrl; }
            set { providedUrl = value; }
        }
        #endregion

        /// <summary>
        /// Runs the specified command. Called by STSADM.
        /// </summary>
        /// <param name="command">The command.</param>
        /// <param name="keyValues">The key values.</param>
        /// <param name="output">The output.</param>
        /// <returns>0 on success, -1 on error.</returns>
        public int Run(string command, StringDictionary keyValues, 
                       out string output) {
            //Parse input                        
            // make sure all settings are valid
            if (!GetSettings(keyValues)) {
                Console.Out.WriteLine(GetHelpMessage(string.Empty));
                output = "Required parameters not supplied or invalid.";
                return -1;
            }

            SPSite siteCollection = null;
            SPWeb rootWeb = null;

            try {
                // get the site collection specified
                siteCollection = new SPSite(ProvidedUrl);
                rootWeb = siteCollection.RootWeb;

                //Get the source site content type
                SPContentType sourceCT = 
                             rootWeb.AvailableContentTypes[ContentTypeName];
                if (sourceCT == null) {
                    throw new ArgumentException("Unable to find " 
                        + "contenttype named \"" + ContentTypeName + "\"");
                }

                // process the root website
                ProcessWeb(rootWeb, sourceCT);

                output = "Operation successfully completed.";
                Log( output, false );
                return 0;
            }
            catch (Exception ex) {
                output = "Unhandled error occured: " + ex.Message;
                Log(output, false);
                return -1;
            }
            finally {
                if (rootWeb != null) {
                    rootWeb.Dispose();
                }
                if (siteCollection != null) {
                    siteCollection.Dispose();
                }
            }
        }

        /// <summary>
        /// Go through a web, all lists and sync with the source content 
        /// type.
        /// Go recursively through all sub webs.
        /// </summary>
        /// <param name="web">The web to process.</param>
        /// <param name="sourceCT">The site content type to propagate.</param>
        private void ProcessWeb(SPWeb web, SPContentType sourceCT) {
            //Do work on lists on this web
            Log("Processing web: " + web.Url);

            //Grab the lists first, to avoid messing up an enumeration 
            // while looping through it.
            List<Guid> lists = new List<Guid>();
            foreach (SPList list in web.Lists) {
                lists.Add(list.ID);
            }

            foreach (Guid listId in lists) {
                SPList list = web.Lists[listId];

                if (list.ContentTypesEnabled) {
                    Log("Processing list: " + list.ParentWebUrl + "/" 
                         + list.Title);

                    SPContentType listCT = 
                                         list.ContentTypes[ContentTypeName];
                    if (listCT != null) {
                        Log("Processing content type on list:" + list);

                        if (UpdateFields) {
                          UpdateListFields(list, listCT, sourceCT);
                        }

                        //Find/add the fields to add
                        foreach (SPFieldLink sourceFieldLink in 
                                               sourceCT.FieldLinks) {
                          if (!FieldExist(sourceCT, sourceFieldLink)) {
                            Log(
                              "Failed to add field " 
                              + sourceFieldLink.DisplayName + " on list " 
                              + list.ParentWebUrl + "/" + list.Title 
                              + " field does not exist (in .Fields[]) on " 
                              + "source content type", false);
                          }
                          else {
                            if (!FieldExist(listCT, sourceFieldLink)) {
                              //Perform double update, just to be safe 
                              // (but slow)
                              Log("Adding field \"" 
                                 + sourceFieldLink.DisplayName 
                                 + "\" to contenttype on " 
                                 + list.ParentWebUrl + "/" + list.Title, 
                                   false);
                              if (listCT.FieldLinks[sourceFieldLink.Id] 
                                                                != null) {
                                listCT.FieldLinks.Delete(sourceFieldLink.Id);
                                listCT.Update();
                              }
                              listCT.FieldLinks.Add(new SPFieldLink(
                                      sourceCT.Fields[sourceFieldLink.Id]));
                              listCT.Update();
                            }
                          }
                        }


                      if (RemoveFields) {
                            //Find the fields to delete
                            //WARNING: this part of the code has not been 
                            // adequately tested (though
                            // what could go wrong? ;-) ... )

                            //Copy collection to avoid modifying enumeration
                            // as we go through it
                            List<SPFieldLink> listFieldLinks = 
                                                  new List<SPFieldLink>();
                            foreach (SPFieldLink listFieldLink in 
                                                     listCT.FieldLinks) {
                                listFieldLinks.Add(listFieldLink);
                            }

                            foreach (SPFieldLink listFieldLink in 
                                                        listFieldLinks) {
                                if (!FieldExist(sourceCT, listFieldLink)) {
                                    Log("Removing field \"" 
                                       + listFieldLink.DisplayName 
                                       + "\" from contenttype on :" 
                                       + list.ParentWebUrl + "/" 
                                       + list.Title, false);
                                    listCT.FieldLinks.Delete(
                                                        listFieldLink.Id);
                                    listCT.Update();
                                }
                            }
                        }
                    }
                }
            }


            //Process sub webs
            foreach (SPWeb subWeb in web.Webs) {
                ProcessWeb(subWeb, sourceCT);
                subWeb.Dispose();
            }
        }


      /// <summary>
      /// Updates the fields of the list content type (listCT) with the 
      /// fields found on the source content type (sourceCT).
      /// </summary>
      /// <param name="list">The list that holds the content type.</param>
      /// <param name="listCT">The list content type to update.</param>
      /// <param name="sourceCT">The site content type to copy fields from.</param>
      private void UpdateListFields(SPList list, SPContentType listCT, 
                                    SPContentType sourceCT) {
        Log("Starting to update fields ", false);
        foreach (SPFieldLink sourceFieldLink in sourceCT.FieldLinks) {
          //has the field changed? If not, continue.
          if (listCT.FieldLinks[sourceFieldLink.Id]!= null 
               && listCT.FieldLinks[sourceFieldLink.Id].SchemaXml 
                  == sourceFieldLink.SchemaXml) {
            Log("Doing nothing to field \"" + sourceFieldLink.Name 
                + "\" from contenttype on :" + list.ParentWebUrl + "/" 
                + list.Title, false);
            continue;
          }
          if (!FieldExist(sourceCT, sourceFieldLink)) {
            Log(
              "Doing nothing to field: " + sourceFieldLink.DisplayName 
               + " on list " + list.ParentWebUrl 
               + "/" + list.Title + " field does not exist (in .Fields[])"
               + " on source content type", false);
            continue;
                              
          }

          if (listCT.FieldLinks[sourceFieldLink.Id] != null) {

            Log("Deleting field \"" + sourceFieldLink.Name 
                + "\" from contenttype on :" + list.ParentWebUrl + "/" 
                + list.Title, false);

            listCT.FieldLinks.Delete(sourceFieldLink.Id);
            listCT.Update();
          }

          Log("Adding field \"" + sourceFieldLink.Name 
              + "\" from contenttype on :" + list.ParentWebUrl 
              + "/" + list.Title, false);

          listCT.FieldLinks.Add(new SPFieldLink(
                                     sourceCT.Fields[sourceFieldLink.Id]));
          //Set displayname, not set by previous operation
          listCT.FieldLinks[sourceFieldLink.Id].DisplayName 
                      = sourceCT.FieldLinks[sourceFieldLink.Id].DisplayName;
          listCT.Update();
          Log("Done updating fields ");
        }
      }

      private static bool FieldExist(SPContentType contentType, 
                                                     SPFieldLink fieldLink)
        {
            try
            {
                //will throw exception on missing fields
                return contentType.Fields[fieldLink.Id] != null;
            }
            catch(Exception)
            {
                return false;
            }
        }

        private void Log(string str, bool verboseLevel) {
            if (Verbose || !verboseLevel) {
                Console.WriteLine(str);
            }
        }

        private void Log(string str) {
            Log(str, true);
        }


        /// <summary>
        /// Parse the input settings
        /// </summary>
        /// <param name="keyValues">The key values.</param>
        /// <returns>True if all required settings are present and valid.</returns>
        private bool GetSettings(StringDictionary keyValues) {
            try {
                ProvidedUrl = keyValues["url"];
                //test the url
                new Uri(ProvidedUrl);

                ContentTypeName = keyValues["contenttype"];
                if (string.IsNullOrEmpty(ContentTypeName)) {
                    throw new ArgumentException("contenttype missing");
                }

                if (keyValues.ContainsKey("removefields")) {
                    RemoveFields = true;
                }

                if (keyValues.ContainsKey("verbose")) {
                    Verbose = true;
                }

                if (keyValues.ContainsKey("updatefields"))
                {
                    UpdateFields = true;
                }
                return true;
            }
            catch (Exception ex) {
                Console.Out.WriteLine("An error occurred retrieving the"
                    + " parameters. \r\n(" + ex + ")\r\n");
                return false;
            }
        }

        /// <summary>
        /// Output help to console
        /// </summary>
        /// <param name="command">The command.</param>
        /// <returns>The help message.</returns>
        public string GetHelpMessage(string command) {
            StringBuilder helpMessage = new StringBuilder();

            // syntax
            helpMessage.AppendFormat("\tstsadm -o {0}{1}{1}", command, 
                                                 Environment.NewLine);
            helpMessage.Append("\t-url <site collection url>" + Environment.NewLine);
            helpMessage.Append("\t-contenttype <contenttype name>" + Environment.NewLine);
            helpMessage.Append("\t[-removefields]" + Environment.NewLine);
            helpMessage.Append("\t[-updatefields]" + Environment.NewLine);
            helpMessage.Append("\t[-verbose]" + Environment.NewLine);

            // description
            helpMessage.AppendFormat("{0}This action will propagate a site"
                + " content type to all list content types within the "
                + "site collection.{0}Information propagated is field "
                + "addition/removal.{0}{0}", Environment.NewLine);
            helpMessage.AppendFormat("{0}Søren Nielsen (soerennielsen." 
                + "wordpress.com){0}{0}", Environment.NewLine);

            return helpMessage.ToString();
        }
    }
}

Final Comments

I’ve made zero attempts to optimize the code. It doesn’t really matter how long it takes, does it? Give it anywhere from 10 minutes to a couple of hours for a huge site collection (I’ve tested with about 400 sub sites).

I recommend that you use the verbose flag and pipe the output to a file, so that you can review that it did everything correctly.

The code does not handle site content types defined on sub sites. I’ll probably add that fairly soon if I need it or time permits (does it ever?)

License

Use the above code freely, with two conditions:

  1. Leave my name and link in there 😉
  2. Report bugs and improvements back to me

Convert “virtual” content types to “physical”


What do you do if, in a fit of madness/desperation/stupidity, you created the content types used throughout your site through the web interface, and you now want to do the “right” thing and place them in xml files packaged as a feature?

Well, this is a description of how to convert the existing “virtual” content type to that xml file, while maintaining the integrity of your existing site and content. Warning: I’m modifying the SharePoint content database directly – use at your own risk!

The basic idea:

  1. Create a content type xml file and package it in a feature (don’t deploy it yet), just as you would if you started in a blank environment
  2. “Steal” or copy the content type id for the “virtual” content type from the database and use it in your xml files. In other words, the existing content type id that is used throughout your existing SharePoint database in the inheritance hierarchy will remain unchanged
  3. Modify the database so that SharePoint sees your content type as being feature based instead of “database based”
  4. Deploy your new content type feature. You can now update that content type as if it had been xml based to begin with

It seems fairly straightforward, doesn’t it? It actually is.

Howto

Information on creating xml based content types can be found here (and in many other sources); it’s really not that hard. Your deployment will be much easier after this.

Right about now would be a good time to do a backup of your content database 😉

Step 1: Steal the Content Type ID

Your content type will need the very specific ID that SharePoint created for you when you created your new content type in the first place (either through the web frontend or the API). It looks like “0x0101……” and will probably be a rather long string. You need to grab this id from the content database:

  1. Connect to the content database in question, probably named wss_content_XXXX (if you didn’t choose a database name the XXXX will be a guid)
  2. Execute the following query to find the right content type:

     select ResourceDir, ContentTypeId, Definition
     from dbo.ContentTypes
     where ResourceDir like '%Article Page%'

     Obviously substitute your own content type name; note that the web interface might have appended some trailing numbers to the name, so you’ll have to do a “like” selection

  3. Copy the ContentTypeId and insert it into the ID attribute of your content type xml file (see the sketch below). You might also want to verify that the definition corresponds to the one in your xml file (or just copy it over)
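
For illustration, the stolen id then goes into the ID attribute of the ContentType element in your feature’s element manifest. A minimal sketch – the name, group and field reference are hypothetical placeholders, only the ID matters here:

<ContentType ID="0x010100C5....."
             Name="My Article Page"
             Group="My Content Types">
  <FieldRefs>
    <FieldRef ID="{guid-of-a-site-column}" Name="MyField" />
  </FieldRefs>
</ContentType>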

Step 2: Connect the Content Type to the XML File

Now you need to go into the database and modify the ContentTypes table to make SharePoint see it as a feature based content type, as opposed to those stored solely in the database.

  1. Connect to the content database again (you might just have kept the window open)
  2. Execute "begin tran" once, just to give you an undo option
  3. Execute the following SQL statement:

     update dbo.ContentTypes
     set Size = 16, Definition = null, IsFromFeature = 1
     where ContentTypeId = 0x010100C5…..

     It should only modify one row

  4. If the name in “ResourceDir” has been mangled by the web interface, you might want to take the opportunity to fix that too now
  5. If you are satisfied with the update, execute "commit tran", otherwise "rollback tran". Do not forget this, as you are locking the table for the duration (btw: isn’t that a neat trick?)

Caveats

I will not take any responsibility if you lose your databases, however I would like to know if you find flaws with the procedure 😉

If you have many environments this technique only works if they have the same content type id for the same type across the farms. They will if you did a backup/restore or content deployment from one to the other. They won’t if you created the content types through the web interface on both servers. In that case you either choose which one is the master of the content or you are out of luck.

Note that if you update/change the content type xml files at a later time, the changes will only apply to the site scoped content type, not the actual list content types that the system created for every list where the type is enabled. This is very bad news, but not to worry – I’ll post the fix for that in a few days (give me a bit of time).

If you modify the content type through the web interface after deployment it will once again be disconnected from the xml source, and you’ll have to complete Step 2 (only) to reconnect it.

Tool for Deployment of SSP search settings


I recently had the dubious honor to transfer search settings from one SSP to another. Going through every managed property, content source, search scope etc. just wasn’t something I looked forward to. On top of that – in the near future I will have to do it again when we deploy another SharePoint site to production.

Searching the net I found a tool created by Sahil Malik that could create the managed properties for me (link), provided that you manually merged some xml dumps of crawled and managed properties. Thanks Sahil for that great start – I needed something more, hence this post.

I modified Sahil’s code to suit my additional needs. It took me two full days to complete and test the code, and in the end I guess that about 30% of the code base is Sahil’s original code.

I now have a tool that can import/export content sources, crawled properties, managed properties and (shared) search scopes – and it works!

I designed the import procedures so that they create, or synchronize, the destination SSP search settings with the xml files given, but do not delete anything not in those files, i.e. the import will synchronize/create all the managed properties in the input xml file but not touch existing ones not mentioned in the input file.

Ok, here are the details for the various operation types. The order listed here is the order in which they should be imported in a complete SSP import.

Content Sources

Type, name, schedules, start addresses etc. are all handled. As far as I know that is everything. I’ve not been able to test the Exchange and BDC content sources, but they should work.

If you are transferring settings between two servers you probably want to correct the search start addresses, as they are likely wrong. I’ve not tried to do anything fancy with automatic recognition of the local farm address and the like, as the risk of error is too great; I wanted to keep the focus on the SSP settings, not the various sites and their access mappings etc. Sorry for that – you can’t have everything.

There is an option to start – and wait – for a full crawl after the import (“-FullCrawl”). This will allow the indices to be built and crawled properties will automatically be added for the crawled content. This is the “normal” way to create crawled properties.

Currently the program will wait a maximum of two hours for the crawl to complete; it will probably be made configurable in the future (if I need it).
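
For the curious, this is roughly what the content source creation and the “-FullCrawl” handling amount to in the search administration object model. A sketch only – the url and content source name are placeholders, and the real tool does a bit more bookkeeping:

using System;
using System.Threading;
using Microsoft.SharePoint;
using Microsoft.Office.Server.Search.Administration;

class FullCrawlExample
{
    static void Main()
    {
        //The url and content source name are placeholders
        using (SPSite site = new SPSite("http://moss:7000"))
        {
            SearchContext context = SearchContext.GetContext(site);
            Content content = new Content(context);

            //Create a web content source with a single start address
            WebContentSource cs = (WebContentSource)content.ContentSources.Create(
                typeof(WebContentSource), "My Portal");
            cs.StartAddresses.Add(new Uri("http://moss:7000"));
            cs.Update();

            //Kick off a full crawl and poll until the source goes idle again
            //(or two hours have passed)
            cs.StartFullCrawl();
            DateTime timeout = DateTime.Now.AddHours(2);
            while (cs.CrawlStatus != CrawlStatus.Idle && DateTime.Now < timeout)
            {
                Thread.Sleep(30 * 1000);
            }
        }
    }
}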

Crawled Properties

It is possible to import as well as export these. I should stress that the import operation should be considered experimental.

Why would you want to import crawled properties? They are usually created by the crawler and are available for use in managed properties immediately afterwards. However, if the content in question has not yet been created (e.g. you are deploying a site to a new farm), or if you don’t want to wait for a full crawl before you create the managed properties, you might want to import them.

I’m not really using this feature myself so I don’t consider my testing to be conclusive enough.

Managed Properties

The code to import and export managed properties is originally from Sahil Malik, though considerably redesigned and bug fixed. It is now possible to dump all managed properties from one site and import them to another – there is no need to extract the standard system managed properties from your own custom ones (you are welcome to if you want); all can be imported with no changes.

The import will fail if one of the managed properties maps to an unknown crawled property; in that case you need to either schedule a full crawl to create the crawled properties or import them too.

The “remove excess mappings” option (“-RemoveExcessMappings”) can be used to delete mappings from existing managed properties to crawled properties when those managed properties exist in the input xml file with other mappings, i.e. using this option will ensure that the SSP managed properties are exactly the same as those in the xml file after the import.
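
To give an idea of what the managed property import does per property, here is a rough sketch using the search administration object model. The url, property names, property set guid and variant type are placeholders, not something the tool prescribes:

using System;
using Microsoft.SharePoint;
using Microsoft.Office.Server.Search.Administration;

class ManagedPropertyExample
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://moss:7000"))
        {
            Schema schema = new Schema(SearchContext.GetContext(site));

            //Create the managed property
            ManagedProperty managed = schema.AllManagedProperties.Create(
                "MyManagedProp", ManagedDataType.Text);

            //Map a crawled property to it (31 = VT_LPWSTR)
            MappingCollection mappings = managed.GetMappings();
            mappings.Add(new Mapping(
                new Guid("00000000-0000-0000-0000-000000000000"),
                "ows_MyCrawledProp", 31, managed.PID));
            managed.SetMappings(mappings);
            managed.Update();
        }
    }
}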

Search Scopes

The shared search scopes (those defined in the SSP) are fully supported – settings and rules are all transferred. The import will prune the scope rules to match the import xml file.

The import will fail for scopes that use property rules if the managed properties used have not been defined or marked for use in scopes (the “allow this property to be used in scopes” switch; import of the managed property includes this setting).

The option “-StartCompilation” starts a scope compilation after the import but does not wait for completion (not much point in waiting for that).

The one thing missing from the scope import is scope display groups. They are used on sites to populate the search scope dropdown (and some of my own search web parts as well) and are quite important for the end user search experience. You will have to set those yourself, as I limited the scope (sorry for the pun) of the program to the settings stored in the SSP. It should be fairly easy for a site collection administrator to enter them, however. In a similar vein, any site specific search scopes are not handled. I don’t use that feature at all, so there’s no support. Perhaps a topic for future improvement.
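
For reference, creating a shared scope through the object model looks roughly like this – just a sketch with placeholder url, name and description, mirroring what the scope import does per scope:

using System;
using Microsoft.SharePoint;
using Microsoft.Office.Server.Search.Administration;

class ScopeExample
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://moss:7000"))
        {
            RemoteScopes remoteScopes = new RemoteScopes(SearchContext.GetContext(site));

            //Create a shared scope with a simple "all content" rule
            Scope scope = remoteScopes.AllScopes.Create("My Scope", "Just an example",
                null, true, "results.aspx", ScopeCompilationType.AlwaysCompile);
            scope.Rules.CreateAllContentRule();

            //Kick off a scope compilation (no point in waiting for it)
            remoteScopes.StartCompilation();
        }
    }
}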

How to use

Usage: sspc.exe -o <operation> <target type> <parameters>

Operation = Export|Import

Target type = ContentSources|CrawledProperties|ManagedProperties|SearchScopes

Parameters = -url <ssp url> -file <input/output file name> [-FullCrawl|-RemoveExcessMappings|-StartCompilation]

Note all arguments are case insensitive.

This post is quite long enough as is, so if you want to see the exact xml format needed, download the code and run an export.

Sample Export

SSPC.exe -o export ContentSources -url http://moss:7000/ssp/admin -file output_contentsources.xml

SSPC.exe -o export CrawlProperties -url http://moss:7000/ssp/admin -file output_crawlproperties.xml

SSPC.exe -o export ManagedProperties -url http://moss:7000/ssp/admin -file output_managedproperties.xml

SSPC.exe -o export SearchScopes -url http://moss:7000/ssp/admin -file output_searchscopes.xml

I created a batch file for a full export (excluding crawled properties):

“Export SSP settings.bat” http://moss:7000/ssp/admin

which will create the output files “output_contentsources.xml”, “output_managedproperties.xml” and “output_searchscopes.xml”.

Sample Import

SSPC.exe -o import ContentSources -fullcrawl -url http://moss:7002/ssp/admin -file input_contentsources.xml

SSPC.exe -o import CrawlProperties -url http://moss:7002/ssp/admin -file input_crawlproperties.xml

SSPC.exe -o import ManagedProperties -removeexcessmappings -url http://moss:7002/ssp/admin -file input_managedproperties.xml

SSPC.exe -o import SearchScopes -startcompilation -url http://moss:7002/ssp/admin -file input_searchscopes.xml

The corresponding batch import file:

“Import SSP settings.bat” http://moss:7002/ssp/admin

which assumes the presence of input files “output_contentsources.xml”, “output_managedproperties.xml” and “output_searchscopes.xml” generated above.

Code Design Notes

Sahil Malik named the program SSPC (supposedly short for “Shared Services Provider Property Creation”) and the corresponding project name on the codeplex site is SSSPPC (“Sharepoint Shared Services Search Provider Property Creation”). It’s a mess, and now that I’ve expanded the scope of the program considerably, the name is even more misleading.

Just to avoid further confusion I’ve refrained from renaming the program.

Sahil Malik spent some time doing a proper code design for the initial version. I personally think that he did go a bit over the top (sorry Sahil), but I’ve nevertheless retained most of the basic design.

He split the code into a number of layers (we all love that), where each layer is a separate class-library project. I kept that design, and therefore the download will contain a number of dll files as well as the actual exe file. Just keep them all in the same directory and all should be well.

Some comments:

  • I did not change the naming of the existing projects (i.e. they are all named “Winsmarts.*” though I did change a lot of the code) but the ones I added are named “Carlsberg.*”
  • I redesigned/recoded the managed property import section as I simply hate duplicated code and deleted the duplicated BO classes that were present in the old “MAMC project” (now moved to “Winsmarts.SSPC.ManagedProperies”).
  • The import code is now always present in the same project that performs the export.
  • The managed property import/export is now complete in the sense that it can now export and import everything including the system properties. No need to sort through it all and find the ones you are responsible for (though it might still be a good idea to sift through and ensure that old test data are removed)
  • I renamed a number of the classes as some of the BO objects were named as their SharePoint counterparts and the code was quite a bit harder to read than it needed to be.
  • Version number of all (sub) projects has been changed to 1.1.0.0.
  • Error handling is still pretty basic so you’ll get an exception with a stack trace in the console if anything is amiss

[Updated]

My code changes have now been merged into the main code base at the codeplex site. These changes break everything in the original code, so you will need to update xml and script files…

Future Improvements

This is the list of future improvements I’ve noted that might be added if I find the time and need for it.

  • [Updated: Done] The code could be cleaned up somewhat (there shouldn’t be any todo’s in released code)
  • Perhaps site scopes should be added
  • Scope display groups might be added (requires some connection from SSP to the sites)
  • It might make sense to add these commands to the list of operations supported by stsadm, which is fairly easy to do (see Andrew Connell’s excellent post for a sample)
  • [Updated: Done] I’m not too fond of the serialization classes – basically the same piece of code is copied four times with minimal changes. I always consider duplicated code as a bug

Downloads

[Updated]

The code has now been merged with the existing code base at codeplex, so head over there for the latest download.

Codeplex/SSSPPC

References

Sahil Maliks original post

The current Codeplex site

A couple of useful MS articles: Creating Content Sources to Crawl Business Data in SharePoint Server 2007 Enterprise Search and Creating and Exposing Search Scopes in SharePoint Server 2007 Enterprise Search

Fixing those pesky DCOM event log error 10016 in a SharePoint farm environment


I’m responsible for a couple of SharePoint 2007 (MOSS) farms where all SharePoint servers showed the following error in the system event log:


Event Type: Error
Event Source: DCOM
Event Category: None
Event ID: 10016
Date: 1/17/2007
Time: 4:31:48 AM
User: <DOMAIN>\sa_adm
Computer: <SERVER>
Description:

The application-specific permission settings do not grant Local Activation permission for the COM Server application with CLSID

{61738644-F196-11D0-9953-00C04FD919C1}

to the user <DOMAIN>\sa_adm SID (S-1-5-21-162740987-2502514208-3469184634-1119). This security permission can be modified using the Component Services administrative tool.

For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

The error would show up at regular intervals in clusters (4-12 at roughly the same time) and there would be a few more with other usernames and other class ids. I had two fully functional farms with 3 SharePoint servers each and a number of standalone development machines. They all exhibited similar behavior.

The error listed above means that the user running the Central Administration web application doesn’t have access to activate (instantiate) the IIS WAMREG admin Service object (search the registry for the CLSID).
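
If you want to confirm which component a CLSID from such an event belongs to, a registry query along these lines will show its registered name (using the CLSID from the event above):

reg query "HKCR\CLSID\{61738644-F196-11D0-9953-00C04FD919C1}" /ve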

Strangely enough I didn’t observe any functional errors in the farms as a result of these errors – nothing seemed amiss (plenty of stuff didn’t work, but nothing directly related to this).

An important note here is that the service users used in the farm are all standard domain accounts and only given additional local rights by the SharePoint installer and Central Administration (The one exception is that “aspnet_regiis -ga IIS_WPG” was executed after SharePoint install and initial configuration).

The following procedure removes the errors from the event log without compromising the entire security setup (yes, assigning administrative rights to the service users would do the trick too) and has been verified by Microsoft consulting services.

On each SharePoint server do the following:

  1. Click Start, Click Run, type “dcomcnfg” and click ok
  2. Expand Component Services / Computers / My Computer / DCOM Config

  3. Right click IIS WAMREG admin Service and choose Properties
  4. Click the Security tab
  5. Click Edit under Launch and Activation Permissions

  6. Click Add
  7. In the Select Users, Computers or Groups type computername\WSS_WPG and
    computername\WSS_ADMIN_WPG

  8. Click ok
  9. In the Permissions for UserName list, select the Allow check box for Local Activation

  10. Click Ok twice.
  11. Go back to the main Component Services window, right click the “netman” node and select Properties
  12. Click the security tab
  13. Click Edit under Activation Permissions
  14. Click Add on the Launch Permissions dialog
  15. Enter “NETWORK SERVICE” in the edit box
  16. Click Ok
  17. Enable all the checkboxes for the NETWORK SERVICE

  18. Click Ok twice
  19. Finally, run “IISReset”

That should be it!

A few less event log errors to worry about – there are plenty left on a reasonably complex SharePoint farm…

As a side note: the above error shows up for other applications as well – I’ve heard about it for Exchange servers, and more applications are probably affected. In that case you’ll need to search the registry for the actual DCOM application and assign the rights to another local group (or username as a last resort).