Use CSOM from PowerShell!!!


The SharePoint 2010/2013 Client Object Model (CSOM) offers an alternative way to do basic read/write (CRUD) operations in SharePoint. I find that it is superior to the “normal” server object model/cmdlets for:

  1. Speed
  2. Memory
  3. Execution requirements – it does not need to run on your SharePoint production server and does not need Shell access privileges. Essentially you can execute these kinds of scripts with normal SharePoint privileges instead of sysadmin privileges

And, to be fair, it is inferior in the types of operations you can perform – it is essentially limited to CRUD operations, especially in the SharePoint 2010 CSOM.

This post is about how to use it in PowerShell and a comparison of the performance.

How to use CSOM

First, there is a slight problem in PowerShell (v2 and v3): it cannot easily call generic methods such as ClientContext.Load<T>. It simply cannot figure out which overloaded method to call, so we have to help it a bit.
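For the curious: you can also work around it without a helper class by closing the generic method through reflection. A quick sketch, assuming $context and $web already exist:

$loadDef = [Microsoft.SharePoint.Client.ClientContext].GetMethod("Load")
$load = $loadDef.MakeGenericMethod($web.GetType())
$load.Invoke($context, @($web, $null))   #$null stands in for the optional 'retrievals' params array

It works, but it is ugly to read and write at every call site – hence the helper class approach below.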

The following is the function I use to include the CSOM dependencies in my scripts. It simply loads the two Client dlls and creates a new version of the ClientContext class that doesn’t use the offending “Load<T>(T clientObject)” method.

I nicked most of this from here, but added the ability to load the client assemblies from a local directory (falling back to the GAC) – very useful if you are not running on a SharePoint server.

$myScriptPath = (Split-Path -Parent $MyInvocation.MyCommand.Path) 

function AddCSOM(){

     #Load SharePoint client dlls, from the script folder first, then the GAC
     #LoadFile throws on a missing file, so wrap it to let the GAC fallback kick in quietly
     $a = $null
     $ar = $null
     try { $a = [System.Reflection.Assembly]::LoadFile("$myScriptPath\Microsoft.SharePoint.Client.dll") } catch {}
     try { $ar = [System.Reflection.Assembly]::LoadFile("$myScriptPath\Microsoft.SharePoint.Client.Runtime.dll") } catch {}

     if( !$a ){
         $a = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
     }
     if( !$ar ){
         $ar = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
     }

     if( !$a -or !$ar ){
         throw "Could not load Microsoft.SharePoint.Client.dll or Microsoft.SharePoint.Client.Runtime.dll"
     }
    
    
     #Add overload to the client context.
     #Define new load method without type argument
     $csharp = "
      using Microsoft.SharePoint.Client;
      namespace SharepointClient
      {
          public class PSClientContext: ClientContext
          {
              public PSClientContext(string siteUrl)
                  : base(siteUrl)
              {
              }
              // need a plain Load method here, the base method is a generic method
              // which isn't supported in PowerShell.
              public void Load(ClientObject objectToLoad)
              {
                  base.Load(objectToLoad);
              }
          }
      }"

    
     $assemblies = @( $a.FullName, $ar.FullName, "System.Core")
     #Add dynamic type to the PowerShell runspace
     Add-Type -TypeDefinition $csharp -ReferencedAssemblies $assemblies
}

And in order to fetch data from a list you would do:

AddCSOM

$context = New-Object SharepointClient.PSClientContext($siteUrl)

#Hardcoded list name
$list = $context.Web.Lists.GetByTitle("Documents")

#ask for plenty of documents, and the fields needed
$query = [Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery(10000, 'UniqueId','ID','Created','Modified','FileLeafRef','Title') 
$items = $list.GetItems( $query )

$context.Load($list)
$context.Load($items)
#execute query
$context.ExecuteQuery()


$items |% {
          Write-host "Url: $($_["FileRef"]), title: $($_["FileLeafRef"]) "
}

It doesn’t get much easier than that (when you have the AddCSOM function, that is). It is a few more lines of code than you would need with the server OM (the Load and ExecuteQuery calls) but not by much.

The above code works with both 2010 and 2013 CSOM.
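Writes are just as simple. A minimal sketch of an update – the list name, item ID and field value are of course placeholders:

AddCSOM

$context = New-Object SharepointClient.PSClientContext($siteUrl)
$list = $context.Web.Lists.GetByTitle("Documents")

$item = $list.GetItemById(1)   #assumes an item with ID 1 exists
$item["Title"] = "Updated from PowerShell"
$item.Update()
$context.ExecuteQuery()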

Performance Measurement

To check the efficiency of the Client object model compared to the traditional server model I created two scripts and measured the runtime and memory consumption:

Client OM:

param 
(
[string]$listName = $(throw "Provide list name"),
[string] $siteUrl = $(throw "Provide site url")
)

AddCSOM

[System.GC]::Collect()
$membefore = (get-process -id $pid).ws

$duration = Measure-Command {

          $context = New-Object SharepointClient.PSClientContext($siteUrl)
         
          #Hardcoded list name
          $list = $context.Web.Lists.GetByTitle($listName)
         
          #ask for plenty of documents, and the fields needed
          $query = [Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery(10000, 'UniqueId','ID','Created','Modified','FileLeafRef','Title') 
          $items = $list.GetItems( $query )
         
          $context.Load($list)
          $context.Load($items)
          #execute query
          $context.ExecuteQuery()
         
         
          $items |% {
                  #retrieve some properties (but do not spend the time to print them)
                  $t = "Url: $($_["FileRef"]), title: $($_["FileLeafRef"]) "
          }
         
}

[System.GC]::Collect()
$memafter =  (get-process -id $pid).ws

Write-Host "Items iterated: $($items.count)"
Write-Host "Total duration: $($duration.TotalSeconds), total memory consumption: $(($memafter-$membefore)/(1024*1024)) MB"

Server OM:

param 
(
[string]$listName = $(throw "Provide list name"),
[string] $siteUrl = $(throw "Provide site url")
)

Add-PsSnapin Microsoft.SharePoint.PowerShell -ea SilentlyContinue

[System.GC]::Collect()
$membefore =  (get-process -id $pid).ws

$duration = Measure-Command {
          $w = Get-SPWeb $siteUrl 
          $list = $w.Lists[$listName]

          $items = $list.GetItems()
          $items |% {
                  #retrieve some properties (but do not spend the time to print them)
                  $t = "url: $($_.Url), title: $($_.Title)"
          }
}

[System.GC]::Collect()
$memafter =  (get-process -id $pid).ws

Write-Host "Items iterated: $($items.count)"
Write-Host "Total duration: $($duration.TotalSeconds), total memory consumption: $(($memafter-$membefore)/(1024*1024)) MB"

And executed them against a document library at 500 and at 1500 elements (4 measurements at each data point).
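If you want to reproduce the measurements, a simple loop does it (the script file names are placeholders for wherever you saved the two scripts above):

1..4 |% {
    .\MeasureClientOM.ps1 -listName "Documents" -siteUrl "http://yoursite"
    .\MeasureServerOM.ps1 -listName "Documents" -siteUrl "http://yoursite"
}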

The Results

Are very clear:

[Chart: total duration and memory consumption for the client and server OM scripts at 500 and 1500 items]

As you can see it is MUCH more efficient to rely on CSOM, and it scales a lot better. The server OM retrieves a huge number of additional properties, but it also has the benefit of going directly to the database instead of through the web server. Curiously, the CSOM version offers much more reliable performance, where the server OM varies quite a bit.

In addition you get around the requirement for Shell access for the PowerShell account and the need for server side execution. Might be convenient for many.

Conclusions

The only downside I can see with the CSOM approach is that it is unfamiliar to most and it does require a few more lines of code – if your specific need is covered by the API, of course.

It’s faster, more portable, less memory intensive and simple to use.

Granted, there are lots of missing APIs (especially in the 2010 edition) but every data manipulation need is likely covered. That is quite a bit after all.

So go for it :-)

Supercharge your (Resource) Efficiency with Macros


This is part 4 of 4 in a series on how to improve the way we usually work with resource (resx) files.

I generally like to – and do – use resource files for all string constants that are shown to end-users, however I do feel that it is needlessly cumbersome, therefore these posts:

  1. A Good Way to Handle Multi Language Resource Files
  2. Verify that your Resource Labels Exists
  3. Find/Remove Obsolete Resource Labels
  4. Supercharge your (Resource) Efficiency with Macros (this one)

Generally these issues are generic to .NET and not specific to SharePoint, though that is where I’m spending my time writing this.

What is it again?

This is a macro to increase developer productivity. It is a topic that lies very close to my heart, and a healthy fraction of my posts are about automation and productivity. Every developer should really know the macro features, especially the record/play shortcuts.

Working with resources (in SharePoint) is actually quite tedious – we write “$Resources:” a thousand times and occasionally cannot be bothered and skip the chore.

This is simply a Visual Studio macro to improve that workflow; usage is:

1. You highlight a string

2. Press a shortcut (of your choice)

3. Write a name for the resource key to create (optionally reuse an existing key if one exists with the same value)

4. And you’re done.

It will then add the resource key to your existing resx file and type in “$Resources:…” as needed. If it’s an aspx/ascx file it will also throw in some “<%=” and “%>” tags, so a highlighted string ends up as something along the lines of <%= ResourceLookup.SPGetLocalizedString("$Resources:MySolution,MyKey;") %>.

(It works very well for xml files too.)

The Gory Detail

This is a VB macro script that took an inordinate amount of time to write, mainly because the Visual Studio macro object model (DTE) is one of the most hideous, awkward APIs I’ve ever worked with. If there is one place the VS team could improve, it would be here – the macro IDE is also tedious to work with.

I’m very certain that there are ways to make the code perform better (speed is fine) and look better (is it important?) – let me know your nuggets in the comments; it’s always good to learn something new.

What you need to know is:

  1. It looks for a resx file with the same name as the target name of your project, i.e. the dll/wsp name
    • It only works on the culture independent resx file
    • It even adds a comment in the resource file as to where a key was first used :-)
  2. It works only for text editor windows
    • I have not found a way to make the selection work in the designer windows; specifically the feature designer would have been nice. This is annoying.

      The workaround is to simply choose to open the .feature file in the “XML (text) editor” (choose “Open with” in the file open dialog)

  3. You may need to customize the replacement pattern for cs and as?x files as it calls a simple utility method “ResourceLookup.SPGetLocalizedString” to translate the “$Resources:…” key (see code below)
  4. It handles quotes in/around the selection

At the moment it is only tested with VS2010 – I’m certain that changes need to be made for VS2013. In due time…

The resource lookup method I’m using is:

        public static string SPGetLocalizedString(string key)
        {
            if (!key.StartsWith("$Resources:"))
            {
                return key;
            }
            var split = key.TrimEnd(';').Remove(0, "$Resources:".Length).Split(',');

            if (split.Length != 2 || string.IsNullOrEmpty(split[1]))
            {
                return key;
            }

            return SPUtility.GetLocalizedString(key, split[0], (uint)System.Globalization.CultureInfo.CurrentUICulture.LCID);
        }

Download and Installation

It is simple:

  1. Download the Macro project here
  2. Dump it somewhere on your disk – the usual location is "Documents\Visual Studio 2010\Projects\VSMacros80"
  3. Choose “Tools / Macros / Load Macro Project” and pick the DGMacros.vsmacros file
  4. Bind a keyboard shortcut for the text editor to the macro. Just write “dgm” in the search box and pick the “ReplaceStringWithResource” macro

  5. Have a try and perhaps a look at point 3 of the gory details above ;-)

Find/Remove Obsolete Resource Labels


This is part 3 of 4 in a series on how to improve the way we usually work with resource (resx) files. At least the way my team and I work with them.

I generally like to – and do – use resource files for all string constants that are shown to end-users, however I do feel that it is needlessly cumbersome, therefore these posts:

  1. A Good Way to Handle Multi Language Resource Files
  2. Verify that your Resource Labels Exists
  3. Find Obsolete Resource Labels (this one)
  4. Supercharge your (Resource) Efficiency with Macros

Generally these issues are generic to .NET and not specific to SharePoint, though that is where I’m spending my time writing this.

So – Near the End of the Project – are those Resource Entries Still in Use?

The issue at hand is that you have hundreds of source files of various flavors and they are sprinkled with references to a number of resource files. When code is refactored or just deleted, what happens to the old resource labels? Likely nothing at all.

Are you happy with a ton of useless resource entries no longer in active use? What if you had to translate it to a couple of languages?

Quite obviously this is no biggie code/quality wise, but still…

The answer is that you run the PowerShell script below, which will check – and optionally fix – it ;-)

The Script

I made a small script to check and remove the excess resource entries to slim down those resource files a bit.

The script will, given a starting location (i.e. the root folder for your solution), do the following (a minimal sketch of steps 1 and 3 follows the list):

  1. Go through every code file (to be safe every file, except a list of binary extensions) and look for resource labels of the form “$Resources:filename,label_key”
  2. Search recursively for resx files
  3. For every one of those resx files it will go through the labels defined in it and flag those that it cannot find in any code file
  4. (Optionally) Do a “safemode” check where every file is searched for the resource label, i.e. necessary if you are using multiple/other resource lookup methods than the $Resources moniker
  5. (Optionally) If you choose you may remove them automatically but do make a dry run first to sanity check that you got the paths right and that you have all the source files to be searched
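The core of steps 1 and 3 boils down to something like this minimal sketch – the root path and the code file extensions are assumptions, and the real script does a lot more bookkeeping:

$root = "C:\Src\MySolution"
$pattern = '\$Resources:(?<file>[\w\.]+),(?<key>[\w\.]+)'

#Step 1: collect every "file,key" pair referenced in code
$used = Get-ChildItem $root -Recurse -Include *.cs,*.aspx,*.ascx,*.xml |
    Select-String -Pattern $pattern -AllMatches |
    ForEach-Object { $_.Matches } |
    ForEach-Object { "$($_.Groups['file'].Value),$($_.Groups['key'].Value)" } |
    Sort-Object -Unique

#Step 3: flag resx entries that no code file references
Get-ChildItem $root -Recurse -Filter *.resx | ForEach-Object {
    $resxName = [IO.Path]::GetFileNameWithoutExtension($_.Name)
    ([xml](Get-Content $_.FullName)).root.data | ForEach-Object {
        $label = $_.GetAttribute("name")
        if ($used -notcontains "$resxName,$label") { Write-Host "Possibly obsolete: $resxName,$label" }
    }
}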

Usage:

    PS> & VerifyResxLabels.ps1 "path to solution dir" [-remove] [-safemode]

(Tip: Download the script to somewhere, write an ampersand (&) and then drag the ps1 file into the PowerShell window and then drag the solution folder to the window.)

You’ll definitely want to pipe the output to a file.

Limitations

There are obviously some limitations

  • It will not: Check out the resx files from source control (but it will show the file error in the output)
  • It will not: Respect commented out code – it’s simple pattern matching so commented out code will be treated as actual code (hardly an issue)
  • Safemode is of necessity very slow and will likely find false positives, i.e. it will play it safe and keep entries that exist in some files, even though the same label may be used in a completely different context
    • It’s an O(n*m) algorithm (n = number of files, m = number of labels) – in my test with 1000 unique labels, 28 resx files and 2400 code files it takes a night

Download the script here

A Good Way to Handle Multi Language Resource Files


If you are working with SharePoint you should also be using resource (resx) files in your projects.

The problem is that SharePoint is quite annoying in the way it handles fallback to the default language resx when there is no culture specific version.

For instance, in a Danish site it will look for MySolution.da-DK.resx and fall back to MySolution.resx if it isn’t found.

Nice.

The Problem

However SharePoint will spam your ULS log with messages like:

04/20/2012 13:56:37.25     w3wp.exe (0x3758)     0x344C    SharePoint Foundation     General     b9y9    High     Failed to read resource file "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\Resources\MySolution.da-DK.resx" from feature id "(null)".    5b6991c9-5b39-4c95-b895-ed282bc00034 
04/20/2012 13:56:37.25     w3wp.exe (0x3758)     0x344C    SharePoint Foundation     General     8e26    Medium     Failed to open the language resource keyfile MySolution.    5b6991c9-5b39-4c95-b895-ed282bc00034 
04/20/2012 13:56:37.25     w3wp.exe (0x3758)     0x2C94    SharePoint Foundation     General     b9y3    High     Failed to open the file 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\Resources\MySolution.da-DK.resx'.    492a45e8-e416-4321-a83a-12f56c0341c1 
04/20/2012 13:56:37.25     w3wp.exe (0x3758)     0x2C94    SharePoint Foundation     General     b9y4    High     #20015: "" kan ikke åbnes: Filen eller mappen findes ikke.    492a45e8-e416-4321-a83a-12f56c0341c1 
04/20/2012 13:56:37.25     w3wp.exe (0x3758)     0x2C94    SharePoint Foundation     General     b9y4    High     (#2: "" kan ikke åbnes: Filen eller mappen findes ikke.)    492a45e8-e416-4321-a83a-12f56c0341c1 

And to make matters worse they are actually marked with “High” importance (the Danish messages simply say that the file or folder cannot be found). In my mind this is a very normal way to code (for non-SharePoint projects) and it should certainly not be logged with high importance. For now I am forced to look for the “Unexpected” importance level instead when troubleshooting.

The Simple Fix

There are a number of ways to correct this; I chose:

  1. Delete the duplicate localized version (MySolution.da-DK.resx) from source control
  2. Add a prebuild event to copy MySolution.resx to MySolution.da-DK.resx
    1. Write something like 'copy /Y "$(ProjectDir)Resources\$(TargetName).resx" "$(ProjectDir)Resources\$(TargetName).da-DK.resx"' (change to suit your needs and make sure the quotes are plain ASCII quotes – a multi-culture variant is sketched after this list)
  3. Make sure to include MySolution.da-DK.resx in the project and mark it with “Build Action: Content”, “Copy to Output Directory: Do not copy”.
  4. Choose to exclude MySolution.da-DK.resx from Source Control in Visual Studio (File / Source Control / Exclude …)

    (otherwise Visual Studio will likely face write protected files)
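If you target several cultures the copy step generalizes easily; a sketch of a prebuild PowerShell variant (the culture list and parameters are assumptions – pass $(ProjectDir) and $(TargetName) in from the build event):

param([string]$projectDir, [string]$targetName)
$cultures = 'da-DK', 'de-DE'
$cultures | ForEach-Object {
    Copy-Item "$projectDir\Resources\$targetName.resx" "$projectDir\Resources\$targetName.$_.resx" -Force
}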

All in all it should look somewhat like this:

[Screenshot: resx handling in the project]

Other Ways

Other possible solutions would be to create symbolic soft/hard links to the file (mklink.exe); however, care should be taken as TFS really doesn’t like symbolic links.

The trick with exclusion from source control would likely work too, however a file copy just seemed like a simpler solution…

Another creative way would be to mess around with the packaging options – it’s likely possible.

How to Make List Items Visible to Anonymous Users (in Search)


I had a funny little issue with showing list items from a custom list (with a custom view form) to anonymous users on a publishing site.

My good colleague Bernd Rickenberg insisted that I blog the resolution since he had found quite a few posts detailing the problem but no viable solutions ;-)

The Issue

You want to show a custom list to anonymous users. In our setting it was through search; your use case is likely different, but the issue remains the same with or without search.

Quite simply, (list) forms are generally not accessible to anonymous users when you have the lockdown feature enabled (ViewFormPagesLockDown). It is a critical feature to have enabled; otherwise you expose way too much information to SharePoint savvy users. Many of the solutions to this issue suggest turning it off.

It is fairly simple to test whether you have this problem. Start Firefox (assuming that you have not enabled the NTLM auto login setting) and hit the …/DispForm.aspx page for a specific item. If you receive the login prompt as anonymous in Firefox and sail through when logged in, this blog post is for you.

If the ordinary publishing pages are also not viewable then this blog is not for you.

Note: Never use IE to test for anonymous access; I can’t count the number of times consultants have been tricked into thinking it works because the auto login feature kicked in while on the corporate network.

The Resolution

Is fairly simple.

The basics for search are that the list must be made searchable in the list settings (it is by default), anonymous users must have access rights to the site, and the lockdown feature should(!) be enabled.

What the lockdown feature does is change the anonymous permission mask at the root site (which is inherited by all subsites by default). The mask is basically the permission level assigned to all anonymous users and is similar to the normal permission sets – but it is not editable in the UI.

The “View Application Pages” permission level is removed; see the image below for where to find it (for non-anonymous users):

[Screenshot: the permission level settings page – not available for anonymous users]
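Before changing anything you can check that lockdown is in fact active on the site collection:

Get-SPFeature -Site "http://yoursiteurl" | Where-Object { $_.DisplayName -eq "ViewFormPagesLockDown" }

If the feature shows up in the output it is enabled.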

The best option, security wise, is to break the permission inheritance for your particular list and then add the “View Application Pages” permission to the anonymous users. Do not do it at the web level, as you do not want to expose e.g. the All Site Content page.

The Script

You need to run the following PowerShell commands on one of your servers (replace the url and list name):

$web = get-spweb "http://yoursiteurl/subweb/subsubweb" 
$list = $web.Lists["ListName"]
$list.BreakRoleInheritance($true)
$list.AnonymousPermMask = $list.AnonymousPermMask -bor ([int][Microsoft.SharePoint.SPBasePermissions]::ViewFormPages) #binary or adding the permissions
$list.Update()
 

(Note: Do be careful if copying directly from this page to PowerShell – you need to re-type the quotes as WordPress mangles them)
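Afterwards you can sanity check that the bit made it into the mask, using the same property the script above modifies:

$list = (Get-SPWeb "http://yoursiteurl/subweb/subsubweb").Lists["ListName"]
($list.AnonymousPermMask -band [int][Microsoft.SharePoint.SPBasePermissions]::ViewFormPages) -ne 0   #should print True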

Script to Import/Export Metadata Termstore


Recently I’ve been using the Managed Metadata Store in SharePoint 2010 and been amazed by the lack of proper import/export functionality.

It feels like a blast from the past to be able to only import a CSV file… CSV?!? What happened to proper XML? What happened to export? What happened to being able to transfer (meta)data between farms (like test and production)? The built-in import insists on creating new term sets instead of updating existing ones (and yes, your managed metadata linked site columns do in fact store a strong reference, not just a name, so you’ll lose the link).

I couldn’t find any existing Powershell commandlets to the rescue either.

I couldn’t readily bing ;-) any usable scripts for this.

What I did

So I built a PowerShell script that takes a CSV file, imports it into the Term Store and merges it with any existing terms already there.

CSV?!?? Yeah…

The point is that you can then still use the CSV file you likely already hold. You can use the TermSetImporter to export CSV files from your existing environment.

If you are starting greenfield, then I recommend using Excel with some macros to create the term sets (then your users can create them instead of you), or you might just let your users loose in the term store manager.
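To give a feel for what the script does, here is the merge idea in miniature – first-level terms only, with the group/term set/file names as placeholders (the real script also handles the term hierarchy, LCIDs and encoding):

$site = Get-SPSite "http://yoursiteurl"
$session = Get-SPTaxonomySession -Site $site
$store = $session.TermStores[0]
$set = $store.Groups["MyGroup"].TermSets["MyTermSet"]

Import-Csv "terms.csv" | ForEach-Object {
    $name = $_.'Level 1 Term'.Trim()   #trim! see the notes below
    if (-not ($set.Terms | Where-Object { $_.Name -eq $name })) {
        [void]$set.CreateTerm($name, 1033)
    }
}
$store.CommitAll()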

How to use

First download my small script and the sample excel and CSV files.

Second, fire up PowerShell (on a SharePoint server) and write:

. ./MergeTermSets.ps1 csvfile groupname urlForASharePointSite

Do remember the “dot space” at the start of the line. The second “./” is just the path for the ps1 file in this case.

The urlForASharePointSite argument is optional and will default to http://localhost:2010, which likely corresponds to a valid SharePoint Central Admin site on 50% of all SharePoint installations. Watch the output log; if something goes wrong, look for errors in your CSV file and/or check whether the managed metadata store is connected properly.

Notes:

  • I’ve tried to do some tricks to handle encoding properly, and I also trim spaces, which otherwise really cause the term store to stumble (the store trims spaces itself, so every subsequent comparison the script does would fail).
  • Note that the LCID and the parent terms need to be set on every line in the Excel sheet. Don’t blame me, I didn’t make that format ;-)
  • Terms are only added, not updated, i.e. I don’t try to keep stuff like descriptions in sync
  • No fancy stuff: no merging, no deletions, no deprecation etc.

Hope it’s useful for you too.

SharePoint Advanced Large Scale Deployment Scripting – “Dev and QA gold-plating” (part 3 of 3)


I have been battling deployments for a while and finally decided to do something about it.

;-)

The challenge is to handle a dozen WSP packages on farms that host 20-50 web applications (or host header named site collections), to complete the deployments in a timely manner, and to ensure that the required features are uniformly activated every time – while minimizing the human error element for the poor guy deploying through the night.

The deployment scripts require PowerShell v2 and are equally applicable to both SharePoint 2007 and 2010 – the need is roughly the same and the APIs are also largely unchanged. Some minor differences in service names are covered by the code.

To keep this post reasonable in length I’ve split the subject into three parts:

Part 1 is the main (powershell) deployment scripts primarily targeted at your test/QA/production deployments

Part 2 is the scripts and configuration for automatic large scale feature activations

Part 3 (this part) is about gold-plating the dev/test/QA deployment scenario where convenience to developers is prioritized while maintaining strict change control and records

Note: This took a long time to write and it will likely take you, dear reader, at least a few hours to implement in your environment.

If you have not done so already, read the two other parts first.

The Challenge

When you have a number of developers working on the same service line you need to be able to keep strict control of your QA and test environment.
So either you have one guy as the gatekeeper or you automate as much as possible. I’m the latter type of guy, so I’ve developed a “one-click” deployment methodology where the focus is:

  1. Simple and consistent deployment
  2. Easy collection of solutions for the next production deployment – you should deploy the exact code that you tested, not a “fresh” build(!)

The Solution

Our build server is set up to produce date labelled zip files with the wsp and manual installer (exe) within a folder of the same name.
I’m sure yours is different; therefore the scripts work with a solution drop directory that accepts both zip files (that include a wsp file) and plain wsp files.
The folder structure is up to you – the script just searches recursively for zip and wsp files.

The script files are:

03 DeployAndStoreDropDir.bat: The starting batch file that will execute the deployment process as a specific install user. That way the developers can log in with their personalized service accounts and use a specialized one for deployment purposes. You need to change the user name in this batch file. The first time you run it you need to enter the password for your install account; subsequent runs will remember it ("runas /savecred").

03b DeployAndStoreDropDirAux.bat: The batch file doing half the work. It’ll archive old log files in ".\ArchivedDeploymentLogFiles" (created if not present) and execute "QADeploymentProcess.ps1", which does the hard work. At the end it’ll look for an error log and write an alert telling you to go look at it. Saves a lot of time – I don’t go looking in log files if I don’t get that message.

SharePointLib\QADeploymentProcess.ps1: The script that handles the unzipping and storage of the zip/wsp files and executes both the deployment and feature activation scripts. The goal is to have a simple drop dir for WSPs/zips and a single storage location, "SolutionsDeployed", for the currently deployed code. The "SolutionsDeployed" folder is updated at every deployment, so that it only holds the latest version of each deployed wsp. You do not need to drop all wsps in the drop dir every time; the script does not just delete the SolutionsDeployed folder at every deployment.
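In PowerShell terms the flow of the two batch files is roughly this – the error log file name is an assumption:

$archive = ".\ArchivedDeploymentLogFiles"
if (-not (Test-Path $archive)) { New-Item $archive -ItemType Directory | Out-Null }
Get-ChildItem .\*.log | Move-Item -Destination $archive   #archive old log files

.\SharePointLib\QADeploymentProcess.ps1   #deploy the wsps and activate features

if (Test-Path .\*error*.log) { Write-Host "ALERT: an error log was written - go read it!" }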

This is how the files are moved around for a couple of WSPs and a zip file:

[Diagram: drop dir contents unzipped and moved to SolutionsDeployed]

In short:

  1. If you want to start it with a specific install user, just execute “03 DeployAndStoreDropDir.bat”
  2. If you want to start it in your own name, execute “03b DeployAndStoreDropDirAux.bat” and perhaps delete the other bat file.
  3. If you want to modify how it works go to the "QADeploymentProcess.ps1" file.
  4. If you want to work with zip files they can have the name and internal folder structure that you like. The one important thing is that they should only contain one WSP file. It is pretty smart; at the next deployment your zip file may be named differently (e.g. with a date/time in the name) and it will still remove the old version from “SolutionsDeployed” and store your new version with the new folder/zip name.
  5. At the end of your build iteration SolutionsDeployed will contain the latest version of your deployed code (provided that you emptied it at the last production deployment)

Closing Notes

It is not a big challenge to connect the remaining dots and have your build server do the deployment for certain build types, giving you a “zero-click” deployment; however, I opted not to do it. It should be a conscious decision to deploy to your test and QA environments, not something equivalent to a daily checkin. Your mileage may vary.

I would generally provide these scripts to everyone in the team and encourage them to use them on their own dev box once in a while – it always pays to have your dev environment resemble the actual production environment, and it makes it feasible to keep the code (almost) in sync on all dev environments.

The challenge may be the complexity of the scripts and the effort to understand them – you shouldn’t rely too much on stuff you don’t understand ;-)

Feel free to adjust the script a bit (likely QADeploymentProcess.ps1) to suit your environment.

If you make something brilliant then let me know so I can include it in my “official” version. Please also leave credits in there so it’ll be possible for your successors to find the source and documentation.

Download

Grab the files here – part 1, 2 and 3 are all in there. I’ll update them a bit if I find any errors or make improvements.

Note: Updated Aug 29 2011, minor stuff.

Don’t forget to change the user name in “03 DeployAndStoreDropDir.bat” to your “install” account.
