PowerShell “here” strings

I use verbatim strings in C# quite a lot, even when there's no real need for them (no escape sequence, no line break) and I could simply skip the extra @ at the beginning.

I definitely like the fact that there’s a similar syntactic behavior in PowerShell too.

These are called here-strings.

Although the two features have much in common, you have to pay a bit more attention when using here-strings in PowerShell, since the behavior is not *exactly* the same as in C#.

For example, line breaks are mandatory.

The "start sequence" tokens are indeed @" + CRLF or @' + CRLF.

The "end sequence" tokens are "@ and '@ respectively, and they must appear at the very beginning of a line (i.e. CRLF + "@ or CRLF + '@, with nothing before them).

And in case you are wondering what's the difference between the single-quote and the double-quote syntax, well… the single-quoted version treats everything literally, whereas the double-quoted version expands variables and escape sequences.
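A minimal sketch of both forms ($name is just an illustrative variable):

```powershell
$name = 'World'

# Double-quoted here-string: @" must be followed by a line break,
# and the closing "@ must start at the very beginning of a line.
# Variables and escape sequences are expanded.
$expanded = @"
Hello, $name!
"@

# Single-quoted here-string: everything is taken literally,
# so $name stays exactly as typed.
$literal = @'
Hello, $name!
'@

$expanded   # Hello, World!
$literal    # Hello, $name!
```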

Strings have always been tricky.

When I started writing C/C++ code against the Windows API I was struggling with LPSTR, LPTSTR, LPCSTR, LPCTSTR to name a few (probably any combination of characters here would be a valid name, I’m sure there’s a typedef or a MACRO definition somewhere even for LPABCSTR :)).

Then came COM and BSTR and smart pointers and… and that was real pain!!

How to quickly identify large lists with PowerShell

It's easy, and it's just one line (shown here with word wrapping :)).

Get-SPWebApplication http://webappurl | Get-SPSite | Get-SPWeb |% { $_.Lists | select @{n="Url";e={$_.RootFolder.ServerRelativeUrl}}, Title, ItemCount } | sort ItemCount -Descending

Here I’m traversing the contents of a Web Application in order to iterate over each List (in each Web, in each Site of the Web Application):

Get-SPWebApplication http://webappurl | Get-SPSite | Get-SPWeb |% { $_.Lists | … }

Then, I'm using the Select-Object cmdlet to add a custom object to the pipeline, built from the list data using both standard properties (Title, ItemCount) and a calculated property for the list URL:

{ $_.Lists | select @{n="Url";e={$_.RootFolder.ServerRelativeUrl}}, Title, ItemCount }

Finally, it’s just a matter of manipulating the custom objects collections, applying sorting, filtering, grouping or any other set operations according to your specific needs.

Here's the kind of output you may receive: one row per list, with its server-relative URL, title and item count, sorted by item count.

In addition, you may choose to bring this data into Excel for further analysis, which is extremely easy to achieve by adding the Export-Csv cmdlet at the end of the pipeline.
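The same pipeline with the export step appended might look like this (the output path is purely illustrative):

```powershell
Get-SPWebApplication http://webappurl | Get-SPSite | Get-SPWeb |
    % { $_.Lists | select @{n="Url";e={$_.RootFolder.ServerRelativeUrl}}, Title, ItemCount } |
    sort ItemCount -Descending |
    Export-Csv -Path C:\temp\lists.csv -NoTypeInformation
```

The -NoTypeInformation switch suppresses the #TYPE header line, which would otherwise get in the way when opening the file in Excel.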

Here’s a sample Excel spreadsheet generated from the data above, where I’ve applied a custom number filter to the data set and I have created a line chart on the data series:

Grouping by a multi-value Managed Metadata Field using XSLT and a DataFormWebPart

First and foremost, I think that there’s a reason why grouping (generally speaking) is usually disabled when the group-by criteria is on a multi-value field.

Performance is probably the key point here: especially when you are dealing with semi-structured data (i.e. SharePoint list items), you need to extract a set of unique values first, and then, in a second pass, apply your filtering logic.

Also, but this is just my personal opinion, having a result set where a single item may appear more than once (if it "belongs" to more than one group) is far from optimal. I would definitely prefer to rely on refiners, rather than on pre-populated item groups.

And finally… well, it's a little bit tricky :)

Here’s what I have done.

Scenario.

A document library with a Managed Metadata field which allows multiple values, named "OU" (which stands for, guess what, Organizational Unit :)).

A DataFormWebPart fetching all items from that library.

And, of course, XSLT!

Implementation.

I started implementing a recursive template that parses the value of the taxonomy field and produces an XML tree.

<xsl:template name="extractOU">
  <xsl:param name="list" />
  <xsl:variable name="thisId" select="substring-before($list, ';#')" />
  <xsl:variable name="thisName" select="substring-before(substring-after($list, ';#'), ';#')" />
  <xsl:variable name="left" select="substring-after(substring-after($list, ';#'), ';#')" />
  <ou id="{$thisId}" name="{normalize-space($thisName)}" />
  <xsl:if test="$left">
    <xsl:call-template name="extractOU">
      <xsl:with-param name="list" select="$left" />
    </xsl:call-template>
  </xsl:if>
</xsl:template>

When applied to a field value such as:

1;#HR;#2;#Finance;#3;#Marketing;#

It would eventually produce an XML fragment similar to this one:

<ou id="1" name="HR" />
<ou id="2" name="Finance" />
<ou id="3" name="Marketing" />

I applied this template to the overall result set which is returned by the list query, saving the output into a variable:

<xsl:variable name="items" select="/dsQueryResponse/Rows/Row" />

<xsl:variable name="groupsTree">
  <xsl:for-each select="$items">
    <xsl:call-template name="extractOU">
      <xsl:with-param name="list" select="concat(normalize-space(@OU.), ';#')" />
    </xsl:call-template>
  </xsl:for-each>
</xsl:variable>

This would generate an XML fragment with duplicate elements:

<ou id="1" name="HR" />
<ou id="2" name="Finance" />
<ou id="3" name="Marketing" />
<ou id="2" name="Finance" />
<ou id="3" name="Marketing" />
<ou id="1" name="HR" />
<ou id="2" name="Finance" />

In order to extract a set of unique rows (a "distinct" operation) and to iterate over the output, I first had to load the XML tree into a node-set:

<xsl:variable name="groups" select="msxsl:node-set($groupsTree)/*" />

I used the Microsoft msxsl:node-set() extension function, which is not necessarily the best way to achieve this goal, but unfortunately I could not use custom XSLT extensions (without extending the DataFormWebPart, which was out of scope here), nor could I use XSLT 2.0, which is not supported either.
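For msxsl:node-set() to resolve, the Microsoft extension namespace has to be declared on the stylesheet element; a minimal sketch:

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:msxsl="urn:schemas-microsoft-com:xslt">
  <!-- templates, keys and variables go here -->
</xsl:stylesheet>
```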

On that node set I could finally apply standard XSLT grouping techniques.

In this case I chose the Muenchian method.

I defined a key based on the name attribute of my ou element:

<xsl:key name="organizational-units" match="ou" use="@name" />

And I finally used that key to filter the items within the overall set:

<xsl:for-each select="$groups[count(. | key('organizational-units', @name)[1]) = 1]">
  <xsl:sort select="@name" />
  <xsl:variable name="id" select="@id" />
  <xsl:variable name="ou" select="@name" />
  <div class="certification well">
    <h2 class="subtitle text-center">
      <xsl:value-of select="$ou" />
    </h2>
    <ul class="unstyled inline">
      <xsl:for-each select="$items[contains(@OU., concat($id, ';#', $ou))]">
        <xsl:sort select="@Title" order="descending" />
        <li style="width:170px">
          <div class="wrapper" style="position:relative; text-align:center">
            <xsl:value-of select="@CertificationIcon0" disable-output-escaping="yes" />
            <div>
              <a title="{@Title}" href="{@FileRef}" target="_blank">View</a>
            </div>
          </div>
        </li>
      </xsl:for-each>
    </ul>
    <div style="clear:both"></div>
  </div>
</xsl:for-each>

As a final note, performance implications are quite noticeable here.

I have n+1 operations on the result set (1 to get the groups, n to extract values for the specific group I’m processing).

And I use a contains() predicate to get the items back.

I could accept this since this library will only ever hold a handful of items (a few dozen), but this may not always be the case 😛