On an internal mailing list, the question came up of how to take a stream of objects going through the pipeline and break it up into arrays of a given chunk size: if you had 100 objects on the way in and grouped in chunks of 10, you'd get 10 arrays out; in chunks of 5, 20 arrays out; and so on. Many of you will recognize this as very much like xargs -n.
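
For comparison, here's what xargs -n does at a Unix shell: it reads whitespace-separated items from stdin and invokes the given command with at most n of them at a time, so the last invocation gets whatever is left over.

```shell
# Feed ten numbers to xargs, three per invocation of echo;
# the final line carries the remainder (just "10").
printf '%s\n' 1 2 3 4 5 6 7 8 9 10 | xargs -n 3 echo
# Prints:
#   1 2 3
#   4 5 6
#   7 8 9
#   10
```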

George Xie answered this one with a "grouppipe" function that handles it: gather the input objects into an array until it reaches the target size, then emit that array. Repeat until the input is exhausted, and when the function finishes, send out the partially-filled array if the number of elements wasn't evenly divisible by the chunk size.
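
The same gather-and-flush logic can be sketched as a Python generator (a cross-language illustration of the idea, not George's code; his actual PowerShell function appears below):

```python
def group_pipe(items, size):
    """Gather items into lists of `size`, yielding each full list,
    then any partially-filled remainder at the end."""
    chunk = []
    for item in items:
        chunk.append(item)
        if len(chunk) >= size:
            yield chunk      # emit a full chunk
            chunk = []
    if chunk:                # leftover when not evenly divisible
        yield chunk

print(list(group_pipe(range(1, 11), 4)))
# → [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10]]
```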

You can see how it handles the not-evenly-divisible case here:

PS C:\> 1..100 | grouppipe 7 | select -last 1
99
100
PS C:\> 1..100 | grouppipe 8 | select -last 1
97
98
99
100
PS C:\> 1..100 | grouppipe 9 | select -last 1
100
PS C:\> 1..100 | grouppipe 10 | select -last 1
91
92
93
94
95
96
97
98
99
100
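
The size of that last chunk is just 100 mod the chunk size, or the full chunk size when it divides evenly, which a quick check confirms against the output above:

```python
# Last-chunk size for 100 items: the remainder of the division,
# or the full chunk size when the count divides evenly.
for size in (7, 8, 9, 10):
    print(size, 100 % size or size)
# → 7 2
#   8 4
#   9 1
#   10 10
```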


function grouppipe ([int] $size)
{
    begin
    {
        # Buffer that accumulates pipeline input until it holds $size elements.
        $currentArray = @()
    }
    process
    {
        $currentArray += $_
        if ($currentArray.Count -ge $size)
        {
            # The leading comma wraps the array in a one-element array, so the
            # pipeline unrolls it back into the original array instead of
            # scattering its elements one at a time.
            ,($currentArray)
            $currentArray = @()
        }
    }
    end
    {
        # Emit any leftover, partially-filled chunk.
        if ($currentArray)
        {
            ,($currentArray)
        }
    }
}