Step up your FOREACH Game

Last night Bruce Payette and I were interviewed on the PowerScripting Podcast with Hal Rottenberg and Jonathan Walz.  It was a blast!  There were a ton of people connected, firehosing questions and comments into the chat window.  There were lots of comments and discussions about Twitter.  I confessed to having never used Twitter.

This morning I got an email from Jeffery Hicks recommending a site called TweetGrid, which lets you "monitor the tweet stream" without a Twitter account.  (If I get an account, do I become a Twit?)  He sent me a link to a page with search panels for PowerShell and CTP3.

I'm not ready to recommend it yet, BUT the very first item in the stream was from depping, with a link to a blog post by Hugo Peeters about a script that adds RDM size info to a VI client using PowerShell.  I love looking at other people's scripts - they give me a view into how their authors think about the world, help me understand what patterns are in use, and give me ideas about how to evolve the language and engine.

Hugo wrote a nice script - it is very easy to read and maintain.  One thing caught my eye and I left him a note about it.  He wrote:

$VMs = Get-VM
ForEach ($VM in $VMs)

This is perfectly fine code - it is the traditional way foreach loops are used.  But foreach loops in PowerShell are much more than that.  Wherever I have code like that, I now convert it to the following form:

Foreach ($VM in Get-VM)

In PowerShell, the COLLECTION part of the statement can be ANY EXPRESSION OR STATEMENT THAT GENERATES A COLLECTION.  So you can use statements in your foreach.  I don't have VMware so I can't give you good VM examples, but here are some examples of what I mean:

foreach ($p in Get-Process *ss)
{ $p.Name }
# A cmdlet is a statement which generates a collection.  Don't worry if it
# only generates 1 value - PowerShell casts it to a collection
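
The single-value case that the comment above describes can be seen directly - a minimal sketch (using -Id $PID only to guarantee exactly one result):

```powershell
# Get-Process -Id $PID returns exactly one Process object,
# yet foreach still happily iterates over it (once)
foreach ($p in Get-Process -Id $PID)
{ $p.Name }
```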

foreach ($p in Get-Process | where {$_.handles -ge 500})
{ $p.Name }
# A pipeline is a statement as well

foreach ($p in Get-Process | where {$_.handles -ge 500} | sort handles)
{ $p.Name }
# There are no restrictions on what you do in your statement

If you want to do something that takes multiple statements, you can put them inside a $() and they are treated as a single statement.
foreach ($p in $($max = 700; gps | where {$_.handles -le $max}))
{ $p.Name }

Here is why I like this usage of foreach:

  1. It is more readable/maintainable.
  2. It is smaller.  If the code is readable, smaller code is generally better.  There are fewer things to keep track of and fewer things that can go wrong.
  3. I don't have to think up a good variable name.
  4. I get a charge out of the fact that you can do this in PowerShell (not a good reason, but it's honest. :-) ).

Thanks for the nice script Hugo!

Thanks for the recommendation Jeffrey!

Thanks for a great podcast Hal and Jonathan!

Cheers!

Jeffrey Snover [MSFT]
Windows Management Partner Architect
Visit the Windows PowerShell Team blog at:    http://blogs.msdn.com/PowerShell
Visit the Windows PowerShell ScriptCenter at:  http://www.microsoft.com/technet/scriptcenter/hubs/msh.mspx

  • To me, one of the best things in PowerShell is the fact that "readable" is literally readable. I mean "foreach ($process in Get-Process | where {$_.handles -ge 500})" is readable for almost anybody as "For each process (that results from getting the processes) where the handles are greater than or equal to 500".

    Gotta love it!

    I have less of a problem thinking of variable names. I always think of what the variable represents and use the plural for collections. Your "trick" does prevent me from using the silly line "for each process in processes" ( ForEach ($Process in $Processes){...} ).

    Hugo

    (Also non-present on Twitter, but you can find my blog here: http://www.peetersonline.nl )

  • So one question remains: Why or when should we use foreach and when should we pipe the collection to ForEach-Object?

  • Remember, I'm JeffERy (or plain Jeff will do). My Mom wanted me to be different.  No comments, please :-)

  • Jeffrey,

    go ahead with those "simple" articles! This is the piece of great PowerShell mosaic I love so much. Your last posts are superb for me :)

    Thank you very much,

    David

  • With the anti-pattern you showed you have to check if the collection is set to $null (i.e. no objects were selected) or you may run the foreach with the iterator set to $null.  Putting the selector into the "in" clause of the foreach loop automatically removes the need to check the collection variable.
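
A sketch of the difference this comment describes, assuming a process name that matches nothing (in early PowerShell versions a $null variable makes the loop body run once, while a command that returns nothing iterates zero times):

```powershell
# Variable form: if nothing matches, $procs is $null and the loop
# body can run once with $p set to $null - hence the guard
$procs = Get-Process NoSuchProcess -ErrorAction SilentlyContinue
if ($procs -ne $null)
{
    foreach ($p in $procs) { $p.Name }
}

# Inline form: an empty result simply iterates zero times - no guard needed
foreach ($p in Get-Process NoSuchProcess -ErrorAction SilentlyContinue)
{ $p.Name }
```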

  • Will the TMG Management Console Fall Victim to PARS?

    I had an interesting conversation last week with someone who's been in the Microsoft computing industry for almost two decades. The topic was PowerShell and Exchange Server 2007. He was saying that he was surprised that the uptake of Exchange 2007 was much slower than he anticipated, given the significant improvements included with the new e-mail server. He suspected that the reason for the slow upgrade cycle was the new requirement for 64bit computing, thus taking away the in-place upgrade option.

    I said that hardware issues might be the case, but I doubted it. New hardware is relatively cheap, and when you look at the new features and capabilities included with Exchange 2007, new hardware seems to be a very low barrier to adoption. "No", I said. "It's not the hardware. It's the 'PowerShell Abdication of Responsibility Syndrome (PARS)'".

    PARS? What in the world is that? PARS is the reason why the Exchange Server 2007 management console is so dysfunctional. PARS is the reason why the upgrade cycle for Exchange Server 2007 is so much slower than you would expect for an otherwise impressive product. PARS is the reason why the Exchange developers and product team decided not to include a management interface on par with the Exchange 2003 management interface.

    PARS is a dangerous thing, because it plays into some basic insecurities a lot of Microsoft admins have regarding their skills sets. Many Microsoft admins became Microsoft admins because they preferred working with a rich graphical interface that allowed them to get their work done without having to learn a new programming or management language. These are smart people who enjoy their work with computers, but like anyone else who is not good with foreign languages, prefer to use a professionally developed operating system that is "fully baked" and does not leave it up to the customer to "finish the job".

    Given my background in medicine, and my understanding of the history of medicine, I have often thought the insecurities Microsoft admins have regarding command line interface management to be a strange thing. When Unix or Linux admins enter the room, the Microsoft admins often feel, or are made to feel, inferior because they do not know how to manage their machines using an arcane, undiscoverable, typo-ridden, antiquated command line interface.

    Why would this surprise someone coming with knowledge of medical history in the United States? Because at one time, the physician was like the Unix or Linux admin. He had to set up the X-ray machine himself, develop the X-rays himself and interpret the X-rays himself. If he needed a CBC (complete blood count), he had to count the red cells, white cells and platelets himself, he would have to do the differential himself (count different types of white cells), and manually record the results. And if blood chemistries needed to be done, he had to gather the blood himself, put together the reagents, and go through the long and laborious tasks required to figure out the levels of Sodium, Potassium, Chloride and Carbon Dioxide, as well as a few dozen more important elements and molecules.

    Today, all of these processes are automated with machines using computer technology. Today's physician does not need to read his own x-rays, today’s physician does not need to count his own CBCs, today's physician does not need to figure out his own blood chemistries. Machines with computer technology in them do this for him and he does not need to input a single line of code to make them work.

    That's right. None of the machines that physicians and technicians use require them to use a command line interface for them to work. Why? Because it would not be accepted by the customers. The customers would come back and say "Ah, excuse me. What is this? You need to finish this product before pawning it off on customers. There are supposed to be buttons to push so that everything it does works - you cannot foist your own lack of development efforts on me, your customer, and make me learn some kind of programming and scripting language. You must create a complete product and then sell it to me".

    You could never get away with PARS in medicine. The reason why the Exchange Team got away with PARS is because our industry is very immature. Unlike medicine, which can be considered a mature industry with a well defined division of responsibility, IT has not recognized a division of responsibility that allows optimal efficiencies for practitioners. Instead, today's IT Pro has to be lab tech, x-ray tech, Coulter counter, radiologist and general practitioner.

    Given these unrealistic expectations, you can see that any product group that leverages PARS to reduce their development costs (by creating underpowered and dysfunctional user interfaces) is doing a profound disservice to their customers. Not only that, but they are failing to help the industry mature.

    Why mention the horrors of PARS in the ISAserver.org newsletter? Because it is likely that the next version of the ISA firewall, The Forefront Threat Management Gateway (TMG) is going to include PowerShell support. My concern is that the ISA dev team, who I have considered the thought leaders when it comes to creating elegant, discoverable, operational and comprehensive user interface designs, will fall prey to PARS.

    How could a team so dedicated and so successful with management interfaces be susceptible to PARS?

    It would not take that much. For example, consider the stresses a dev team is under to get a product out in time. Then introduce a decision maker who does not appreciate the need to mature the industry, as medicine has matured. Finally, bring in a very small, but very vocal, subset of customers who demand command line management so that the management experience is similar to other firewalls.

    Such a "perfect storm" of negative events could turn the former "best of breed" interface included with the ISA 2004 and ISA 2006 products into something akin to the hamstrung and ungainly interface included with Exchange Server 2007.

    So, this month's editor's corner is a public plea to the TMG development team. Please, I understand that there are pressures on you to include PowerShell support in the TMG. I even admit that there might be a place where PowerShell support might be useful. But on behalf of tens of thousands of ISA firewall fans, I beg of you not to abdicate your responsibility for creating a world class user interface for the TMG firewall - a user interface that allows us to get 99.6% of our tasks accomplished through the UI, and that requires us to drop down into PowerShell only once or twice a year, and only in emergency circumstances.

    Please, do not go down the dark path of the Exchange Server 2007 product group!

    HTH,

    Tom

  • @Tom:

    We've never viewed PowerShell as a CLI VS GUI discussion but rather a CLI AND GUI one.  Specifically, world-class GUIs layered on top of CLI to ensure that everything you can do, you can automate.  

    One of the problems I've seen is that when a new technology comes on the scene people SWITCH to that technology instead of INTEGRATING it into existing systems.  That is how we got into the mess we find ourselves in now.  People SWITCHED to GUIs and (largely) abandoned CLIs.  Honestly though, I don't think that there is a risk of that happening here.  GUIs are what made Microsoft great - they are part of our DNA.  All we are doing is implementing them differently to ensure that we can also automate things through CLIs and scripting.  That is how you drive costs down and increase quality and reliability.  Given the difficult economic times before us, I can think of few things more important.

    Now with regard to the specifics of your post.  First, I think the data does not support your assertions regarding Exchange adoption - I'm not at liberty to provide details but my understanding is that it is going very well and that its support of PowerShell is one of the key factors in this.

    Let me be quick to acknowledge that PowerShell requires learning and that Exchange admins are the first wave of people that need to go through that.  Learning new stuff is always a pain in the butt.  That said, once you learn PowerShell, picking up the next set of PowerShell enabled products becomes a breeze.  I spoke to an admin at IT Forum who said just that.  He started with Exchange 2007 and then easily picked up SCVMM and SQL 2008 and was very productive very quickly with them.

    You also bring up the point of how much administration can be done in the GUI vs the CLI.  Now you could argue that 100.0% of operations should be done from the GUI.  That has historically been the thinking, so reasonable people can be on that page.  The problem with that model is that as products become more functional to cover more and more corner cases and scenarios, the GUI can get complex.  Exchange's thinking was to leave the advanced scenarios to CLIs to allow the GUI to be as simple and straightforward as possible.  I strongly agree with that approach.  Here I'll draw the distinction between the thinking and the implementation.  The Exchange team acknowledges that they left out a number of important scenarios from the GUI.  That doesn't mean that their thinking was wrong but rather that they didn't draw the line in the right place.  They are a great team and they listen to their customers, so they've addressed many of these issues with their SP1 release.

    Let me be absolutely clear on one point - the Exchange team has been, and continues to be, the thought leader in delivering world class administrative experiences.  You'll hear more about this as they roll out their next version of Exchange.  Other teams would benefit their customers by following the Exchange team's lead here.  The good news is that that has been the case, and the teams that have followed them have been doing excellent work.  Teams have truly impressed me by drilling into how customers actually use features and designing well thought out interfaces for them.

    Let me close by being clear on one point, Tom.  It sounds like you are the sort of admin who wants 100% of your UX delivered through a world class GUI.  We expect that as teams embrace PowerShell, you'll continue to have at least as good an experience as you've had in the past - and, more likely, a better one.

    Happy Holidays!

    Jeffrey Snover [MSFT]

    Windows Management Partner Architect

    Visit the Windows PowerShell Team blog at:    http://blogs.msdn.com/PowerShell

    Visit the Windows PowerShell ScriptCenter at:  http://www.microsoft.com/technet/scriptcenter/hubs/msh.mspx

  • @Tom,  

    What a bizarre road to a strange conclusion, and your analogy sucks too.  I have seen so many articles like this from you over the years, raising the specter of "SOMETHING OMINOUS", that I plain stopped reading your sites.

    We are itching to switch to Exchange 2007 in large part because of PowerShell.  We held off this year because of a contract win that required a significant portion of our company's IT staff to implement, not because of PowerShell.

    PowerShell is not the issue on the switch to Exchange 2007.  My co-workers are slowly learning it and most are eager as it looks to provide them a number of ad-hoc tools that they can assemble to their needs quickly and efficiently and re-use over and over again.

    Steven Peck

    http://www.blkmtn.org

  • Tom,

    I think Jeffrey's response is very thorough, but I would like to expand on one point.  Windows has a rich history of providing top of the line GUI interfaces.  Unfortunately, the automation story was very poor.  

    I am relatively new to the IT field (2 1/2 years) and found myself needing to complete a number of similar tasks repeatedly.  Fortunately for me, PowerShell was coming on to the scene as I was investigating automation technologies like Cygwin or Python.  Neither of those products offered the integration with the Windows environment that PowerShell did.  To make the pot even sweeter, product groups (like the Exchange team) were beginning to build management interfaces on top of PowerShell.

    By layering the GUI on top of PowerShell, teams provide great flexibility in IT infrastructure management.  The GUI is there for those that are comfortable with it.  The GUI also provides a "glide path" to creating automation solutions with PowerShell by showing the cmdlets run to complete a task (almost going back to the old idea of a wizard that walked you through and taught you to complete a task).  Finally, those admins who have embraced the command line can easily pick up new tool sets and use the discoverability innate in PowerShell to get up to speed with these new tools.

    To add even more to the story, software vendors have begun to implement this same strategy.  Companies like Special Operations Software have an Active Directory Users and Computers extension that has PowerShell cmdlets and shows you how to use them.  Quest offers PowerGUI, which provides an MMC-like feel, but is completely extensible in PowerShell (and the community has risen to the challenge, creating management interfaces for a number of products, including Exchange 2003 and WSUS).  A user of PowerGUI needs no PowerShell experience, but the PowerShell is there, right beneath the surface, and available to anyone to learn at their own pace (or the pace of their environment).  

    I think the Exchange team did a great job with the first major implementation of a GUI layered on PowerShell.  These will only get better as teams learn from experience and customer feedback.  Any time there is a change in how a product is managed, there are going to be issues and adjustments.  PowerShell is the future in Windows Administration, whether from the command line, hosted in a custom application, or lurking beneath the administrative GUI.

  • Not every admin wants to remember each and every command.

    Tom is right. Give both GUI and CLI options. Right now what the exchange team has done is CLI first, second and third.

    GUI was an after thought.

    Like all monopolies (GM, Ford etc once upon a time), Microsoft is doomed.

  • @Mehul

    One of the most beautiful things about PowerShell is that you don't have to remember every command.  PowerShell's discoverability mechanisms allow you to find the commands when you need them.  

    If you need a command to work with a particular type of object, try something like "Get-Command mail -commandtype cmdlet".  Get-Help can work in a similar way.  I don't know about you, but in GUIs that manage complex applications I don't touch for a while, I have to refer back to the help.

    The intriguing thing for me with the new GUIs layered on top of PowerShell is that they provide a way for me to learn PowerShell by working in the GUI.  Exchange was the first to the table with a GUI layered on PowerShell, and V1 of anything almost always leaves room for improvement ("to ship is to choose" is a quote I've heard many times).

    If the Exchange GUI really doesn't do it for you, check out some of the community projects like PowerGUI, which is an MMC-like console layered on PowerShell, and has add-ins for Exchange and many other products.  

    With the Exchange team providing a fully functional command line, they made it possible for others to create scripts to extend functionality. Being scripts, they are easy to share amongst a greater community (not so easy with the MMC).

    Don't dismiss the command line as a management tool, you could be missing out on something to help automate your workload.
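
The discovery steps described in this comment can be sketched like this (the search terms are only illustrative):

```powershell
# Find cmdlets whose names mention "mail"
Get-Command *mail* -CommandType cmdlet

# Discoverability also works by verb or noun
Get-Command -Noun Process
Get-Command -Verb Get

# Then read the help, examples included, for a specific cmdlet
Get-Help Get-Process -Examples
```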
