Automating the world one-liner at a time…
One of our partners asked what the guidance was for a product that releases a cmdlet and then decides it needs to deprecate one or more of the parameters in a future release. We don't currently have guidance on this topic, so I thought I would tell you what we were thinking of saying, and then you can comment on whether that is reasonable or not.
The problem is that a bunch of scripts might have been written to use that parameter, so if the team just drops the parameter in the next release, it will break those scripts.
It's easy to say, "never drop a parameter" but that isn't realistic. Things change and teams need the flexibility to adapt.
Avoid dropping parameters whenever possible, since doing so may break customer scripts.
If you must drop a parameter, do so over the course of a couple of releases: deprecate it in the next release and then drop it in the release after that.
Deprecate the parameter by documenting the deprecation in the help and emitting a warning message whenever the parameter is used.
Validate that all the data binding (pipelining) scenarios continue to work properly after dropping the parameter.
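To make the guidance concrete, here is a minimal sketch of what "emit a warning when the deprecated parameter is used" might look like in script. The function and parameter names (Get-Widget, -Server, -ComputerName) are invented for illustration; a compiled cmdlet would do the equivalent with WriteWarning().

```powershell
# Sketch: a hypothetical Get-Widget whose -Server parameter is being
# deprecated in favor of -ComputerName. All names here are invented.
function Get-Widget {
    param(
        [string]$ComputerName,
        [string]$Server       # deprecated
    )
    if ($PSBoundParameters.ContainsKey('Server')) {
        Write-Warning ("The -Server parameter is deprecated and will be " +
                       "removed in a future release. Use -ComputerName instead.")
        $ComputerName = $Server   # keep old scripts working for now
    }
    "Querying widgets on $ComputerName"
}
```

A call like `Get-Widget -Server foo` still works, but the warning stream tells the script author what to fix before the parameter disappears.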
One possibility is for PowerShell to provide explicit support for this scenario. We could create a DEPRECATED stream, add a DEPRECATED attribute that you could put on your parameters, and then change the engine to look for it and issue a standardized message. In the end, this would consume a lot of pm/dev/test calories for a scenario that I really hope doesn't happen. I'd rather spend those calories on some amazing remoting capabilities, so I'm strongly disinclined to go down that path.
What do you think?
Jeffrey Snover [MSFT]
Windows Management Partner Architect
Visit the Windows PowerShell Team blog at: http://blogs.msdn.com/PowerShell
Visit the Windows PowerShell ScriptCenter at: http://www.microsoft.com/technet/scriptcenter/hubs/msh.mspx
IMHO WriteWarning() is quite sufficient for this. There is no need for a different stream for each thing :)
Is there existing functionality that could be enlisted to help?
Make it easier to change "legacy" scripts.
I'm thinking of a tool that is fed a script and lists uses of the deprecated parameter.
That seems reasonable to me. I think the team has bigger fish to fry. And if this becomes more of a pervasive issue in the future you can always look into the Deprecate attribute/stream approach.
You might also want to mention the scenario where you have found a better name for a parameter. You could spit out the warning on the old parameter name for a release or two before getting rid of it. Or you could just create an alias for the old parameter name on the new parameter.
> Make it easier to change "legacy" scripts.
> I'm thinking of a tool that, is fed a script
> and lists use of the deprecated parameter.
Excellent point. We are working on some stuff that might make this feasible.
> You might also want to mention the scenario
> where you have found a better name for a
> parameter. You could spit out the warning
> on the old parameter name for a release or
> two before getting rid of it. Or you could
> just create an alias for the old parameter
> name on the new parameter.
Good catch. Aliases are the way to deal with this.
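A hedged sketch of the alias approach, using the same invented names as before (in later versions of PowerShell, script functions can declare parameter aliases directly; compiled cmdlets use System.Management.Automation.AliasAttribute):

```powershell
# Sketch: the parameter was renamed from -Server to -ComputerName,
# but the old name keeps working as an alias. Names are invented.
function Get-Widget {
    param(
        [Alias('Server')]     # old name still binds to this parameter
        [string]$ComputerName
    )
    "Querying widgets on $ComputerName"
}

# Both invocations bind to $ComputerName:
#   Get-Widget -ComputerName foo
#   Get-Widget -Server foo
```

With an alias, old scripts keep running unchanged, so nothing breaks and no warning noise is generated.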
Jeffrey Snover [MSFT]
Windows Management Partner Architect
Visit the Windows PowerShell Team blog at: http://blogs.msdn.com/PowerShell
Visit the Windows PowerShell ScriptCenter at: http://www.microsoft.com/technet/scriptcenter/hubs/msh.mspx
If a parameter has to be dropped then I think the proposed guidance is a more than adequate way of handling it. SQL Server and .NET, for instance, just document that the parameter/command has been deprecated.
I know that a lot of people don't read the documentation but that is no excuse.
Please put development effort into new functionality - especially remoting - rather than creating an unnecessary safety net.
I'm not sure about the warning message. I'm new to PowerShell and pretty new to scripting in general, so maybe I should just shut up, but wouldn't a warning message really mess up output? I can imagine not everyone having time to immediately fix the scripts that generate these warnings as one upgrades to some SP level with an auto-upgrade of PS to whatever current version is then available. (Yes, you should test and so on, but in some environments that might not be all too feasible.)
> I'm not sure about the warning message.
I'd like more feedback on this specific issue. Richard points out that if you put it in the documentation, few people read it. So if that is the case, there is really no difference between having one release where it is supported but declared deprecated and just dropping it. People's scripts are going to break one day and they'll be surprised.
It was with that thought that I decided we needed to do better and leverage the fact that the team is giving users one release to get all their scripts converted before they break. We may be able to provide tools to analyze scripts and warn people about many of the issues, but there are a number of problems with this:
1) how will people know to run the tool?
2) They could run the tool and fix their scripts, but the next day get a new script which uses the deprecated switch.
3) The tool would only catch a subset of cases (because it would be a static analysis tool).
Writing a message to the Warning stream addresses this at the cost (as you point out) of generating potentially annoying output.
So is it better for teams to notify and annoy or document and break? Or is there a better solution?
Hmm, this is a tough one alright. I tend to fall towards the documentation side of this fence. Giving people the ability to mark parameters (or even cmdlets) as deprecated seems nice in principle, but would probably not end up being very useful in practice as people forget to use them or figure they won't need them. This also seems like only half the solution to a wider problem: what about providing a mechanism for notifying of behavioural changes too, aka "breaking changes"? Additionally, people would end up eschewing documentation in favour of a single attribute that may or may not have a line of text explaining the alternative.
Not quite in the same ball park, but look what happened when we let people mark ActiveX controls as "safe for scripting", for example. Ouch.
anyway, my 2 cents
I don't think a dedicated Deprecate pipeline is necessary; that's precisely what Warning is for as you suggest. I think the only piece of guidance I'd suggest is to NOT deprecate a parameter unless you absolutely MUST. That is, don't get rid of it just because doing so would be "elegant" and make your developers feel warm and fuzzy; get rid of it only if keeping it will compromise the cmdlet's operation in some fashion (e.g., totally conflict with another, new parameter). If possible, maintain a second "signature" for the cmdlet that continues to support the old parameter.
I think $WarningPreference would prevent anything being written to the Warning pipeline from becoming annoying. The big argument would be for someone who is using the Warning pipeline for their own script output; perhaps the Debug pipeline would be a better place to write deprecation remarks. Remember, the first version of a cmdlet to eliminate the functionality should STILL WRITE A DEPRECATION REMARK - that way, someone's script STOPS WORKING and they can still see WHY. Of course, the Debug pipeline is SilentlyContinue by default, which means you'd have to think to turn it on.
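To illustrate that point: warnings can be silenced globally via the preference variable, so a deprecation remark need not pollute captured output. (The per-call -WarningAction common parameter arrived in PowerShell v2; the cmdlet and parameter names below are the same invented ones used earlier.)

```powershell
# Suppress all warnings for the session:
$WarningPreference = 'SilentlyContinue'
Get-Widget -Server foo            # hypothetical deprecated usage; warning is suppressed

# Or, in PowerShell v2 and later, suppress per invocation:
Get-Widget -Server foo -WarningAction SilentlyContinue
```

Because the warning goes to its own stream, it also never contaminates what gets piped to the next command.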
However, there SHOULD be an attribute which can be coded into the cmdlet to indicate a deprecated parameter; this would permit an analysis tool to detect it. When your script stops working, that would be a cue to run a static analysis tool.
Additionally, if the cmdlet can somehow internally encode, in a detectable way, deprecated cmdlets, then editing environments like PrimalScript could statically detect that and issue a design-time syntax alert a la Visual Studio. That would be a BIG help to scripters, and I'd be a huge advocate of some detectable "deprecated" attribute.
I'm a fan of notify and annoy. After all, there's an easy fix to the "annoy" part: Change your script, which you're going to have to do ANYWAY at some point, no? Fix it and the annoyance goes away.
Help files play an important role: The parameter must not only be listed as deprecated, but examples of alternatives should be provided (as in Visual Studio if you use System.Web.Mail - it advises you to use System.Net.Mail instead).
BTW, all of this should apply to entire cmdlets, too, in case they are being deprecated from a snap-in.
When running a script from the PS console, analyzing the script, sending out warnings, and then actually executing the script would work nicely - no chance of messing up the output of a command (or messing up what gets piped through to the next command). This would, however, not solve the problem when running a script as a scheduled command. The only thing I can think of to solve that would be to analyze the script, log warnings to the Application event log, and then execute the script. If administrators never check the event logs between two cmdlet releases, well...
I was on the RTFM side of the fence until this particular topic bit me (although not from PowerShell), and now I'm thinking a deprecated pipe and a required warning flag is a good idea. It took me hours to find out that someone else had upgraded a package I was using in a script so that it no longer understood certain parameters I was using. A deprecation warning would have saved me a lot of hair. What I would suggest is that the deprecation pipe should write a console warning that simply says "Deprecated parameters used - see event log for details". It only needs to do this once; then every explicit use of a deprecated parameter can be fully documented in the event log. This lets me keep my output clean (worst case is that you have to pipe your output to a remove-deprecations cmdlet) and lets me know that there is something I need to take a look at in the event log for what's about to break in revision x+1. We need a separate pipeline because debug is off by default, and this needs to be on by default. PowerShell's strength is that things are discoverable, and a separate deprecated pipe allows things removed from a version to be easily discoverable. The talk about analysis tools and documentation is fine, but when I'm writing a PowerShell script I usually do it interactively at the command line to see what errors I get. I then edit the transcript and save it off as a script.
I agree with the original plan. I think that a warning message should be displayed before running a script regarding its deprecated parameters. The tool to identify the deprecated parameters is a great idea.
I know that I read the "what's new in this release" documentation, and if I have a plethora of running scripts and know that deprecation of parameters is a normal process, then I would be interested in it and would keep an eye open for these. Finding them would be difficult, which is where the tool becomes helpful.
What would be great is to use gci | get-content | insert tool here to locate all scripts that have these parameters.
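One possible shape for that pipeline, using Select-String as the "tool" and scanning for the same invented deprecated parameter name used in the earlier sketches (a static text scan like this would only catch literal uses, not parameters built up dynamically):

```powershell
# Scan every .ps1 under the current directory for a hypothetical
# deprecated -Server parameter and report where it appears.
Get-ChildItem -Recurse -Filter *.ps1 |
    Select-String -Pattern '-Server\b' |
    Select-Object Path, LineNumber, Line
```

Each match reports the script path, line number, and offending line, which is most of what an administrator needs to plan the cleanup.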