Last week I had the pleasure of presenting to the Atlanta PowerShell User Group. The presentation was delivered over Live Meeting; if you missed it, the recording can be found below. The scripts used in the presentation can also be downloaded here. Note: we had a little problem getting the audio working, so fast-forward to the 2:00 minute mark.
My NetApp fanboyism is fairly well established at this point. I devoted almost three years to developing a PowerShell module that would teach NetApp how PowerShell worked. To my delight, a year ago at TechEd they took the storage world by surprise when they shipped an official module. I was then humbled to learn that they had found PoshONTAP and used it as a blueprint when developing the DataONTAP PowerShell Toolkit. This experience taught me a valuable lesson: if you believe in something, just do it; life has a funny way of sorting it all out. I took this philosophy and started anew, and my career skyrocketed. The affirmation of all that work came when NetApp approached me six months ago. At first I didn't believe it was possible, but after many grueling interviews I started to realize I could do this job. Fast forward two months and I'm all settled in at NetApp, and LOVING it! This place is the Google of the storage/IT world. Everywhere you look there is another brilliant engineer, but something was still missing. The PowerShell Toolkit had shipped and was doing fantastically, but PowerShell still wasn't at the forefront for the developers writing tools for Windows administrators. I thought about what I could do; again, if you believe in something, just do it. Enter PowerGUI.
A while back Kirk had approached me about writing a PowerPack for NetApp, and I always thought it was a great idea. So I learned, and learned fast. It turns out a PowerPack is fairly simple to author. About a week later I had a working demo to show around internally. I had my position papers worked out and my elevator pitch ready. I was going to hit the streets and sell PowerShell using PowerGUI as my catalyst. Then something amazing happened: everyone I showed PowerGUI to loved it! Furthermore, they instantly saw the advantage of building a tool on top of PowerShell. I didn't have to sell anything; the tool simply sold itself. At MMS this year, if you stopped by during a lull you would have gotten an early alpha demo of the PowerPack. I gave my boss said demo, and his guidance was clear: SHIP IT! Over the past month, with the help of the whole team, we ran the gauntlet from legal to marketing, but in the end my secret project shipped yesterday. What started out as a sales tool ended up being such a compelling user interface that we just couldn't keep it to ourselves.
If you're a PowerGUI user or just a NetApp customer, check out PowerGUI and the PowerPack, as I think it is quite compelling.
Update: There is a known bug in PowerGUI where the first two objects returned don't display alias properties; since the DataONTAP Toolkit uses aliases heavily, many objects appear to be blank. Kirk's team knows about the issue and is working to fix it.
The 2011 Scripting Games are almost upon us, and I for one am very excited. There will be some modifications to the games this year. For one, this will be the first year that the games will be in only one language. This was done for several reasons, but they all boil down to one: there is simply no reason not to be using PowerShell at this point! As a reformed VBScripter, I for one welcome the change.
Another modification is that you will only be allowed to enter one category. There was some heartache last year because I participated in every event. This raised the bar so that anyone who wanted to win would have had to produce 20 scripts in 14 days. The other side of the coin was that I was unfairly entering scripts that a true beginner shouldn't have to compete against. Both are valid points, and in hindsight perhaps entering both categories was a poor choice on my behalf. That said, I welcome participating in only a single category, as delivering two high-quality scripts per day for two weeks was a daunting task, and I was burned out after those two weeks.
Another potentially game-changing modification is that the submissions may not be visible until after the deadline for a given challenge. Again, I welcome this change, as last year's games had only one or two original solutions per event. By making the submissions private until after the deadline, every solution will be original. This change by its very nature will enable clever and unique approaches to a problem. I don't think cheating was a problem last year, but if you saw a script that used a rare .NET object, and that script got five stars, doesn't that influence you to use the same rare .NET object yourself? Of course it does; it's only human nature to want to deliver the best solution possible.
At the end of the day, whether you're new to scripting or a seasoned pro, I highly encourage you to participate in the games this year. I for one will attempt to defend my title, but there will be stiff competition, and I will be shocked if I can pull off back-to-back championships. Oh yeah, and did I mention first prize will land you a free pass to TechEd Atlanta?
Recently I was asked to do a guest post on the illustrious "Hey, Scripting Guy!" blog. I was very happy and excited about this opportunity, but didn't know what to write. I kicked it around for a couple of days, and still nothing. So I called an old friend and asked him what his pain points were. He informed me that his shop was now operating under a 24x7 operations cycle, and it was impossible to keep his 3000+ workstations patched. With that I had my topic; having worked in these environments before, I was acutely aware of the strain they put on the sysadmin to stay compliant. A couple of days later I had compiled all the best techniques for detecting whether a user was active on his or her PC, and shipped my draft off to the Scripting Guy. This morning Google Reader greeted me with my very own post!
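For the curious, one of the simplest of those techniques is just asking WMI who owns the console session. A minimal sketch (the computer name PC01 is a placeholder, and this only catches the interactive console logon, not RDP sessions):

```powershell
# Ask WMI who, if anyone, is logged on to the console of a remote PC
# (PC01 is a placeholder computer name)
$user = (Get-WmiObject -Class Win32_ComputerSystem -ComputerName PC01).UserName

if ($user) {
    "Console user: $user"            # someone is active; defer the reboot
} else {
    "No interactive user logged on"  # likely safe to patch
}
```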
I’ve had the pleasure of spending the last several days talking to the development team here at NetApp about the DataONTAP PowerShell Toolkit. As a result we’ve all learned a lot. One of the more interesting features they brought to my attention was the credential management solution included with the toolkit. I found this very compelling; you see, embedding credentials within a script is as old as scripting itself. There was a time not too long ago when it was considered taboo. However, with PowerShell came access to the .NET System.Security.Cryptography encryption/decryption methods. Scripters have unknowingly been accessing said methods indirectly whenever they used the credential management functions that Hal and BSonPosh wrote long ago.
Which brings me to the DataONTAP Toolkit. The development team has stepped it up a notch and included a full credential management solution with the latest version of the toolkit. The way it works: first you save logon information by using the Add-NaCredential cmdlet, which stores the credentials for a given NetApp controller on the local machine. Then, the next time you run the Connect-NaController cmdlet, the credentials you previously saved will be used. So how do we use this new feature, and why do you care?
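To make that concrete, here's a minimal sketch. The controller name is made up, and you should check Get-Help Add-NaCredential in your version of the toolkit for the exact parameter set:

```powershell
# One-time setup: cache an encrypted credential for this controller on the local machine
# ("toaster01" is a placeholder controller name)
Add-NaCredential -Controller toaster01 -Credential (Get-Credential)

# From now on, connecting no longer prompts; the cached credential is used automatically
Connect-NaController toaster01
```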
While driving up to RTP today I was listening to PowerScripting Podcast episode 140. Hal and Jonathan received a question about execution policy settings. The conclusion they reached was that RemoteSigned was a good compromise. I would like to expand on this a bit. The real fear with script execution is that you'll unintentionally run code with malicious intent buried within it. Personally, I don't run around executing code before I review and test it, but that doesn't mean I'm safe. As PowerShell MVP and trainer extraordinaire Don Jones has previously stated, the real risk is with your profile.
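If you want to check or change your own setting, it's a one-liner each way (run the Set from an elevated prompt):

```powershell
# See what policy you're currently running under
Get-ExecutionPolicy

# RemoteSigned: locally authored scripts run as-is,
# while scripts downloaded from the Internet must carry a valid signature
Set-ExecutionPolicy RemoteSigned
```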
The VMware vSphere PowerCLI Reference officially goes on sale April 12th, and if you're a vSphere administrator this one is a must-have. We gathered the collective automation experience of four vExperts and an MVP, and then filtered it through a fifth vExpert. The end result is a collection of polished solutions that are not only technically proficient but, more importantly, will work the first time, every time! Having all built and maintained large infrastructures, we combined our collective knowledge and wrote a complete reference, covering the entire life cycle from creating your environment to monitoring it long term. I mean it: we left no stone unturned, and where VMware had no PowerCLI solution, we wrote our own. This book includes a PowerShell module with 79 custom advanced functions! We also considered how you would use our book and chose a task-driven approach, which enables you to just flip to the answer you need. You don't have to "read" all 780 pages; instead, think of this book as a fire extinguisher for your virtual infrastructure.
So that’s great, but who needs a book? With blogs and the community forums you can just find the answer, right? Yes, yes you can, eventually, but if you do that you’re shopping at a fishery. In this book we WILL teach you to fish! As a commitment to that end, we created a dedicated website to support our work. If you find a bug, or if we missed something, you can post a question in our forum!
Admittedly, this is my first published work, but I couldn’t be prouder of it. In my opinion, the Onyx and vSphere SDK chapters alone are worth the cover price. I look forward to the release, and to your honest opinions of our work!
Within the PowerShell community there has been a lingering debate over modules versus providers. Initially everyone seemed compelled to do both. Personally, I’ve never been very impressed by third-party providers, mainly because they always felt like a gimmick. They forced the file-system analogy, and the results were not very good: they were buggy, slow, and didn’t support the standard provider hooks. This led many vendors to never bother, focusing instead on cmdlets. I myself had come to the conclusion that providers were something for the PowerShell team, and third-party ISVs should just leave them alone. Fortunately, the provider in version 1.3 of the DataONTAP PowerShell Toolkit has broken the mold and renewed my faith in providers!
I’m currently winding down on a datacenter build that has consumed me for the better part of six months. Last night our team went through and stood up vSphere on 200+ hosts. I know that’s nothing for you cloud providers, but that’s a lot of servers for the average IT shop. Being a lights-out datacenter, we have three management paths to every server: IP KVM, ILOM, and serial ports. Going through and setting all that up by hand would have been a pain in the butt, so I did a little searching and found how to configure the Sun ILOM via the serial port. With that document and a little experimentation I quickly had my script; now all that was left was to learn how to script via a COM port. I turned to Bing and found this article, which pointed me to a new-to-me .NET class. About four hours later I had a complete solution, and yet another example of the Admin Development Model.
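The .NET class in question is System.IO.Ports.SerialPort. A stripped-down sketch of the idea follows; the port settings and the ILOM command shown are assumptions, so check your hardware's serial documentation before trusting any of it:

```powershell
# Open COM1 at 9600,N,8,1 and talk to whatever device is on the other end
$port = New-Object System.IO.Ports.SerialPort COM1, 9600, None, 8, One
$port.Open()

$port.WriteLine("show /SP/network")   # example ILOM command; yours may differ
Start-Sleep -Seconds 2                # give the device a moment to respond

$port.ReadExisting()                  # dump whatever the device sent back
$port.Close()
```

Wrap the WriteLine/ReadExisting pair in a loop and you have the skeleton of a full configuration script.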
First let me say, I love VSC; it took all of the complexity out of using NetApp storage in a vSphere environment. I have been tolerating one annoyance for quite some time now, and this morning said annoyance broke VSC at a customer site. What’s wrong with VSC? Well, for some reason it forces you to register the plug-in with vCenter using an IP address. At this customer, an over-restrictive proxy configuration meant that only fully qualified domain names (FQDNs) worked; any IP address was redirected to a web page explaining said over-restrictive policy. Because VSC is mainly a web page, the use of an IP address broke everything. I searched around a little and found William Lam’s post on removing plug-ins with the MOB. Once I found the pivot point for plug-ins, I searched the API Reference and found the ExtensionManager object. With the object in hand, I fired up PowerCLI and in less than ten minutes figured out how to manually adjust the URL VSC used. It was so easy that I think I’m going to try to slap together a quick module to manage plug-ins via PowerCLI, but in the meantime, if you, like me, have been frustrated by VSC’s use of an IP address, try this.
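The gist of the fix looks something like the following sketch. The extension key, IP address, and FQDN are all assumptions for illustration, so inventory your extensions first to find the real values:

```powershell
# Assumes Connect-VIServer has already been run against your vCenter
$extMgr = Get-View ExtensionManager

# Inventory the registered plug-ins to find VSC's key and its current URL
$extMgr.ExtensionList | Select-Object Key, @{N='Url';E={$_.Server[0].Url}}

# Grab the extension, rewrite the IP to an FQDN, and push the change back
$ext = $extMgr.FindExtension("com.netapp.nvpf")   # key is an assumption; use yours
$ext.Server | ForEach-Object {
    $_.Url = $_.Url -replace '10\.10\.10\.50', 'vsc.example.com'
}
$extMgr.UpdateExtension($ext)
```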