NetApp PowerShell Toolkit 101: Managing Data Access

Over the last several posts we have reviewed how to create and manage aggregates, SVMs, and volumes. All of that is great, but at this point you still can’t access that capacity to begin storing things. In this post we will discuss the various ways to access the volumes and the data inside them.

  • Junctioning
  • Export Policies
  • NFS Exports
  • CIFS/SMB Shares
  • LUNs
    • LUN Management
    • iGroups
    • LUN Mapping
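
As a preview, here is a minimal sketch of a few of these tasks. It assumes an existing connection (Connect-NcController), an SVM named svm1, and a volume named vol1; the names are placeholders and parameter names may vary slightly between toolkit versions, so check Get-Help for each cmdlet.

    # Junction the volume into the SVM namespace so clients can reach it
    Mount-NcVol -Name vol1 -JunctionPath /vol1 -VserverContext svm1

    # Share it out over CIFS/SMB
    Add-NcCifsShare -Name vol1 -Path /vol1 -VserverContext svm1

    # Carve a LUN, create an igroup, and map the LUN to it
    New-NcLun -Path /vol/vol1/lun1 -Size 10g -OsType windows -VserverContext svm1
    New-NcIgroup -Name ig1 -Protocol iscsi -Type windows -VserverContext svm1
    Add-NcLunMap -Path /vol/vol1/lun1 -InitiatorGroup ig1 -VserverContext svm1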

Read more: NetApp PowerShell Toolkit 101: Managing Data Access

NetApp PowerShell Toolkit 101: Managing Volumes

Volumes are the containers of data in a NetApp storage system. They are “stored” on aggregates, accessed via Storage Virtual Machines, and are the point-of-application for many of the features of Data ONTAP. Let’s look at what we can do with volumes leveraging the PowerShell Toolkit:

  • Creating, Deleting, and Re-sizing Volumes
  • Volume Features
    • Thin Provisioning
    • Deduplication
    • Compression
    • AutoGrow / AutoShrink
    • Fractional Reserve
    • Quality of Service
  • Volume Options
  • Snapshots
  • FlexClones
  • Volume Move
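
As a taste of what is covered, here is a minimal sketch assuming an existing connection, an SVM named svm1, and an aggregate named aggr1 (all placeholders); parameter names may differ slightly by toolkit version.

    # Create a 100 GB volume and junction it at /vol1
    New-NcVol -Name vol1 -Aggregate aggr1 -Size 100g -JunctionPath /vol1 -VserverContext svm1

    # Grow it to 200 GB
    Set-NcVolSize -Name vol1 -NewSize 200g -VserverContext svm1

    # Take a manual snapshot before making changes
    New-NcSnapshot -Volume vol1 -Snapshot pre-change -VserverContext svm1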

Read more: NetApp PowerShell Toolkit 101: Managing Volumes

NetApp PowerShell Toolkit 101: Storage Virtual Machine Configuration

Storage Virtual Machines (SVMs) are the entities in clustered Data ONTAP that the storage consumer actually interacts with. As the name implies, they are virtual entities; however, they are not virtual machines in the traditional sense, as there are no CPU, RAM, or cache assignments to be made. Instead, we assign storage resources to the SVM, such as aggregates and data LIF(s), which the SVM then uses to provision FlexVols and make them available via the desired protocols.

In this post we will look at how to configure an SVM using PowerShell.

  • Create an SVM
  • Aggregate Access
  • SVM DNS Service
  • Configuring Data LIF(s)
  • Configuring Protocols
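
For a quick preview, here is a minimal sketch; the SVM, aggregate, node, port, and address values are placeholders, and parameter names may vary slightly by toolkit version.

    # Create the SVM with a root volume on aggr1
    New-NcVserver -Name svm1 -RootVolume svm1_root -RootVolumeAggregate aggr1 -RootVolumeSecurityStyle unix -NameServerSwitch file

    # Add a data LIF for NFS/CIFS access
    New-NcNetInterface -Name svm1_data1 -Vserver svm1 -Role data -Node node1 -Port e0c -Address 192.168.1.50 -Netmask 255.255.255.0 -DataProtocols nfs,cifs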

Read more: NetApp PowerShell Toolkit 101: Storage Virtual Machine Configuration

NetApp PowerShell Toolkit 101: Node Configuration

In the last post we looked at some settings that apply to the cluster. This time, let’s look at how to administer nodes.

In this post we will cover using the NetApp PowerShell Toolkit to manage these aspects of nodes:

  • Network Port Configuration
  • Node Management LIFs
  • Service Processor
  • CDP
  • Aggregates
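
A small sketch of the kind of administration covered here; the node, port, and aggregate names are placeholders, and it is worth checking Get-Help for the exact parameters in your toolkit version.

    # Review the physical network ports on a node
    Get-NcNetPort -Node node1

    # Create a new aggregate on that node
    New-NcAggr -Name n1_aggr1 -Node node1 -DiskCount 24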

Read more: NetApp PowerShell Toolkit 101: Node Configuration

NetApp PowerShell Toolkit 101: Cluster Configuration

Using the NetApp PowerShell Toolkit (NPTK) can sometimes be a daunting task. Fortunately, configuring most aspects of your storage system with it is fairly intuitive. Let’s start by looking at some of the cluster-level configuration items that can be managed using the NPTK.

In this post we will cover:

  • AutoSupport
  • Licenses
  • Cluster Management LIF(s)
  • Inter-Cluster LIF(s)
  • SNMP
  • DNS
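
A minimal sketch of a couple of these tasks; the license code is a placeholder, and the Role property filter is illustrative, so adjust it to the values returned in your environment.

    # Review installed licenses and add a new one
    Get-NcLicense
    Add-NcLicense -License XXXXXXXXXXXXXXXXXXXXXXXXXXXX

    # Review the cluster management and intercluster LIFs
    Get-NcNetInterface | Where-Object { $_.Role -match 'cluster_mgmt|intercluster' }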

Read more: NetApp PowerShell Toolkit 101: Cluster Configuration

NetApp PowerShell Toolkit 101: Getting Started

The NetApp PowerShell Toolkit (NPTK) is a great way to get started administering your NetApp resources, both 7-mode and clustered Data ONTAP (cDOT), in a more efficient and scalable manner.

Getting the Toolkit

The download (version 3.2 at the time of this writing) is available from the NetApp Communities in the Microsoft Cloud and Virtualization board.

The download page also links to two great resources: the Getting Started presentation and Making the Most of the NetApp PowerShell Toolkit. Both are excellent reads if you want some starting hints.

Getting Help

  • The NetApp Communities: The communities are a great place to get help quickly for any question you might have. I recommend using the Microsoft Cloud and Virtualization Discussions board, though questions occasionally show up on the SDK and API board as well. You can also send me a message using the NetApp Communities: my username is asulliva, and I’m happy to respond to questions directly through the Communities messaging system.
  • From the NPTK itself: One of the lesser-known features of the Toolkit is that it has help built in. Yes, you can use the standard Get-Help cmdlet, but there’s a hidden treasure: Show-NcHelp. This cmdlet generates an HTML version of the cmdlet help and opens your default browser to display it.

    (Screenshot: Show-NcHelp output displayed in the browser)

    From here you can dig through the cmdlets and view all of the information you want to know about them quickly and easily.
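
For example, assuming the toolkit (the DataONTAP module) is already loaded, both forms of help are available directly from the shell.

    # Standard PowerShell help for a single cmdlet
    Get-Help Get-NcVol -Full

    # Generate HTML help for the entire toolkit and open it in the default browser
    Show-NcHelp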

A Few Basics To Get Started

Now that you have downloaded and installed the toolkit, it’s time to use it. Let’s look at a couple of basic tasks.

Note: I will be using the cDOT cmdlets, however nearly all of the commands have an equivalent available for 7-mode.

Connecting to a controller
Connecting to your cluster is extremely easy. You can specify the cluster management IP address or any of the node management IPs. If you do not provide credentials as part of the command invocation, you will be prompted for them.
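
A minimal example, assuming a cluster management LIF reachable at cluster1.example.com (substitute your own hostname or IP):

    # Import the toolkit and connect to the cluster management LIF.
    # If -Credential is omitted, Connect-NcController prompts for credentials.
    Import-Module DataONTAP
    Connect-NcController cluster1.example.com -Credential (Get-Credential)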

Getting Information
Now that we’re connected to the cluster, let’s take a look at some of the information that can be gathered:
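
A few read-only cmdlets make a good starting point. The cmdlet names below are real toolkit cmdlets, but the property selections are only illustrative, so adjust them to the properties returned in your environment.

    # Cluster nodes
    Get-NcNode

    # Aggregates, with capacity information
    Get-NcAggr | Select-Object Name, TotalSize, Available, Used

    # Storage Virtual Machines and their volumes
    Get-NcVserver
    Get-NcVol | Select-Object Name, Vserver, Aggregate, TotalSize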

Onward to Automation

There are a number of “PowerShell Toolkit 101” posts that introduce some of the possibilities. Be sure to read through the other posts in this series.

This doesn’t even begin to scratch the surface of the NetApp PowerShell Toolkit. Anything that can be done from the command line can be done using the toolkit. If you’re interested in seeing specific examples, need help, or just have questions, please let me know in the comments!

Using REST + WFA Finders to Create Dynamic vCO Workflows

In the last post we covered how to create Filters and Finders in WFA so that we could access WFA data through a RESTful interface. This creates a nice separation between the two systems and decouples the dependency on the WFA database for dynamically populating data in vRealize Orchestrator workflows.

Let’s look at how to take the result of the last post, query the data from vRO, and incorporate it into vRO workflows.

I’m going to be using the same workflow as before, “Create a Clustered Data ONTAP Volume”, so we will once again need four inputs:

  • Cluster Name – A string with valid values being clustered Data ONTAP systems configured in WFA
  • Storage Virtual Machine Name – A string with valid values being SVMs belonging to the cluster selected above.
  • Volume Name – A string provided by the user.
  • Volume Size (in GB) – A number provided by the user.

To get started, we are going to create vRO actions that execute REST operations against the WFA filters/finders to return the same data we previously retrieved with direct SQL queries. These actions are then used from the workflow presentation.

Read more: Using REST + WFA Finders to Create Dynamic vCO Workflows

Getting data from NetApp Workflow Automation using Finders and REST

Using the database to get information from Workflow Automation (WFA) and create dynamic vCenter Orchestrator (vCO) workflows is one way to add dynamic data fields to those workflows. However, it just feels dirty. It’s a “backdoor,” if you will, and not very scalable or supportable. Imagine if the WFA database schema changes: you will be responsible for changing all of the SQL queries in the vCO workflows, which may break in non-obvious ways.

A much more robust method is to abstract those queries (keeping them in WFA) and then use REST to retrieve the data. WFA provides two mechanisms, filters and finders, for selecting and returning data from its database internally. We can access these through the REST interface and then parse the results from XML into a more vCO-friendly format.

What is a filter?

A filter is simply a SQL select statement that has been validated to return certain fields (the natural keys at a minimum).

What is a finder?

A finder is one or more filters.

Putting them to work

Both of these constructs use SQL to query the WFA cache database (which is periodically updated from data sources such as OnCommand Unified Manager); however, a finder does not contain SQL directly, only the filters it references do.
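
As a quick way to see what a finder returns before wiring it into vRO, the same REST call can be exercised from PowerShell. The URI below is illustrative only; check your WFA version’s REST documentation for the exact resource paths, and note that the response comes back as XML.

    # List the finders defined in WFA over REST (illustrative URI; adjust the
    # hostname and path to match your WFA server and version).
    $cred = Get-Credential    # WFA credentials permitted to use the REST API
    $uri  = 'https://wfa.example.com/rest/finders'
    $finders = Invoke-RestMethod -Uri $uri -Credential $cred
    $finders    # XML; drill into it to find the finder you want to test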

Read more: Getting data from NetApp Workflow Automation using Finders and REST

Installing Modules to WFA Perl

NetApp’s Workflow Automation (WFA) supports two languages out of the box: PowerShell and Perl. Adding modules to the Perl installation is non-obvious because the install does not include ActiveState’s PPM package manager.

However, the PPM command line utility is included. Here is how to use it to manage the packages on your system.

First, you will need to open a command prompt with elevated privileges: click the Start button, find “cmd”, right-click it, and select “Run as Administrator”.
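
From that elevated prompt (or an elevated PowerShell session, as shown below) the workflow looks roughly like this; the path to ppm.bat is an assumption, so adjust it to wherever WFA placed its bundled Perl, and substitute the module you actually need.

    # Path to the ppm command-line utility bundled with WFA's Perl.
    # This location is an assumption; adjust it to your WFA installation.
    $ppm = 'C:\Program Files\NetApp\WFA\perl\bin\ppm.bat'

    & $ppm search Try-Tiny     # search the configured repository for a package
    & $ppm install Try-Tiny    # install the package into WFA's Perl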

Using the vCO-to-WFA database connection

Previously we connected vCenter Orchestrator (vCO) to the NetApp Workflow Automation (WFA) database in order to perform queries against it. By itself that connection isn’t terribly useful; its value comes when we want to provide dynamically populated information to vCO workflows that execute WFA workflows using REST.

The crux of the problem we are trying to address is that when executing a WFA workflow via REST we are not able to pre-determine valid values for inputs like the cluster or storage virtual machine names. The administrator(s) can provide static values, but this is only helpful for a subset of inputs (how frequently do you add a new cluster?). For volumes, which may be added and removed frequently, this would quickly become a burdensome task.

One answer (there are others, which I will post about in the future) is to use the database. When a WFA workflow is executed natively (i.e., from WFA) it uses query-based fields to determine those inputs. The data is populated by crafting SQL commands to pull it from WFA’s cache database.

Read more: Using the vCO-to-WFA database connection