Executing WFA workflows from vCenter Orchestrator using REST

In the previous post I showed how to execute a NetApp OnCommand Workflow Automation (WFA) workflow using the REST cmdlets available in PowerShell version 4. However, any language or platform can be used for execution via REST, including VMware’s vCenter Orchestrator (vCO).

For the simplest integration, we can add the WFA host as a REST host in vCO. When the REST plug-in is added to your vCO instance, it includes some helper workflows for managing the connected hosts. Let’s start by executing the “Add a REST host” vCO workflow (located at Library->HTTP-REST->Configuration).

The first screen prompts for a friendly name for the host and its URL. Be sure to append /rest to the hostname/IP address when you provide the URL.


If you need to configure proxy settings, do so on the next screen. For WFA, we will want to use “Basic” authentication; select this option on the third screen, which enables a fourth screen for providing credentials.


We will want to use the “Shared Session” mode (which means that the same set of credentials will always be used), then provide your WFA username and password. When finished, click “Submit”.


Assuming everything works, you should see the completed execution of the workflow, and browsing to the Inventory tab, then expanding the HTTP-REST option will show the newly created host.



What can we do with this now? Well, we can use the REST host object inside of a vCO script to replicate what we did using PowerShell. Because the REST host properties are stored in the vCO object, the password does not have to be exposed, and we can now programmatically access the host when needed.

Let’s get started by creating a new vCO workflow. For convenience, I’ve chosen to name it the same as the WFA workflow.


Once it’s created, add a single attribute. The type should be REST:RESTHost and the value will be the REST host created above (which I have called “WFA”).


Now we need to add inputs to the vCO workflow. These will be completed by the user and passed to the WFA workflow. I chose to name the vCO inputs the same as the WFA inputs, but that is optional at this stage. For this example, I have three string inputs (ClusterName, VserverName, and VolumeName) and one number input (VolumeSizeInGB).
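For context on where these inputs end up: WFA’s REST API expects user inputs as userInputEntry elements inside a workflowInput XML document. Here is a minimal plain-JavaScript sketch of building that payload — the helper name, the escaping function, and the sample values are mine, and you should verify the element names against your WFA version’s API documentation:

```javascript
// Build the XML body that WFA's "execute workflow" call expects.
// Each key/value pair becomes a <userInputEntry> element.
function buildWorkflowInput(inputs) {
    // Escape characters that are significant in XML attribute values.
    function esc(s) {
        return String(s)
            .replace(/&/g, "&amp;")
            .replace(/</g, "&lt;")
            .replace(/>/g, "&gt;")
            .replace(/"/g, "&quot;");
    }
    var entries = "";
    for (var key in inputs) {
        entries += '<userInputEntry key="' + esc(key) +
                   '" value="' + esc(inputs[key]) + '"/>';
    }
    return "<workflowInput><userInputValues>" + entries +
           "</userInputValues></workflowInput>";
}

// Example with the four inputs defined above (values are illustrative):
var body = buildWorkflowInput({
    ClusterName: "cluster1",
    VserverName: "svm1",
    VolumeName: "vol1",
    VolumeSizeInGB: 10
});
```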


Change to the schema view, where we only need to add a single element: a scriptable task.


Edit the scriptable task and select the “Visual Binding” tab. Drag-and-drop each of the four inputs and the single attribute so that they are inputs to the scriptable task. It should look something like the screenshot below.


Switch to the script tab for the task, and copy/paste the script below:

Note that the JavaScript here is very similar to the PowerShell from the last post. At the core, we are performing three REST operations against WFA:

  1. Query WFA for the workflow’s UUID using a GET operation
  2. Execute the workflow by issuing a POST to the workflow-specific URL, passing the parameters as XML
  3. Monitor the status of the execution for failure or completion by issuing further GET operations
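The three steps above can be sketched roughly as follows. This is a sketch rather than a drop-in script: it only runs inside a vCO scriptable task, it assumes the REST:RESTHost attribute is bound into the task as restHost and the four inputs under the names shown, and the workflow name, URL paths, and XML element names should all be checked against your WFA version’s REST API documentation.

```javascript
// Runs inside a vCO scriptable task (Rhino), not in a browser or Node.
// "restHost" is the REST:RESTHost attribute; ClusterName, VserverName,
// VolumeName, and VolumeSizeInGB are the bound workflow inputs.

// 1. Query WFA for the workflow UUID with a GET operation.
var wfName = "Create a Clustered Data ONTAP Volume";  // assumed name
var lookup = restHost.createRequest("GET",
    "/workflows?name=" + encodeURIComponent(wfName), null).execute();
if (lookup.statusCode != 200) {
    throw "Workflow lookup failed: HTTP " + lookup.statusCode;
}
// A regex keeps this independent of E4X/namespace handling; parse the
// XML properly if you prefer.
var uuidMatch = lookup.contentAsString.match(/uuid="([^"]+)"/);
if (!uuidMatch) {
    throw "Workflow not found: " + wfName;
}
var uuid = uuidMatch[1];
System.debug("Workflow UUID: " + uuid);

// 2. Execute the workflow by POSTing the parameters as XML.
var body = "<workflowInput><userInputValues>" +
    '<userInputEntry key="ClusterName" value="' + ClusterName + '"/>' +
    '<userInputEntry key="VserverName" value="' + VserverName + '"/>' +
    '<userInputEntry key="VolumeName" value="' + VolumeName + '"/>' +
    '<userInputEntry key="VolumeSizeInGB" value="' + VolumeSizeInGB + '"/>' +
    "</userInputValues></workflowInput>";
var execRequest = restHost.createRequest("POST",
    "/workflows/" + uuid + "/jobs", body);
execRequest.contentType = "application/xml";
var execResponse = execRequest.execute();
var jobMatch = execResponse.contentAsString.match(/jobId="([^"]+)"/);
if (!jobMatch) {
    throw "Execution failed: HTTP " + execResponse.statusCode;
}
var jobId = jobMatch[1];

// 3. Poll the job status with GET operations until it finishes.
var status = "";
do {
    System.sleep(5000);
    var job = restHost.createRequest("GET",
        "/workflows/" + uuid + "/jobs/" + jobId, null).execute();
    var statusMatch = job.contentAsString.match(/<jobStatus>([A-Z_]+)<\/jobStatus>/);
    status = statusMatch ? statusMatch[1] : "UNKNOWN";
    System.debug("Job status: " + status);
} while (status != "COMPLETED" && status != "FAILED");
```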

The final step is to edit the presentation to pretty it up a bit. This is also where we would validate the inputs if desired. The three string inputs (ClusterName, VserverName, and VolumeName) I have simply made mandatory. For the number input (VolumeSizeInGB), I added minimum and maximum values that should represent sane limits for your environment.



When we’re all done, let’s test by switching to the schema view and clicking the “Debug” button. We use a debug run because the scriptable task’s code calls System.debug to send log messages, and these are only visible during a debug execution.

Here is what the screen looks like:


After hitting submit, assuming the values provided are valid for the WFA workflow, you should see something like the following:


On the right side we see the messages from the script, which indicate that it was able to execute the workflow and roughly how long it took to complete. I have chosen not to log where the volume was created (node and aggregate), as I did in the PowerShell script, because it would not be seen. However, if desired, you could modify the workflow to return those values. This would be particularly helpful if you want to use this workflow as part of a larger chain, for example one which performs the operations for deploying an entire application requiring a dedicated volume.

Questions and comments are always welcome below. If you have any suggestions for, or need help with, NetApp/VMware integration, automation, and solutions, please let me know using the comments!
