
Question about Run .net Script Activity

T_R_Ash_McCan_1 Member IT Monkey ✭

Hi everyone, I have a loaded question concerning Orchestrator. I configured my Orchestrator environment to be able to use the latest version of PowerShell in .Net Script activities (detailed here: http://get-cmd.com/?p=3731), and this all seems to work great. I recently set up a SQL table on a SQL server that I would like to query using the module named SqlServer.

I am able to obtain the data I expect when running the commands directly in PowerShell, but am having difficulty getting the data returned to Orchestrator and stored in a variable.

Here’s a sample of a query for the Job Code WS43:

[screenshot: query results for job code WS43]
Those values would be used to determine if a particular system access is to be granted or not. Basically, if the value returned is “Skip” then that Manual Activity status would be set to Skipped. If not, then no status change takes place.


My .Net Script activity is basically just taking the data and passing it through to the qualifier.

[screenshot: the Run .Net Script activity]
If you look at the previous picture, you’ll see I pipe out the catch to a text file. All I keep getting in the text file as my exception is “Read-SQLTableData is not known as a valid command.” I’ve installed the SqlServer module on the Orchestrator server, and also on the SQL server housing the SQL table I am referencing.

So, I guess my question is (after all this rambling explanation): where is this .Net script actually being run from? I would think, since it's an Invoke-Command, it would be run on the SQL server. Otherwise, I would think it would be run on the Orchestrator server. Both have the module installed.

Hopefully I’ve made some kind of sense here, and if not, I’ll gladly clarify anything.

Best Answer

  • Brett_Moffett Cireson PACE Super IT Monkey ✭✭✭✭✭
    Accepted Answer

    Hi @T_R_Ash_McCan_1

    PowerShell scripts can be called from a file on the local OS rather than having the script inside the .Net Script activity within Orchestrator. That way Orchestrator calls PowerShell on the server and runs with the latest version of PowerShell.

    When calling the Run .Net Script activity within Orchestrator, it calls an older 32-bit version of PowerShell, which can cause issues. Your script starts by importing the SQL module (Import-Module -Name SqlServer); this will load the library that is installed on the server, but the PowerShell version may be preventing it from loading. The latest version of Orchestrator fixes this issue, and there is also a way to force it to use 64-bit PowerShell.

    There are some good blog articles on this topic: PowerShell & System Center Orchestrator - Best Practice Template - TechNet Articles - United States (English) - TechNet Wiki (microsoft.com)

    More detail on the SQL PowerShell module can be found here: https://docs.microsoft.com/en-us/sql/powershell/sql-server-powershell

    As for the idea of running the whole decision tree from a PowerShell script: that will be more efficient. However, if you are moving to all PowerShell, you are better off using the Cireson PowerShell activity rather than calling Orchestrator at all. This is much faster and does not rely on having an Orchestrator infrastructure at all. BUT... if you have Orchestrator because your organisation does not have the required level of PowerShell skills, then leave it as it is and only use PowerShell for small key items like you have done in your example.
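
    For reference, one commonly used way to force the 64-bit engine from inside the 32-bit Run .Net Script host is the Windows "sysnative" file-system redirection alias; this is a hedged sketch, and the script path and -JobCode parameter below are hypothetical examples, not from the original runbook:

    ```powershell
    # From a 32-bit process on 64-bit Windows, "sysnative" resolves to the real
    # System32 folder, so this launches the 64-bit PowerShell engine.
    # D:\Scripts\Get-JobCodeAction.ps1 and -JobCode are assumed names for illustration.
    $exe = "$env:windir\sysnative\WindowsPowerShell\v1.0\powershell.exe"
    $output = & $exe -ExecutionPolicy Bypass -NoProfile `
        -File "D:\Scripts\Get-JobCodeAction.ps1" -JobCode "WS43"
    ```

    Anything the external script writes to the output stream lands in $output, which the activity can then publish to the data bus.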

Answers

  • Justin_Workman Cireson Support Super IT Monkey ✭✭✭✭✭

    Not to side-step the entire question, but could you just call a ps1 script from the filesystem?

  • T_R_Ash_McCan_1 Member IT Monkey ✭

    Hi Justin, no worries, sidestep away. :) I'm good with any solution that works.

    Just to make sure I'm hearing you right, are you proposing using ps1 scripts in place of the big SCO spider arm activities? Like a big ps1 with SMlets to take care of this instead?


  • T_R_Ash_McCan_1 Member IT Monkey ✭
    edited May 27

    Hi @Brett_Moffett Thanks for the response! I've been messing around with what you sent me for a little while now, and I think I have two issues, and I don't think they're related. First things first, I created a testing .Net Script activity in SCO so I can just make changes on a whim. I modeled the script on the one in the article you sent me, and here's what I have right now:

    # Set script parameters from runbook data bus and Orchestrator global variables
    # Define any inputs here and then add to the $argsArray and script block parameters below
    $DataBusInput1 = "WS43"

    #-----------------------------------------------------------------------

    ## Initialize result and trace variables
    # $ResultStatus provides basic success/failed indicator
    # $ErrorMessage captures any error text generated by script
    # $Trace is used to record a running log of actions
    $ResultStatus = ""
    $ErrorMessage = ""
    $Trace = (Get-Date).ToString() + "`t" + "Runbook activity script started" + " `r`n"

    # Create argument array for passing data bus inputs to the external script session
    $argsArray = @()
    $argsArray += $DataBusInput1

    # Establish an external session (to localhost) to ensure 64bit PowerShell runtime using the latest version of PowerShell installed on the runbook server
    # Use this session to perform all work to ensure latest PowerShell features and behavior available
    $Session = New-PSSession -ComputerName localhost

    # Invoke-Command used to start the script in the external session. Variables returned by script are then stored in the $ReturnArray variable
    $ReturnArray = Invoke-Command -Session $Session -ArgumentList $argsArray -ScriptBlock {
      # Define a parameter to accept each data bus input value. Recommend matching names of parameters and data bus input variables above
      Param(
        [ValidateNotNullOrEmpty()]
        [string]$argsArray
      )

      # Define function to add entry to trace log variable
      function AppendLog ([string]$Message)
      {
        $script:CurrentAction = $Message
        $script:TraceLog += ((Get-Date).ToString() + "`t" + $Message + " `r`n")
      }

      # Set external session trace and status variables to defaults
      $ResultStatus = ""
      $ErrorMessage = ""
      $script:CurrentAction = ""
      $script:TraceLog = ""

      try
      {
        # Add startup details to trace log
        AppendLog "Script now executing in external PowerShell version [$($PSVersionTable.PSVersion.ToString())] session in a [$([IntPtr]::Size * 8)] bit process"
        AppendLog "Running as user [$([Environment]::UserDomainName)\$([Environment]::UserName)] on host [$($env:COMPUTERNAME)]"
        AppendLog "Parameter values received: DataBusInput1=$argsArray"

        # The actual work the script does goes here
        AppendLog "Doing first action"
        $JobCodeAction = (Read-SqlTableData -ServerInstance "sqlboxnamehere" -DatabaseName "JobCodeRBAC" -SchemaName "dbo" -TableName "JobCodeRBAC" | Where-Object {$_.JobCode -eq "WS43"}).ActiveDirectory

        # Do-Stuff -Value $DataBusInput1
        <#
        # Simulate a possible error
        if($argsArray[0] -ilike "*bad stuff*")
        {
          throw "ERROR: Encountered bad stuff in the parameter input"
        }
        #>
        # Example of custom result value
        $myCustomVariable = $JobCodeAction

        # Validate results and set return status
        AppendLog "Finished work, determining result"
        $EverythingWorked = $true
        if($EverythingWorked -eq $true)
        {
          $ResultStatus = "Success"
        }
        else
        {
          $ResultStatus = "Failed"
        }
      }
      catch
      {
        # Catch any errors thrown above here, setting the result status and recording the error message to return to the activity for data bus publishing
        $ResultStatus = "Failed"
        $ErrorMessage = $error[0].Exception.Message
        AppendLog "Exception caught during action [$script:CurrentAction]: $ErrorMessage"
      }
      finally
      {
        # Always do whatever is in the finally block. In this case, adding some additional detail about the outcome to the trace log for return
        if($ErrorMessage.Length -gt 0)
        {
          AppendLog "Exiting external session with result [$ResultStatus] and error message [$ErrorMessage]"
        }
        else
        {
          AppendLog "Exiting external session with result [$ResultStatus]"
        }
      }

      # Return an array of the results. Additional variables like "myCustomVariable" can be returned by adding them onto the array
      $resultArray = @()
      $resultArray += $ResultStatus
      $resultArray += $ErrorMessage
      $resultArray += $script:TraceLog
      $resultArray += $myCustomVariable
      $JobCodeAction | Out-File -FilePath "\\scobox\d$\temp\writeback.txt" -Force -Append
      $resultArray | Out-File -FilePath "\\scobox\d$\temp\writeback.txt" -Force -Append
      return $resultArray
    } #End Invoke-Command

    # Get the values returned from script session for publishing to data bus
    $ResultStatus = $ReturnArray[0]
    $ErrorMessage = $ReturnArray[1]
    $Trace += $ReturnArray[2]
    $myCustomVariable = $ReturnArray[3]

    # Record end of activity script process
    $Trace += (Get-Date).ToString() + "`t" + "Script finished" + " `r`n"

    # Close the external session
    Remove-PSSession $Session


    Ok, so, I only have one piece of data that should ever go into the $argsArray, and that's the job code, in a format like "AB12" as seen towards the top. Now, when I run the script, I publish the argsArray, just to make sure I am seeing the job code I expect:

    [screenshot: published argsArray value showing the expected job code]

    This is exactly what I expect to see. Good stuff.

    Now, somewhere along the way inside the Invoke-Command, I seem to lose it. I put in a couple Out-File lines BEFORE the data is returned, just to make sure the "return" isn't causing issues.

    Here's the AppendLog as seen from my Out-File:

    Success

    5/27/2021 11:07:18 AM	Script now executing in external PowerShell version [5.1.17763.1971] session in a [64] bit process
    5/27/2021 11:07:18 AM	Running as user [domain\CorrectSCOAccount] on host [scobox]
    5/27/2021 11:07:18 AM	Parameter values received: DataBusInput1=
    5/27/2021 11:07:18 AM	Doing first action
    5/27/2021 11:07:18 AM	Finished work, determining result
    5/27/2021 11:07:18 AM	Exiting external session with result [Success]


    So, sorry for the long post. It looks like my two current issues/questions are:

    1. My argument list shows data/content going into the Invoke-Command, but shows null inside the Invoke-Command ##This log entry## Parameter values received: DataBusInput1=

    2. I still don't think it can query the remote SQL table in the Invoke-Command; I've even hard-coded the entire command to take variables out of the equation:
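
    On the second issue, one quick sanity check (a sketch, with localhost standing in for whichever host actually runs the script block) is to confirm the SqlServer module is visible to that session, since Read-SqlTableData only exists once that module loads:

    ```powershell
    # Read-SqlTableData ships in the SqlServer module; if the session running
    # the script block cannot see the module, the cmdlet is "not recognized".
    Invoke-Command -ComputerName localhost -ScriptBlock {
        # Empty output here means the module is not installed in a visible scope
        Get-Module -ListAvailable -Name SqlServer
        # Fail fast if the module cannot actually load in this session
        Import-Module -Name SqlServer -ErrorAction Stop
    }
    ```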


    @Brett_Moffett again thank you for at least helping me out this far. Much appreciated!

  • T_R_Ash_McCan_1 Member IT Monkey ✭

    Ok, follow up: I FINALLY got this to work.

    I had to make two changes to the script above:

    BEFORE: $Session = New-PSSession -ComputerName "localhost"

    AFTER: $Session = New-PSSession -ComputerName "sqlbox w/ the table being queried"


    BEFORE (w/ hard code as shown above): $JobCodeAction = (Read-SqlTableData -ServerInstance "sqlboxnamehere" -DatabaseName "JobCodeRBAC" -SchemaName "dbo" -TableName "JobCodeRBAC" | Where-Object {$_.JobCode -eq "WS43"}).ActiveDirectory

    BEFORE (w/ incorrect argsArray variable): $JobCodeAction = (Read-SqlTableData -ServerInstance "DBGBSQ1-DEV" -DatabaseName "JobCodeRBAC" -SchemaName "dbo" -TableName "JobCodeRBAC" | Where-Object {$_.JobCode -eq $argsArray[0]}).ActiveDirectory

    AFTER: $JobCodeAction = (Read-SqlTableData -ServerInstance "DBGBSQ1-DEV" -DatabaseName "JobCodeRBAC" -SchemaName "dbo" -TableName "JobCodeRBAC" | Where-Object {$_.JobCode -eq $argsArray}).ActiveDirectory
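
    A likely explanation for why the Where-Object change mattered (my reading, not stated explicitly in the thread): Invoke-Command's -ArgumentList unrolls the one-element array and binds each element to a parameter positionally, so inside the script block $argsArray is the plain string "WS43", and indexing a string returns a single character:

    ```powershell
    $argsArray = "WS43"        # what the Param([string]$argsArray) binding receives
    $argsArray[0]              # 'W' — a single [char], so -eq "WS43" never matched
    $argsArray -eq "WS43"      # $true — comparing the whole string matches the job code
    ```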

