jmbrinkman

Archive for the ‘Powershell’ Category

Running in the name of…

In Active Directory, Chef, DevOps, Powershell, Race Against the Machine on February 14, 2015 at 22:28

If you like my content please do check out my new blog at thirdpartytools.net ! 

 

http://reunion.la1ere.fr/2014/09/26/not-my-name-974-les-musulmans-de-la-reunion-disent-non-l-etat-islamique-193110.html

In my last post I briefly discussed the issues related to the execution context of the chef-client when developing and testing cookbooks. To summarize: on a Windows machine the chef-client runs under the context of the local system account, an account which is basically a local admin but has almost no permissions on other Active Directory joined/secured objects (such as other machines, Active Directory itself or file shares).

You can always mimic this behaviour by adding a cookbook to a test node's run list and restarting the chef-client service, of course, or by creating a scheduled task which runs chef-solo or chef-zero (see the sketch below). However, that leaves you without the direct output of an interactively started chef run, and I can't really see how you could add something like that to a Vagrant setup.
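
For reference, here is a minimal sketch of both approaches in PowerShell. It assumes the service is called chef-client, that chef-client.bat lives in the default C:\opscode\chef\bin location, and that the ScheduledTasks cmdlets (Windows 8/Server 2012 and later) are available, so adjust names and paths to your install:

# Option 1: trigger a run under LocalSystem by restarting the chef-client service
Restart-Service -Name 'chef-client'

# Option 2: register a one-off scheduled task that runs chef-client as SYSTEM
$action    = New-ScheduledTaskAction -Execute 'C:\opscode\chef\bin\chef-client.bat' -Argument '--once'
$principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -LogonType ServiceAccount
Register-ScheduledTask -TaskName 'chef-run-as-system' -Action $action -Principal $principal
Start-ScheduledTask -TaskName 'chef-run-as-system'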

Luckily, the need to run stuff under the local system account is not new. Quite some time ago the Windows tool legend Mark Russinovich developed PsExec, a tool which allows you, amongst other things, to start a process as the local system account. In order to do that it temporarily installs a Windows service called PSEXESVC which is used to start a process (e.g. cmd.exe) as the local system account.

The command itself is very simple:

somepath:\psexec /s cmd.exe

You can run this from another command window or from PowerShell – it doesn't really matter. Once you do, it starts another instance of cmd.exe, and everything executed from that instance will run as the local system account. For instance:

somepath:\whoami

Will return:

nt authority\system

From there you can start chef-client just as you normally would – however you will find out that if you access anything outside of the node, that access will be denied unless the permissions on the resource include the computer account object (note that Everyone and Authenticated Users both cover computer accounts). Also the output of the chef-client is a little bit different – more like the output that's written to the chef log when you run the client as a service.
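
A quick way to spot those permission issues from that same SYSTEM prompt, before kicking off a full chef run (the UNC path is just an example):

# Returns False (or an access denied error) unless the computer account has been granted access
Test-Path '\\fileserver\chef-files'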

I've tried hacking this into the batch files chef uses to start stuff (so instead of calling ruby.exe directly I would first open a command console as local system when running chef-zero), but that didn't seem to work – I guess that has something to do with the difference between a real interactive session and a session as started by psexec. But I'm sure I'll find some way to work around that at some point.

But for now – if you really want to know if your cookbook will run as you’d expect in a Windows domain environment – use psexec manually at least once in order to identify any permission issues.

 

P.S – there is one other fun way to run stuff as local system – replacing the ease of access executable on a box 😉 For more details see Guy’s site.


Movies As Code

In Powershell, Proxy on March 12, 2012 at 21:12

If you like my content please do check out my new blog at thirdpartytools.net ! 

 

A friend showed me this wonderfully geeky site: http://moviesascode.net/

The whole idea is to describe movies or movie titles using runnable code! And I'm glad to say that I contributed the first Movie as PowerShell: The Sum of All Fears!

I generously stole the basics from Wayne and used PowerShell to sum up the number of times the string "fear" appears in the King James Bible, multiplied by the number of characters in the word "fear".
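
The counting part boils down to something like this (a sketch; the path to the King James text file is an assumption, the real submission lives on moviesascode.net):

# Read the full KJV text and count the occurrences of "fear", case-insensitively
$kjv = [IO.File]::ReadAllText('C:\temp\kjv.txt')
$fearCount = [regex]::Matches($kjv, 'fear', 'IgnoreCase').Count
# The Sum of All Fears: occurrences times the length of the word itself
$fearCount * 'fear'.Length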

The proxy stuff is quite interesting because I had been searching for a code snippet that allowed me to authenticate to ISA/TMG for a while, but I guess I needed a touch of silliness to come up with the right search terms. Here's my adaptation of Wayne's code:

# Ask whether a proxy should be used
$answer = Read-Host "Proxy? Yes/No?"

if ($answer -eq "yes")
{
    $user     = $env:USERNAME
    $webproxy = Read-Host "Proxy address? (like http://your.proxy.server:8080)"
    $password = Read-Host "Password?" -AsSecureString

    # Build a WebProxy object and attach credentials for the currently logged on user
    $proxy = New-Object System.Net.WebProxy
    $proxy.Address = $webproxy
    $account = New-Object System.Net.NetworkCredential($user, [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($password)), "")
    $proxy.Credentials = $account
}

It’ll ask you if you want to use a proxy and if you do then it’ll ask you for a password (it uses the currently logged on user).
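
To actually put the proxy object to work you can attach it to a WebClient, for example (a sketch, not part of the original snippet):

$client = New-Object System.Net.WebClient
if ($proxy) { $client.Proxy = $proxy }
$html = $client.DownloadString('http://moviesascode.net/')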

Some more code:

  • Idioms
  • 99 Bottles of Beer
  • Poetry

This blog has been moved to: http://multiplechoicesystemsengineer.nl/Lists/Posts/Post.aspx?ID=2

Mini-Review: Monitoring vSphere with SCVMM and SCOM 2012

In Powershell, SCOM 2012, SCVMM 2012, System Center, Virtualization, Vmware on November 7, 2011 at 23:03

If you like my content please do check out my new blog at thirdpartytools.net ! 

Some time ago I posted my vSphere monitoring shoot-out. I recently had the time to install the RC of SCVMM 2012 and the beta of SCOM 2012. There are plenty of guides out there that describe how to get started with both products (SCOM 2012 beta in ten minutes, SCOM 2012 Beta step by step, SCVMM 2012 Survival Guide) so I won't get into that too much. Some general remarks:

SCVMM

  • You need the Windows 7 AIK which is only downloadable as an ISO or IMG. That annoyed me.
  • I used SQL 2008 R2 Express as a database – in hindsight it would have been better to use a full SQL trial and host both SCVMM and SCOM’s databases.
  • Besides that, the install was quick and painless.

SCOM

  • Collation, Collation, Collation! Choose SQL_Latin1_General_CP1_CI_AS as your SQL collation, otherwise SCOM won't find your SQL instance – and it will not tell you that you picked the wrong collation.
  • You need .NET 4
  • I had some issues installing the SCOM agent on the SCVMM server. I got this error:

Log Name:      Application
Source:        MsiInstaller
Date:          4-11-2011 17:53:33
Event ID:      1013
Task Category: None
Level:         Error
Keywords:      Classic
User:          ****\****
Computer:      FQ.DN
Description:
Product: System Center Operations Manager 2012 Agent — Microsoft ESENT Keys are required to install this application. Please see the release notes for more information.

Apparently this is not a SCOM 2012-specific error but more a general SCOM error on Windows 2008 R2 boxes. Running msiexec from an elevated command prompt solved the problem.
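
In my case that boiled down to something along these lines, run from an elevated prompt (the MSI name and path on the SCOM media may differ, so treat this as a sketch):

# Install the SCOM 2012 agent MSI directly instead of via the discovery wizard
msiexec /i "D:\agent\AMD64\MOMAgent.msi"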

Adding vSphere to SCVMM

This part is pretty straightforward as well. Open the Virtual Machine Manager Console, go to the Fabric pane and choose Add Resource > VMware vCenter Server. Create a Run As account which has the required privileges (local admin on the vCenter server according to TechNet). After you've added the vCenter server you need to add each Resource Cluster (or individual host) as well, in much the same way as you added the vCenter server. But since you're already connected to vCenter you don't have to enter RC or host names – you can just select them in a browsing dialog.

Strangely enough I wasn't able to retrieve and accept the certificate for any of my hosts using a domain account – which does have root-equivalent privileges on the hosts – so either the AD integration is flawed or I made a mistake configuring it. Instead I used a second Run As account with the default vSphere root account, and with that I was able to retrieve and accept the certificates.

After that I was able to view all my hosts and vm's in SCVMM. The same goes for templates and host networking. SCVMM even sees my dvSwitches and presents them as one entity – but the same goes for my vSwitches… which is not really what I would like to see. Portgroups aren't shown in the networking pane, but I was able to find them in the vm guest properties. I did a quick test to see if I could actually manage stuff – and I could, but since for now I'm more interested in monitoring vSphere, I'll get down to managing it some other time.

Connecting SCVMM to SCOM

I followed this great post on the SCVMM blog to connect SCVMM to SCOM. Most notable improvement over the previous versions: no need to install the VMM console on the SCOM server. However, you still need to install the SCOMsole on the VMM server. Oh, and creating the connection is now a simple wizard in the VMM console :). I had some issues with not being able to search the online SCOM catalog, so I needed to download the prerequisite MPs by hand.

Once I got that sorted out I completed the wizard and the connection was made.

And? Has it gotten any better?

Yes. Because vSphere and vCenter are represented just as vSphere and vCenter in both SCVMM and SCOM, instead of weird vm's on a mutated Hyper-V server, the visibility and navigation is much better. But my SCOMsole immediately got filled up with alerts telling me my vm's didn't have VSG installed – and because everything is discovered through your VMM server (which it does still seem to see as a Hyper-V server) it started complaining about the fact that I had more than 384 vm's on a host.

Alerts are also a lot quicker. Views are a bit poor – especially when you consider that the way my vSphere datacenter hierarchy is displayed in SCVMM is pretty good. The fact that SCOM and SCVMM will allow me to view a diagram of a service as defined in SCVMM looks really promising, but I haven't tested that yet. If you put a host into maintenance mode in SCVMM its status is automatically propagated to SCOM. There is still no link between the vm as an instance running on vSphere and the Windows computer object in SCOM – that's a real shame.

There isn't a lot of VMware-specific stuff there either. I guess that remains, as MS likes to call it, a partner opportunity – or something you could develop yourself using vCenter and System Center's common denominator, PowerShell. But I believe even that might be less of a challenge than before because of the improved SNMP support in SCOM 2012 (so you can just use that in addition to the information exposed by vCenter). Still, the biggest improvement seems to be on the managing side rather than on the monitoring side – which makes taking the monitoring shortcomings for granted much more plausible than before.

Backup TMG configuration using Powershell

In Powershell on September 23, 2011 at 13:08

Unfortunately, TMG doesn't ship with any specific PowerShell cmdlets. However, using COM objects you can export/back up the TMG (or ISA) configuration to an XML file.

Depending on your environment there are two options. If you have an Enterprise Array, use this:

$root = New-Object -ComObject "FPC.Root"

$root.ExportToFile("[PATH_AND_Filename]", 0)

If you have a standalone array, use this instead:

$root = New-Object -ComObject "FPC.Root"

$array = $root.GetContainingArray()

$array.ExportToFile("[PATH_AND_Filename]", 0)

To give an example, this is what a typical script to back up TMG might look like (it writes the result to the Application event log):
# Export the configuration and capture any error
$err = $null
try
{
    $root = New-Object -ComObject "FPC.Root"
    $array = $root.GetContainingArray()
    $array.ExportToFile("d:\tmgbackup.xml", 0)
}
catch
{
    $err = $_
}
if ($err)
    {
    Write-EventLog -LogName Application -Source TMGBackup -EventId 9999 -EntryType Warning -Message "Backup failed, cause: $err" -Category 0
    }
else
    {
    Write-EventLog -LogName Application -Source TMGBackup -EventId 9000 -EntryType Information -Message "Backup succeeded" -Category 0
    }

You should of course first register the TMGBackup event log source using New-EventLog.
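
Registering the source is a one-time action from an elevated PowerShell prompt:

New-EventLog -LogName Application -Source TMGBackup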

Running Powershell script as SCOM console task and passing named parameters

In Operations Manager, Powershell, System Center on May 19, 2011 at 09:54

We have a ticketing system for which there is no SCOM connector, and we wanted to provide a simple way to forward an alert to the ticketing system by email. We had already stumbled upon the Alert Forward Task MP by Cameron Fuller, but to add some flexibility I decided to rewrite it using a PowerShell script as the application that is being executed. Contrary to agent tasks there is no default functionality to specifically run a PowerShell script, so it took some time to figure out how I should call the script from the task and how to pass the needed parameters.

The variables I wanted to get from the alert were the MonitoringObjectName, Name, Description, Severity and Time Raised. I would then pass those variables as named parameters to the PowerShell script that actually sends the mail. This is the code for the PowerShell script:

# Named parameters filled in by the SCOM console task
Param($managedobject,$name,$description,$time,$severity)
# Variables
$recipient = "someone@someone.local"
$mailserver = "mail.someone.local"
$sender = "someoneelse@someone.local"
# Build a simple HTML body from the alert properties
$body = @"
<html>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<body>
<h1>$name<br></h1>
<b>Managed Object:</b> $managedobject<br>
<b>Description:</b> $description<br>
<b>Time Raised:</b> $time<br>
<b>Severity:</b> $severity
</body>
</html>
"@
$subject = "Operations Manager Alert: " + $name
# Let's send an email
Send-MailMessage -ErrorVariable mailerror -From $sender -To $recipient -SmtpServer $mailserver -Subject $subject -Body $body -BodyAsHtml
# If you enable output on the console task, uncomment the section below to give some output.
#$message = @"
#The alert: $name
#has been forwarded.
#"@
#if ($mailerror)
#    {
#    Write-Host $mailerror
#    }
#else
#    {
#    Write-Host $message
#    }

As you can see it is a rather simple script that takes the input from the alert, builds an html message and uses Send-MailMessage to send the message. If you’d like to show output (error, success) you can uncomment that section and set RequireOutput to true in the XML.

Then I created a console task in the SCOM Authoring console and specified a command line application, the parameters and the working directory. Of course it took me some time to find the right syntax, and looking at the XML I noticed something peculiar: all the parameters you enter in the GUI are put into one <parameter> element inside the XML. Even when you edit the XML by hand, add each parameter as a separate element, open up the pack in the console, change something like the display name and save it, it wraps them all up in one element again. Testing showed that with all the arguments in one element in the XML the task doesn't work.

Here is the XML snippet of the console task:

<ConsoleTask ID="MCSE.AlertForward" Accessibility="Public" Enabled="true" Target="System!System.Entity" RequireOutput="false" Category="Alert">
<Application>%windir%\System32\WindowsPowerShell\v1.0\powershell.exe</Application>
<Parameters>
<Parameter>-noprofile</Parameter>
<Parameter>"&amp; \\someuncpath\forward-alert.ps1"</Parameter>
<Parameter>-ManagedObject '$MonitoringObjectName$'</Parameter>
<Parameter>-Name '$Name$'</Parameter>
<Parameter>-Description '$Description$'</Parameter>
<Parameter>-Severity '$Severity$'</Parameter>
<Parameter>-Time '$TimeRaised$'</Parameter>
</Parameters>
<WorkingDirectory>\\someuncpath\</WorkingDirectory>
</ConsoleTask>

The easiest way to add this task is to copy/paste the XML into an existing MP and import the MP into your SCOM environment.

I put the PowerShell script on a share accessible to all our SCOM operators, used powershell.exe as the application, and used the script path and the variables from SCOM as arguments. Notice the double quotes around the script path and the single quotes around the alert variables.
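
After SCOM substitutes the alert variables, the command line the task ends up running looks roughly like this (the values are invented examples):

powershell.exe -noprofile "& \\someuncpath\forward-alert.ps1" -ManagedObject 'SERVER01.domain.local' -Name 'Logical disk free space is low' -Description 'Disk C: is running out of free space' -Severity 'Error' -Time '19-5-2011 09:54'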

You can of course create a more elaborate email, for instance using this excellent script by Tao Yang as an example. (Tao creates his own channel to send email notifications and sets up subscriptions, but the code used to collect the data from SCOM can also be used in a console task.)