
Mini-Blog: Creating HTML files from CSV

We use a documentation system that only allows CSV exports, which gets annoying when trying to supply clients with some form of data out of it. To resolve this, I rewrote a script by Brandon Everhardt to take all CSV files from a folder and export them into a single readable HTML file. It's a very quick and dirty script that uses the ConvertTo-Html cmdlet to produce a somewhat more readable result.

The result looks like this:

 

Screenshot of HTML result

To create the HTML file, here is the code:

$css = @"


<style>
h1, h5, th { text-align: center; font-family: Segoe UI; }
table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge lime ; }
th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; }
td { font-size: 11px; padding: 5px 20px; color: #000; }
tr { background: #b8d1f3; }
tr:nth-child(even) { background: #dae5f4; }
tr:nth-child(odd) { background: #b8d1f3; }
</style>

"@ 

$csvs = Get-ChildItem "C:\CSVs" -Filter *.csv -Recurse
$outputfile = "C:\Temp\Document.html"
foreach ($csv in $csvs) {
    Import-Csv $csv.FullName | ConvertTo-Html -Head $css -Body "<h1>Filename: $($csv.Name)</h1>" | Out-File $outputfile -Append
}
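One caveat: ConvertTo-Html emits a complete HTML document for every CSV, so appending them produces several <html> blocks in one file. Browsers generally tolerate this, but a slightly cleaner variant (a sketch using -Fragment, with the same example paths) builds one valid document:

```powershell
# Sketch: build a single valid HTML document from all CSVs using -Fragment.
# Paths are the same examples as above; adjust them to your environment.
$csvs = Get-ChildItem "C:\CSVs" -Filter *.csv -Recurse
$body = foreach ($csv in $csvs) {
    "<h1>Filename: $($csv.Name)</h1>"                    # heading per file
    Import-Csv $csv.FullName | ConvertTo-Html -Fragment  # table only, no <html> wrapper
}
ConvertTo-Html -Head $css -Body ($body -join "`n") | Out-File "C:\Temp\Document.html"
```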

Mini-blog: Wait for VM to come online, then execute PowerShell Direct

During my labbing I've noticed that I often need to wait for a VM to come online before executing a script or commands against it. To do that, I use the following script:

First we get the VM Name, and enter the credentials for that specific VM:

$VMName = Read-Host "Please enter the VM name"
$VM = Get-VM $VMName
$cred = Get-Credential -Message "Please enter credentials for the VM"

By getting the VM status we can use the Heartbeat property to check the actual state. We're assuming the VM is unmonitored, so the result we expect is "OkApplicationsUnknown" – this means the OS is up, but the hypervisor has no idea whether the applications inside the VM are healthy.

To wait for the VM to come online, we use a do/while loop that keeps checking the heartbeat until the status changes.

do
{
    $VM = Get-VM $VMName
} while ($VM.Heartbeat -ne "OkApplicationsUnknown")

Directly after the loop we know the VM is online, so we can execute our script via PowerShell Direct. Here I simply print the hostname. 🙂

Invoke-Command -VMName $VMName -Credential $cred -ScriptBlock { Write-Host "My name is $env:COMPUTERNAME" }
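One caveat: the loop polls the hypervisor as fast as it can and never gives up if the VM fails to boot. A sketch of a friendlier variant, with a pause and an arbitrary 10-minute timeout:

```powershell
# Sketch: wait loop with a pause and a timeout (10 minutes is an arbitrary choice).
$deadline = (Get-Date).AddMinutes(10)
do {
    Start-Sleep -Seconds 5
    $VM = Get-VM $VMName
} while ($VM.Heartbeat -ne "OkApplicationsUnknown" -and (Get-Date) -lt $deadline)
if ($VM.Heartbeat -ne "OkApplicationsUnknown") {
    throw "VM $VMName did not come online within 10 minutes."
}
```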

Happy scripting!


Using App-v to deploy Office 2016 on RDS servers

Hi All!

At my current employer we have a lot of clients that use RDS servers. In the SMB space you often see applications installed directly on the RDS server, which means that in a highly available solution there is often a difference between the applications installed on each host.

To prevent this we use Microsoft's App-V for RDS servers. App-V is included in your RDS Client Access Licenses, so in essence it's "free". The cool thing about App-V is that through application virtualisation you no longer have to install applications; you simply publish them to the users or computers you want them to run on.

To learn more about App-V you can check this TechNet page.

Microsoft has really embraced App-V as part of the OS, and it is now installed by default on Server 2016 and Windows 10. To start with App-V we will create a package and publish it on an RDS 2016 server without using the rest of the App-V infrastructure. This is called a "stand-alone App-V deployment".

Requirements:

  • A preconfigured RDS 2016 server.
  • The Office 2016 Deployment toolkit found here
  • XML file with the features you want in your App-V package from here

First we will configure our RDS 2016 server to run App-V packages. To do that, run the following commands in an elevated PowerShell session:

Enable-Appv
Set-AppvClientConfiguration -EnablePackageScripts $true
Restart-Computer

After restarting, you can check the App-V status with the following PowerShell command:

Get-AppvStatus

With the above commands we not only enable App-V but also "package scripting". Packages with extended functionality, such as Office, require running scripts within the App-V package; the default setting forbids any scripts from running for security reasons.

Once App-V is enabled, we can move on to creating the package. Download the Office Deployment Toolkit and the XML file to C:\ODT and rename the XML file to SETUP.XML. Then open a CMD.EXE window as administrator and enter the following command:

C:\ODT\SETUP.EXE /PACKAGER C:\ODT\SETUP.XML C:\ODT\APPVPackage

The setup will now run and create an App-V package from the selected Office components. After the setup completes, copy C:\ODT\APPVPackage to a location of your choice. I'd advise collecting all of your application packages on a DFS-R share, but for the purpose of this lab the location does not truly matter.

After copying the files, execute the following PowerShell command to make the package available machine-wide on the RDS 2016 server:

Add-AppvClientPackage -Path "<PATH TO YOUR APPV file>" | Publish-AppvClientPackage -Global | Mount-AppvClientPackage -Verbose

To make the package available only to a specific user, execute the following PowerShell command as administrator:

Add-AppvClientPackage -Path "<PATH TO YOUR APPV file>" | Publish-AppvClientPackage

And to finish you simply mount the file as the end-user by running:

Get-AppvClientPackage -All | Mount-AppvClientPackage -Verbose

And tada: you've deployed Office to the RDS server without installing the actual software. You're now running Office in a virtual bubble. When running this in production, I'd strongly suggest looking into an App-V publishing server to make management much easier. 🙂

Free online PowerShell training

Hi all!

After a couple of weeks of silence I have some great news: I will be giving free online PowerShell courses for beginners and intermediates. Hopefully I'll be able to assist some of you with questions about your own scripts, or scripts you've used from my blog.

The first course will be August 7 at 19:00 GMT+1. You can join the course by emailing me at Kelvin [at] Limenetworks.nl or via the following Skype for Business URL: Skype for Business Meeting

In the first course I will focus on using PowerShell in your day-to-day operations and automating minor tasks. It'll be as hands-on as possible and not focus only on theory. There will be room for questions during the one-hour course.

Hope to see you there!

Mini Blog: Checking processor performance

I haven't been blogging a lot lately, mostly due to renovating at home and having very large projects in the office. To compensate, I've decided to write some quick mini blogs to make sure I don't lose the magic 🙂

While setting up some application monitoring, I needed a quick snapshot of how heavily the processor was being used during my scripts: the application ran some SQL queries that could create CPU spikes I wanted to avoid.

To take a quick snapshot of the current processor status, I used the Get-Counter cmdlet and retrieved its cooked value to query it further:

$CPUQueueLength = (Get-Counter -Counter "\System\Processor Queue Length").CounterSamples.CookedValue

$CPUUserTime = (Get-Counter -Counter "\Processor(*)\% User Time").CounterSamples.CookedValue

$CPUPrivTime = (Get-Counter -Counter "\Processor(*)\% Privileged Time").CounterSamples.CookedValue
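To turn the snapshot into an actual check, you can compare the cooked values against thresholds. A minimal sketch, assuming the common rule of thumb that a sustained queue length above roughly twice the logical core count indicates contention (the 80% figure is equally an illustrative assumption):

```powershell
# Sketch: flag CPU pressure from a single counter snapshot.
$CPUQueueLength = (Get-Counter -Counter "\System\Processor Queue Length").CounterSamples.CookedValue
$TotalCPU = (Get-Counter -Counter "\Processor(_Total)\% Processor Time").CounterSamples.CookedValue
$Cores = (Get-CimInstance Win32_ComputerSystem).NumberOfLogicalProcessors
if ($CPUQueueLength -gt (2 * $Cores) -or $TotalCPU -gt 80) {
    Write-Host "CPU pressure: queue length $CPUQueueLength, usage $([math]::Round($TotalCPU))%"
}
```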

Happy scripting 🙂

Mini-Blog: Finding the Windows Search database location

I haven't been blogging a lot lately, mostly due to renovating at home and having very large projects in the office. To compensate, I've decided to write some quick mini blogs to make sure I don't lose the magic 🙂

One thing I've found is that my Windows Search EDB blog about the CoreCount registry key is one of my most-visited posts. Some people like automating defrag jobs or tracking the database size in their monitoring systems.

To quickly find the location of the Windows Search EDB, you can use the following PowerShell v2+ command:

$CurrentLoc = Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows Search\" -name DataDirectory
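From there, a sketch of the size check some people want for their monitoring. The Applications\Windows\Windows.edb relative path is an assumption based on default installs:

```powershell
# Sketch: report the size of Windows.edb using the DataDirectory value.
# The relative path below assumes a default Windows Search installation.
$CurrentLoc = Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows Search\" -Name DataDirectory
$edb = Join-Path $CurrentLoc.DataDirectory "Applications\Windows\Windows.edb"
if (Test-Path $edb) {
    "Windows.edb is {0:N1} GB" -f ((Get-Item $edb).Length / 1GB)
}
```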

Happy Powershelling!

Adding branding to the Office 365 portal

Hi all,

Today we're going to change the branding of the Office 365 portal. This helps clients identify that they are logging into the correct portal and entering their usernames correctly. The great thing is that you can have your own support information directly on the Office 365 portal.

You should note some prerequisites before you start: company branding is a feature that is available only if you are using the Premium or Basic edition of Azure Active Directory, or if you are an Office 365 user. To check whether branding is available, you can try the following link.

Once you've checked that you can use branding, and you have your Office 365/Azure admin account available, log in to the portal at https://portal.microsoftonline.com. If you've previously used Azure, you can log in directly at https://portal.azure.com.

After logging in, browse to the admin panel and click on the "Azure Active Directory" link in the sidebar. This opens the Azure portal. If you have never signed in to the Azure portal before, you will be asked to fill in some minor information, such as your e-mail address and physical address.

When logged into the Azure portal, click on Azure Active Directory in the sidebar. You will now be presented with the following screen. Click on the area I've marked in yellow:

Screenshot01
After clicking, a new sidebar appears where you can select the option "Configure Company Branding Now". You can upload your images directly, and also change the login tips and support information. After uploading your images, log out of the portal. When logging in again, you will see the branding as soon as you've entered your e-mail address, or when you log in using the URL https://outlook.com/tenantname (e.g. outlook.com/contoso.onmicrosoft.com). If you want users to access the webmail or portal using their own domain name, my advice would be to create a forward from webmail.clientname.com to https://outlook.com/tenantname.

An example of the branding can be found below;

A screenshot of the branding applied

Happy branding! 🙂

Using PowerShell to create a new service

One of our clients has an application that always needs to run in the background. In the past, the client resolved this by running the application after logon. Of course this meant that after we performed maintenance, the application would no longer work unless we logged the user on. Autologon is not advised, as it leaves a user logged in at the console.

PowerShell has a neat cmdlet that can create a service from any executable. Of course this does not mean you should try it on every application, but applications that require no input at all can be set up quite easily.

To create a new service you can use the following code, while running PowerShell as an administrator.

New-Service -BinaryPathName <EXECUTABLEPATH> -Name <NAMEOFTHESERVICE> -Credential <CREDENTIALSUSED> -DisplayName <DISPLAYNAME> -StartupType Automatic 

Of course this means that if we want to create a local system service of C:\Application.exe we can do the following:

# PSCredential requires a password argument; an empty SecureString works for LOCAL SYSTEM.
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "NT AUTHORITY\LOCAL SYSTEM", (New-Object System.Security.SecureString)
New-Service -BinaryPathName C:\Application.exe -Name Application-AS-a-Service -Credential $credentials -DisplayName "Application as a service" -StartupType Automatic

Tada! You've now installed the service. Remember that if the service needs access to a network share or other resources, assign the correct user to the service.
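If you ever need to undo this, note that Remove-Service only exists in PowerShell 6 and later; on Windows PowerShell a sketch using sc.exe works:

```powershell
# Sketch: stop and delete the service created above.
Stop-Service -Name "Application-AS-a-Service" -ErrorAction SilentlyContinue
sc.exe delete "Application-AS-a-Service"
```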

Creating a Poor Man’s Express Route

Lately we've been running into issues with clients connecting to Office 365 and experiencing high latency on the connection. This results in issues such as "Outlook is not responding" and "Getting data from server outlook.office365.com". Both of these issues disappear when running in cached mode, due to less data traffic being required for normal operations such as opening e-mails and retrieving folder lists.

After researching, we found that this is mostly related to enormous detours the ISP takes to reach the Office 365 back-end, mostly due to "cheaper" connections to the DUB (EU) or AMS (EU) data centers of Microsoft. A good example of this can be seen with the Dutch ISP RoutIT, where you have 28 hops on average to the Office 365 back-end (of which 90% is still inside their network), but strangely only 5 hops to the Azure back-end. After contacting Microsoft Support we got the dreaded answer: please invest in an ExpressRoute solution to circumvent these issues, or have the provider resolve the connection problems.

We have multiple small sites with 5+ users that simply cannot afford this, and the clients can't accept this latency. So we decided to ignore Microsoft's advice and go on our own path of discovery 😉 After reading some of the service descriptions, we found that Microsoft ensures better connections from Azure to Office 365 by using extensive QoS, and of course simply because the virtual machines are in the same network as the Office 365 services.

I’ve created the step-by-step below to research, test, and implement a solution.

Research

First we needed to discover whether the client actually suffered from this issue:

  • Open Outlook as a user with connection issues.
  • Right-click the tray icon while holding CTRL and click "Connection Status".
  • In online mode you want the Avg Proc Time to be lower than 25 ms (in cached mode this can be anything from 25 to 300).
  • In online mode you want the Avg RTT to be lower than 200 ms (in cached mode this can be anything from 0 to 1000).
  • If your results are higher than the above, you possibly have issues connecting to outlook.office365.com, and a poor man's ExpressRoute might help you run Outlook in online mode.

After these tests, we run a traceroute to triple-check whether the route is the issue or the connection latency itself. You'd expect a traceroute to outlook.office365.com to always take fewer than 20 hops, preferably in the 10-15 range, as in my results:

tracert outlook.office365.com

Tracing route to outlook-emeacenter.office365.com [132.245.229.114]
over a maximum of 30 hops:

  1     1 ms    <1 ms     1 ms  xxx.xxx.xxx.xxx
  2     9 ms     9 ms    10 ms  xxx.xxx.xxx.xxx
  3     7 ms     8 ms     7 ms  ae0-33.crt02-gsw.redhosting.nl [178.238.96.45]
  4     4 ms     4 ms    12 ms  ae0-31.crt01-nkf.redhosting.nl [178.238.96.37]
  5     4 ms     4 ms    13 ms  ams-ix-2.microsoft.com [80.249.209.21]
  6     6 ms     7 ms    14 ms  be-62-0.ibr01.amb.ntwk.msn.net [104.44.9.134]
  7    13 ms    12 ms    15 ms  be-7-0.ibr01.ams.ntwk.msn.net [104.44.5.33]
  8    14 ms    16 ms    16 ms  ae63-0.am3-96c-1a.ntwk.msn.net [104.44.5.1]
  9     6 ms     5 ms     5 ms  132.245.229.114

Trace complete.

If you have results like the above, this solution might not be the one for you. Of course, you can continue testing below, but YMMV.
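To automate the hop check, a sketch using Test-NetConnection (available on Windows 8/Server 2012 and later); the 20-hop cut-off is the rule of thumb from above:

```powershell
# Sketch: count hops to Office 365 and flag a suspicious route.
$trace = Test-NetConnection outlook.office365.com -TraceRoute
$hops = $trace.TraceRoute.Count
if ($hops -gt 20) {
    Write-Host "Route looks poor: $hops hops to outlook.office365.com"
} else {
    Write-Host "Route looks fine: $hops hops to outlook.office365.com"
}
```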

Create a virtual machine in Azure for performance tests & RRAS

Create a virtual machine in the Azure portal with the following specifications and configuration. If you have never created a virtual machine, please use the following guide on the Microsoft website; the interface changes every couple of months, so it's best to use the resources from Microsoft themselves.

Configuration:

  • At least 1 core
  • At least 1.75 GB of RAM
  • Static internal IP for the server
  • Allow traffic: SSL to the server (443)
  • Install the following roles: Remote Access, DHCP Server
  • Set up DHCP and Remote Access

After configuring the Azure side of the requirements, configure SSTP VPN connections (not required, for testing only) and allow router advertisement and IP forwarding. If you require an SSL certificate, you can use a free certificate from Let's Encrypt for this test.

After performing all configuration, connect to the SSTP VPN from a client that experienced issues. Remember to enable "Use default gateway on remote network" to force all traffic over the tunnel for the sake of the test. Restart Outlook and perform the testing steps once more; you should see a lower Avg Proc Time and a lower Avg RTT. Have the user work over the SSTP connection for a day to see if they notice a difference. If so, it's time for a more permanent solution.

Create IPSec tunnel

If the tests above have been performed and you are ready for a permanent solution, you can use Azure to create an IPsec tunnel; for that I advise the following guide.

After configuring your IPsec tunnel, remember to tunnel all Office 365 traffic using the Microsoft IP address list found at this link. You can choose which traffic you route over the tunnel and which traffic you'd prefer to use your local internet connection. When using the IPsec tunnel, remember that you must use your RRAS server as the gateway, or the routing will end on the Azure side of the S2S VPN.
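Microsoft also publishes these ranges through the Office 365 endpoints web service, which you can query to keep your routing table current. A sketch (the client request ID is just a random identifier the service requires):

```powershell
# Sketch: pull the current Exchange Online IP ranges from Microsoft's endpoint service.
$uri = "https://endpoints.office.com/endpoints/worldwide?clientrequestid=$([guid]::NewGuid())"
$endpoints = Invoke-RestMethod -Uri $uri
$exchangeIPs = $endpoints |
    Where-Object { $_.serviceArea -eq "Exchange" -and $_.ips } |
    ForEach-Object { $_.ips } |
    Sort-Object -Unique
$exchangeIPs   # CIDR ranges to route over the IPsec tunnel
```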

An IPsec VPN tunnel plus RRAS for NAT/routing costs about 90% less than ExpressRoute, and might resolve most issues you have with a sluggish Outlook or slow-responding Office 365 services.

Happy O365ing.

Using PowerShell to monitor Backups

We are using an RMM that has integrated BackupExec monitoring. I've found that this integrated monitoring was somewhat lacking: it gave us the current job status and that's about it, meaning there was not really a way to resolve issues preemptively.

To resolve this, we've created the following monitoring PowerShell script and integrated it into our RMM solution. For your convenience we'll dissect the script so you can re-use it in your own solution or set it up as a scheduled job 🙂

We’ll start by importing the BEMCLI module which is included in BackupExec 2012 and up.

#Setting default CLI
Import-Module BEMCLI

After importing BEMCLI we'll be able to use the cmdlets that get us the information we want, in this case Get-BEAlert, which returns the current alerts that have not been acknowledged within BackupExec.

#Getting Alerts
$Alerts = Get-BEAlert

Now that we have the alerts stored in a variable, we will loop through the contents to find what we require:

#Looping through the alerts and writing the message to the console
foreach ($Alert in $Alerts) {
    Write-Host $Alert.Message
}

Now that we can see the messages posted in the console, we'd of course like a better overview. The switch below sets a named variable per alert category:

#Looping through the alerts and setting them.
foreach ($Alert in $Alerts) {
    switch ($Alert.Category)
    {
        JobWarning { $JobWarning = "TRUE - $($Alert.Message)" }
        JobFailure { $JobFailure = "TRUE - $($Alert.Message)" }
        JobCancellation { $JobCancellation = "TRUE - $($Alert.Message)" }
        CatalogError { $CatalogError = "TRUE - $($Alert.Message)" }
        SoftwareUpdateWarning { $SoftwareUpdateWarning = "TRUE - $($Alert.Message)" }
        SoftwareUpdateError { $SoftwareUpdateError = "TRUE - $($Alert.Message)" }
        DatabaseMaintenanceFailure { $DatabaseMaintenanceFailure = "TRUE - $($Alert.Message)" }
        IdrCopyFailed { $IdrCopyFailed = "TRUE - $($Alert.Message)" }
        BackupJobContainsNoData { $BackupJobContainsNoData = "TRUE - $($Alert.Message)" }
        JobCompletedWithExceptions { $JobCompletedWithExceptions = "TRUE - $($Alert.Message)" }
        JobStart { $JobStart = "TRUE - $($Alert.Message)" }
        ServiceStart { $ServiceStart = "TRUE - $($Alert.Message)" }
        ServiceStop { $ServiceStop = "TRUE - $($Alert.Message)" }
        DeviceError { $DeviceError = "TRUE - $($Alert.Message)" }
        DeviceWarning { $DeviceWarning = "TRUE - $($Alert.Message)" }
        DeviceIntervention { $DeviceIntervention = "TRUE - $($Alert.Message)" }
        MediaError { $MediaError = "TRUE - $($Alert.Message)" }
        MediaWarning { $MediaWarning = "TRUE - $($Alert.Message)" }
        MediaIntervention { $MediaIntervention = "TRUE - $($Alert.Message)" }
        MediaInsert { $MediaInsert = "TRUE - $($Alert.Message)" }
        MediaOverwrite { $MediaOverwrite = "TRUE - $($Alert.Message)" }
        MediaRemove { $MediaRemove = "TRUE - $($Alert.Message)" }
        LibraryInsert { $LibraryInsert = "TRUE - $($Alert.Message)" }
        TapeAlertWarning { $TapeAlertWarning = "TRUE - $($Alert.Message)" }
        TapeAlertError { $TapeAlertError = "TRUE - $($Alert.Message)" }
        IdrFullBackupSuccessWarning { $IdrFullBackupSuccessWarning = "TRUE - $($Alert.Message)" }
        LicenseAndMaintenanceWarning { $LicenseAndMaintenanceWarning = "TRUE - $($Alert.Message)" }
        default { $OtherErr = "TRUE - $($Alert.Message)" }
    }
}

Now if we put this all together the result would be;

<#
.SYNOPSIS
Gets BackupExec information and reports on running alerts - only works on BackupExec 2012 and higher.
.DESCRIPTION
Using BEMCLI we retrieve data from BackupExec, including multiple types of alerts, LastRunTime, etc.
Currently alerts are generated for the following categories:
JobWarning, JobFailure, JobCancellation, CatalogError, SoftwareUpdateInformation, SoftwareUpdateWarning,
SoftwareUpdateError, DatabaseMaintenanceFailure, IdrCopyFailed, IdrFullBackupSuccess, BackupJobContainsNoData,
JobCompletedWithExceptions, JobStart, ServiceStart, ServiceStop, DeviceError, DeviceWarning, DeviceIntervention,
MediaError, MediaWarning, MediaIntervention, MediaInsert, MediaOverwrite, MediaRemove, LibraryInsert,
TapeAlertWarning, TapeAlertError, IdrFullBackupSuccessWarning, LicenseAndMaintenanceWarning
.LINK
http://www.cyberdrain.com
#>

#Setting default CLI
Import-Module BEMCLI
#Getting Alerts
$Alerts = Get-BEAlert

#Looping through the alerts and setting them.
foreach ($Alert in $Alerts) {
    switch ($Alert.Category)
    {
        JobWarning { $JobWarning = "TRUE - $($Alert.Message)" }
        JobFailure { $JobFailure = "TRUE - $($Alert.Message)" }
        JobCancellation { $JobCancellation = "TRUE - $($Alert.Message)" }
        CatalogError { $CatalogError = "TRUE - $($Alert.Message)" }
        SoftwareUpdateWarning { $SoftwareUpdateWarning = "TRUE - $($Alert.Message)" }
        SoftwareUpdateError { $SoftwareUpdateError = "TRUE - $($Alert.Message)" }
        DatabaseMaintenanceFailure { $DatabaseMaintenanceFailure = "TRUE - $($Alert.Message)" }
        IdrCopyFailed { $IdrCopyFailed = "TRUE - $($Alert.Message)" }
        BackupJobContainsNoData { $BackupJobContainsNoData = "TRUE - $($Alert.Message)" }
        JobCompletedWithExceptions { $JobCompletedWithExceptions = "TRUE - $($Alert.Message)" }
        JobStart { $JobStart = "TRUE - $($Alert.Message)" }
        ServiceStart { $ServiceStart = "TRUE - $($Alert.Message)" }
        ServiceStop { $ServiceStop = "TRUE - $($Alert.Message)" }
        DeviceError { $DeviceError = "TRUE - $($Alert.Message)" }
        DeviceWarning { $DeviceWarning = "TRUE - $($Alert.Message)" }
        DeviceIntervention { $DeviceIntervention = "TRUE - $($Alert.Message)" }
        MediaError { $MediaError = "TRUE - $($Alert.Message)" }
        MediaWarning { $MediaWarning = "TRUE - $($Alert.Message)" }
        MediaIntervention { $MediaIntervention = "TRUE - $($Alert.Message)" }
        MediaInsert { $MediaInsert = "TRUE - $($Alert.Message)" }
        MediaOverwrite { $MediaOverwrite = "TRUE - $($Alert.Message)" }
        MediaRemove { $MediaRemove = "TRUE - $($Alert.Message)" }
        LibraryInsert { $LibraryInsert = "TRUE - $($Alert.Message)" }
        TapeAlertWarning { $TapeAlertWarning = "TRUE - $($Alert.Message)" }
        TapeAlertError { $TapeAlertError = "TRUE - $($Alert.Message)" }
        IdrFullBackupSuccessWarning { $IdrFullBackupSuccessWarning = "TRUE - $($Alert.Message)" }
        LicenseAndMaintenanceWarning { $LicenseAndMaintenanceWarning = "TRUE - $($Alert.Message)" }
        default { $OtherErr = "TRUE - $($Alert.Message)" }
    }
}

Now you can schedule this script in your own RMM or send e-mails based on the result. 🙂 Happy scripting!
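For the e-mail option, a minimal sketch that reports only the flags that were actually set; the SMTP relay and addresses are hypothetical placeholders:

```powershell
# Sketch: e-mail whichever alert flags the switch above populated.
# mail.example.com and the addresses are placeholders; adjust to your environment.
$report = Get-Variable JobWarning, JobFailure, DeviceError, MediaError -ErrorAction SilentlyContinue |
    Where-Object { $_.Value } |
    ForEach-Object { "$($_.Name): $($_.Value)" }
if ($report) {
    Send-MailMessage -SmtpServer "mail.example.com" -From "backup@example.com" `
        -To "alerts@example.com" -Subject "BackupExec alerts" -Body ($report -join "`n")
}
```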