Monday, October 31, 2016

How to Stop Windows 10 Domain Computers reporting "Disable apps to help improve performance"

Create or modify a Group Policy Object that applies to the target computers.

Under Computer Configuration\Policies\Windows Settings\Scripts\Startup, create a PowerShell script entry named "DisableStartupAppTask.ps1"

In the script, have the single line of code:

Disable-ScheduledTask -TaskPath '\Microsoft\Windows\Application Experience\' -TaskName 'StartupAppTask'



Wednesday, June 3, 2015

sFlow: Sampling rates

From http://blog.sflow.com/2009/06/sampling-rates.html:



Sampling rates


A previous posting discussed the scalability and accuracy of packet sampling and the advantages of packet sampling for network-wide visibility.

Selecting a suitable packet sampling rate is an important part of configuring sFlow on a switch. The table in the original post gives suggested values that should work well for general traffic monitoring in most networks. However, if traffic levels are unusually high, the sampling rate may be decreased (e.g. use 1 in 5000 instead of 1 in 2000 for 10Gb/s links).

Configure sFlow monitoring on all interfaces on the switch for full visibility. Packet sampling is implemented in hardware so all the interfaces can be monitored with very little overhead.

Finally, select a suitable counter polling interval so that link utilizations can be accurately tracked. Generally the polling interval should be set to export counters at least twice as often as the data will be reported (see Nyquist-Shannon sampling theory for an explanation). For example, to trend utilization with minute granularity, select a polling interval of between 20 and 30 seconds. Don't be concerned about setting relatively short polling intervals; counter polling with sFlow is very efficient, allowing more frequent polling with less overhead than is possible with SNMP.
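To make the scale-up concrete, here is a back-of-the-envelope sketch (the numbers are invented for illustration): with a 1-in-N sampling rate, each sample stands in for roughly N packets, so dividing the scaled count by the interval gives an estimated packet rate.

```shell
# Estimate packet rate from sFlow samples: each sample represents ~N packets
rate=2000      # configured sampling rate: 1 in 2000
samples=450    # packet samples received during the interval
interval=30    # counter/reporting interval in seconds

awk -v n="$rate" -v s="$samples" -v t="$interval" \
    'BEGIN { printf "estimated rate: %.0f packets/sec\n", s * n / t }'
# prints: estimated rate: 30000 packets/sec
```

The same arithmetic underlies any sFlow collector's traffic estimates; the sampling error shrinks as the number of samples grows, which is why a fixed rate works across many links.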

Sunday, March 1, 2015

How to Convert a PFX (PKCS#12) SSL Certificate to Separate KEY and CRT Files


I've had to look this up a number of times, so I'm posting it here for posterity.

source: http://www.markbrilman.nl/2011/08/howto-convert-a-pfx-to-a-seperate-key-crt-file/

`openssl pkcs12 -in [yourfile.pfx] -nocerts -out [keyfile-encrypted.key]`

This command extracts the private key from the .pfx file. When prompted, type in the import password of the .pfx file. This is the password that you used to protect your key pair when you created your .pfx file. If you cannot remember it any more, you can just throw your .pfx file away, because you won't be able to import it again, anywhere. Once you have entered the import password, OpenSSL asks you to type in another password, twice. This new password will protect your .key file.

Now let’s extract the certificate:

`openssl pkcs12 -in [yourfile.pfx] -clcerts -nokeys -out [certificate.crt]`

Enter the import password once more and your certificate appears.

Now, as I mentioned in the intro of this article, you sometimes need an unencrypted .key file to import on some devices. I probably don't need to mention that you should be careful: if you store your unencrypted key pair somewhere unsafe, anyone can grab it and impersonate, for instance, a website or a person in your company. So always be extra careful when it comes to private keys! Just throw the unencrypted key file away when you're done with it, keeping only the encrypted one.

The command:

`openssl rsa -in [keyfile-encrypted.key] -out [keyfile-decrypted.key]`

Notes:
- When you first extract the key, set a new password (it can be the same one you used for the import), then create an unencrypted key with the rsa command above
- Use an unencrypted key file for NGINX, otherwise it will ask for the password every time it is restarted.
- Check the top of the extracted .crt file for extra bits above the -----BEGIN... line and remove them if necessary
- This certificate needs to be concatenated with the full chain of certificate authorities: `cat domain.crt CA_bundle.crt > final.crt`
- Test the cert with `openssl s_client -showcerts -connect www.domain.com:443`
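To see the whole sequence end to end, here is a self-contained sketch that fabricates a throwaway certificate and PFX first (the file names and the `changeit` password are made up for the demo), then runs the three extraction steps from above:

```shell
# 1. Fabricate a throwaway key + self-signed cert so the demo needs no real PFX
openssl req -x509 -newkey rsa:2048 -keyout demo.key -out demo.crt \
  -days 1 -nodes -subj "/CN=demo.example"

# 2. Bundle them into a PFX, as if this were the file you were handed
openssl pkcs12 -export -in demo.crt -inkey demo.key \
  -out demo.pfx -passout pass:changeit

# 3. Extract the private key (still encrypted; -passout sets its new password)
openssl pkcs12 -in demo.pfx -nocerts -out keyfile-encrypted.key \
  -passin pass:changeit -passout pass:changeit

# 4. Extract the certificate
openssl pkcs12 -in demo.pfx -clcerts -nokeys -out certificate.crt \
  -passin pass:changeit

# 5. Strip the passphrase for devices that can't prompt for one
openssl rsa -in keyfile-encrypted.key -out keyfile-decrypted.key \
  -passin pass:changeit
```

The `-passin`/`-passout` flags just make the demo non-interactive; run the commands without them to be prompted as described above.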

Addendum:

To convert a PFX file to a PEM files:

`openssl pkcs12 -in [yourfile.pfx] -out [certificate.pem] -clcerts`

`openssl pkcs12 -in [yourfile.pfx] -out [cacerts.pem] -cacerts`

To convert a PFX file to a combined PEM file in one step AND remove encryption:

`openssl pkcs12 -in [yourfile.pfx] -out [decrypted.pem] -nodes`
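Whichever route you take, it is worth checking that the certificate and key you ended up with actually belong together. A quick sketch (using a throwaway pair generated on the spot, so the file names are illustrative): the public key embedded in the certificate must match the one derived from the private key.

```shell
# Generate a throwaway pair just to demonstrate the check
openssl req -x509 -newkey rsa:2048 -keyout pair.key -out pair.crt \
  -days 1 -nodes -subj "/CN=pair.example"

# Public key as recorded in the certificate
certpub=$(openssl x509 -in pair.crt -noout -pubkey)
# Public key derived from the private key
keypub=$(openssl pkey -in pair.key -pubout)

if [ "$certpub" = "$keypub" ]; then
  echo "certificate and key match"
else
  echo "MISMATCH: wrong key for this certificate"
fi
```

Run the same two extraction commands against your real .crt and .key files; a mismatch here explains most "server won't start" errors after a conversion.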


Tuesday, November 25, 2014

Enforce Google Safe Search

So Google is no longer going to permit the nosslsearch DNS trick that previously allowed organisations to disable SSL for searches so that Safe Search could be enforced.

Google Online Security Blog: An update to SafeSearch options for network administrators

The option that they now permit is a DNS trick: point users to forcesafesearch.google.com, which is still SSL-enabled but does not allow the user to disable Safe Search.

The only way to ensure this for all Google search engines is to create a DNS zone for each of Google's search domains... all 193 or so.

Microsoft doesn't let you create a CNAME entry for the parent zone, but it does allow you to create a DNAME entry, so I came up with this script to create all of the zones.

The script, the google.txt file and some basic instructions can be found here.

(I added the length check because the original text file had some carriage returns at the end.)

As always, no responsibility is accepted for its use.

 param([string]$inputfile="google.txt")  
 #Check for the Input file  
 if (Test-Path $inputfile)  
      {  
      write-output "Input file located"  
      }  
 else  
      {  
      write-output "Please supply file containing google zone list"  
      exit  
      }  
 #Process each line in the Input file and create a zone and DNAME record  
 foreach ($zone in Get-Content $inputfile)  
      {  
      $count++  
      $len = $zone.Length  
      if ($len -gt 5)  
           {  
           $zone = "www" + $zone  
           write-output "Processing entry $($count). Creating zone for $($zone)"  
           dnscmd /zoneadd $zone /dsprimary  
           write-output "Processing entry $($count). Creating DNAME entry for $($zone)"  
           dnscmd /recordadd $zone "@" DNAME forcesafesearch.google.com  
           }  
      else  
           {  
           write-output "Zone data for entry $($count) too short. Not processing."  
           }  
      }  

Resize User Photos and Import them into Active Directory Accounts


Resize User Photos and Import them into Active Directory Accounts using PowerShell and ImageMagick.

This script looks in a specified path for photos named with the EmployeeID attribute of the users in a specified OU, resizes the images to the correct size, and then writes them into the thumbnailPhoto attribute of each user's Active Directory account.

As always, no responsibility is accepted for its use.

 param([string]$searchbase , [string]$imagepath)  
 #Import the ActiveDirectory PowerShell module  
 import-module ActiveDirectory  
 #Check for Mandatory Parameters  
 if (!$searchbase)  
      {  
      write-output 'Usage: ADImages {searchbase} {imagepath}'  
      write-output 'eg. ADImages "OU=Staff,OU=Users,DC=orgname,DC=com,DC=au" \\fileserver\Userimages'  
      exit  
      }  
 if (!$imagepath)  
      {  
      write-output 'Usage: ADImages {searchbase} {imagepath}'  
      write-output 'eg. ADImages "OU=Staff,OU=Users,DC=orgname,DC=com,DC=au" \\fileserver\Userimages'  
      exit  
      }  
 #Check if the Searchbase exists  
 $OUCheck = [adsi]::Exists("LDAP://$($searchbase)")  
 if ($OUCheck -eq "True")   
      {  
      write-output "Found Searchbase $($searchbase)"  
      }  
 else  
      {  
      write-output "Searchbase $($searchbase) not found"  
      exit  
      }  
 #Check that the Image Path exists  
 $ImageCheck = Test-Path $imagepath  
 if ($ImageCheck -eq "True")  
      {  
      write-output "Found Image Path $($imagepath)"  
      }  
 else  
      {  
      write-output "Image Path $($imagepath) not found"  
      exit  
      }  
 #Check for the ImageMagick Conversion Tool  
 $ToolCheck = Test-Path ".\ImageMagick\convert.exe"  
 if ($ToolCheck -eq "True")  
      {  
      write-output "ImageMagick tool found"  
      }  
 else  
      {  
      write-output "ImageMagick tool not found. Download from http://www.imagemagick.org/"  
      exit  
      }  
 #Create the Thumbnail directory if it doesn't exist  
 $DirCheck = Test-Path ".\ADThumbs"  
 if ($DirCheck -eq "True")  
      {  
      write-output "Thumbnail directory already exists"  
      }  
 else  
      {  
      write-output "Creating Thumbnail directory"  
      New-Item -ItemType directory -Path .\ADThumbs  
      }  
 #Get an array of users from the Searchbase  
 $UserList = Get-ADUser -Filter * -SearchBase $searchbase  
 Foreach ($User in $UserList)  
      {  
      #Get the EmployeeID Attribute  
      $EmpID = Get-ADUser $User -Properties employeeID | select -expand employeeID  
      write-host "Looking for Employee Photo for User $($User) with ID $($EmpID)"  
      #Tests to see if the UserImages file exists  
      $FileCheck = Test-Path "$($imagepath)\$($EmpID).jpg"  
      if ($FileCheck -eq "True")   
           {  
           #Retrieves JPG files of the target user from the UserImages share  
           $jpgfile = "$($imagepath)\$($EmpID).jpg"  
           $newjpgfileName = ".\ADThumbs\$($EmpID)-AD.jpg"  
           write-output "Scaling $($jpgfile) to $($newjpgfileName)"  
           .\ImageMagick\convert $jpgfile -thumbnail 96 -gravity center -crop 96x96+0-15 +repage -strip $newjpgfileName   
           #Write the thumbnail photo back to the AD user Account  
           $photo = [byte[]](Get-Content $newjpgfileName -Encoding byte)  
           Set-ADUser $User -Replace @{thumbnailPhoto=$photo}  
           }  
      else  
           {  
           #User Image file not found  
           write-output "Employee ID $($EmpID) not found in $($imagepath)"  
           }  
      }  

Monday, February 10, 2014

File Path manipulation in Excel

Saw this over at stackoverflow. Had to make a note of it for future reference.

http://stackoverflow.com/questions/18617349/excel-last-character-string-match-in-a-string


Let's say for example you want the right-most \ in the following string (which is stored in cell A1):
Drive:\Folder\SubFolder\Filename.ext
To get the position of the last \, you would use this formula:
=FIND("@",SUBSTITUTE(A1,"\","@",(LEN(A1)-LEN(SUBSTITUTE(A1,"\","")))/LEN("\")))
That tells us the right-most \ is at character 24. It does this by looking for "@" and substituting the very last "\" with an "@". It determines the last one by using
(len(string)-len(substitute(string, substring, "")))/len(substring)
In this scenario, the substring is simply "\" which has a length of 1, so you could leave off the division at the end and just use:
=FIND("@",SUBSTITUTE(A1,"\","@",LEN(A1)-LEN(SUBSTITUTE(A1,"\",""))))
Now we can use that to get the folder path:
=LEFT(A1,FIND("@",SUBSTITUTE(A1,"\","@",LEN(A1)-LEN(SUBSTITUTE(A1,"\","")))))
Here's the folder path without the trailing \
=LEFT(A1,FIND("@",SUBSTITUTE(A1,"\","@",LEN(A1)-LEN(SUBSTITUTE(A1,"\",""))))-1)
And to get just the filename:
=MID(A1,FIND("@",SUBSTITUTE(A1,"\","@",LEN(A1)-LEN(SUBSTITUTE(A1,"\",""))))+1,99)
However, here is an alternate version of getting everything to the right of the last instance of a specific character. So using our same example, this would also return the file name:
=TRIM(RIGHT(SUBSTITUTE(A1,"\",REPT(" ",99)),99))
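For comparison, the same last-backslash split is a one-liner outside Excel. A sketch with shell parameter expansion, using the example string from above:

```shell
p='Drive:\Folder\SubFolder\Filename.ext'
echo "${p##*\\}"   # strip everything through the last \ : prints Filename.ext
echo "${p%\\*}"    # strip from the last \ onward: prints Drive:\Folder\SubFolder
```

`##` removes the longest matching prefix and `%` the shortest matching suffix, which is exactly the "find the last instance of a character" trick the formulas emulate.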

Sunday, May 19, 2013

Memory Leak in Windows 8 Network Data Usage Monitoring Driver

Just thought I'd share this experience I had over the weekend, as it may save someone else many hours of troubleshooting.

I've been tinkering around with Windows 8 at home, even though I know there's little likelihood that we'll implement it at work any time soon.

While using my Windows 8 machine to copy a large amount of files from my NAS to a USB drive, I was experiencing lock-ups of my system. It wasn't a complete crash. The system would just become extremely unresponsive.

It soon became apparent that something was leaking memory. I was seeing the amount of memory being consumed skyrocket up to 100%, at which point the copy process would crash and the system would stop responding. The task manager and performance monitor were not attributing the memory to any process, however.

I tried using robocopy instead of Explorer copy. Same thing.

I tried updating the Realtek network driver, USB 3 driver and even the ASUS BIOS, (as they were all a few versions behind). Same thing.

I was getting to the point where I was figuratively scratching my head, so I tried booting into safe mode with networking. Aha! The memory usage stayed consistent and the copy performed just fine!

There are a number of network-related drivers that safe mode doesn't load. DriverView showed that one of them is the Windows Network Data Usage Monitoring Driver (ndu.sys), which was introduced in Windows 8 and provides "network data usage monitoring functionality".

Disabling this driver by changing the Start value to 4 in HKLM\SYSTEM\CurrentControlSet\Services\Ndu solved the problem.
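For reference, the same change can be packaged as a .reg file (a sketch; 4 is the standard SERVICE_DISABLED start type):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Ndu]
"Start"=dword:00000004
```

A reboot is needed for the change to take effect; to undo it, restore the original Start value.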

Maybe this will be fixed when Microsoft releases Blue.