Automate your SUS clean up

Many companies rely on WSUS – aka Windows Server Update Services – from Microsoft as the internal source and control point for update deployment to the clients and servers within their network.

One of the big challenges for IT is to keep it clean and performant. The cleanup assistant in the SUS management console tends to run forever and in any case means manual labor over and over again.

Below are two scripts – a CMD script that needs to be adjusted with your parameters and a PowerShell script that will be called with those parameters. The scripts actually call the same API as the MMC assistant does, just that this can be performed automatically via a scheduled task in Windows.

It helps you keep your SUS slim and performant.

In any case – I highly recommend not blindly enabling all categories, but rather limiting them to the ones you actually have in place, and, once you have reached a certain patch level, even actively declining updates you will never need again (keep in mind that newly rolled-out systems might still need older updates – but you could possibly refresh your base images or rely on Microsoft update services / online updates for those cases).

The combination of declining obsolete updates and actually running a cleanup periodically will improve your SUS server performance.
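If you are on Windows Server 2012 or newer, the built-in UpdateServices PowerShell module exposes the same cleanup API that the scripts (and the MMC assistant) use. A rough sketch – server name, port and the selected cleanup options are only examples, not the exact script below:

```powershell
# Rough sketch only - assumes the UpdateServices module (Server 2012+) on the WSUS box;
# server name, port and the chosen cleanup options are examples, adjust as needed.
Import-Module UpdateServices

$wsus = Get-WsusServer -Name "localhost" -PortNumber 8530

# Roughly what the MMC cleanup assistant does, runnable from a scheduled task
$wsus | Invoke-WsusServerCleanup `
    -DeclineSupersededUpdates `
    -DeclineExpiredUpdates `
    -CleanupObsoleteUpdates `
    -CleanupObsoleteComputers `
    -CleanupUnneededContentFiles `
    -CompressUpdates
```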

As for the parameters, those are explained in the CMD script header – therefore I will not explain them here again.

Script to remove RemoteApp and Desktop Connections

RADC or RemoteApp and Desktop Connections are very powerful in combination with Windows 7 or newer. You can actually have Terminal Server or RDS / Remote Desktop Services applications in the user's Start menu and connect to them as seamless windowed applications.

Windows 7 made it challenging to even implement those connections at large scale; for this sole purpose you had to use a PowerShell script that actually imported a WCX file. Windows 8 and especially Windows 10 can do this via GPO nowadays.

The GPO setting allows one RDS farm to be added, and it will of course remove the RDS farm again if the GPO is changed or removed.

But what about those Windows 7 clients that are still out there, and those cases where you have other RDS / RADC connections that you want to delete, e.g. manually created ones? I just came across this scenario and wanted to share the script I wrote. I created two files in order to execute it simply via GPO with cscript, to avoid any dialog boxes coming up.

The .CMD executes the .VBS and expects it in the same directory, of course. In the .VBS you will need to change the 5th line – as indicated. Everything else you can leave as is. Of course this script will only delete the specified connection. You could define the line 5 parameter and change line 33 from

to the following line

This would result in deleting everything but the defined connection and therefore do a cleanup. In theory you could then put an empty string in line 5 and just clean up everything.
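Not the VBS itself, but as a rough PowerShell illustration of the same cleanup idea: per-user RADC connections are – as far as I can tell – tracked as GUID sub-keys under HKCU\Software\Microsoft\Workspaces\Feeds, so you can enumerate them and remove everything that does not match the connection you want to keep. The key layout, the "URL" value name and the feed URL below are assumptions, so test with -WhatIf first:

```powershell
# Assumption: per-user RADC feeds live as GUID sub-keys under this registry path,
# each with a "URL" value - verify on your build before deleting anything.
$KeepUrl  = "https://rds.example.com/RDWeb/Feed/webfeed.aspx"   # placeholder feed to keep
$FeedRoot = "HKCU:\Software\Microsoft\Workspaces\Feeds"

if (Test-Path $FeedRoot) {
    Get-ChildItem $FeedRoot | ForEach-Object {
        $url = (Get-ItemProperty $_.PSPath).URL
        if ($url -ne $KeepUrl) {
            Write-Host "Removing RADC feed $url"
            Remove-Item $_.PSPath -Recurse    # add -WhatIf to test first
        }
    }
}
```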

As always, I hope some of you find this helpful.

Script based SQL Express backups

SQL Express is widely used but has a huge downside: there is no SQL Agent available. Even Windows internal databases, especially for WSUS / Windows Update Services / Microsoft Update Services, are in the end SQL Express-like databases that do not have a SQL Agent.

Now, you can have central SQL Servers with Agents and have them backed up – and I recommend doing so if possible. But for the many times this is not possible, you will need to find another way to create those nice little .BAK files, aka SQL maintenance plan backups. To work around this issue, I once wrote a script that automates this for each database found on a specific SQL server. It creates the backups via SQLCMD commands and even does a cleanup of obsolete files (files older than x days), almost like SQL maintenance plans do it.

The script is divided into a .CMD file, where you set the configuration/parameters and which executes the actual backup, and a .VBS file that is controlled by the .CMD script and performs the backup cleanup. In the end you can have the .CMD send an email report – I used the SMTPSEND program from Michael Kocum (https://www.dataenter.com/download.asp) for this since I already had it flying around – you could replace the mail option with another SMTP send client, a VBS script, or just remove it completely.
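The full CMD/VBS pair follows below; as a condensed illustration of the core idea (server name, backup path and retention are placeholders, not the posted script), the same loop could look like this in PowerShell around sqlcmd:

```powershell
# Condensed sketch of the same idea - back up every user database via sqlcmd and
# delete .BAK files older than $KeepDays. Server, path and retention are placeholders.
$SqlServer  = ".\SQLEXPRESS"
$BackupPath = "D:\SQLBackups"
$KeepDays   = 3

# List user databases (-E = Windows authentication, -h -1 = no header row)
$dbs = sqlcmd -S $SqlServer -E -h -1 -W -Q "SET NOCOUNT ON; SELECT name FROM sys.databases WHERE database_id > 4;"

foreach ($db in $dbs | Where-Object { $_ -and $_.Trim() }) {
    $db   = $db.Trim()
    $file = Join-Path $BackupPath ("{0}_{1:yyyyMMdd_HHmm}.bak" -f $db, (Get-Date))
    sqlcmd -S $SqlServer -E -Q "BACKUP DATABASE [$db] TO DISK = N'$file' WITH INIT;"
}

# Clean up backups older than the retention window
Get-ChildItem $BackupPath -Filter *.bak |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$KeepDays) } |
    Remove-Item
```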

Adjusting the settings / parameters:

This is all done in the SQLBACKUP.CMD file – the header section pretty much explains all you need to know, from SQL server, SQL user and password to mail server and recipients.

If you want to execute the SQL backups as the Windows user that is executing the script, you need to exchange the REM (remark) markers on the two corresponding lines further down in the script. I apologize for the inconvenience; this is an old script that I never updated to have those settings in the header (more automated), I always just changed the lines.

Everything else should be rather easy. Of course you will need sufficient access rights to the SQL databases and your destination backup folder. The Task Scheduler might work best if you execute the script with “cmd /c c:\scripts\sqlbackup.cmd” (change the path as you need it) and set the working directory / startup folder correctly. It might help to execute the task with elevated rights etc. – all depending on your system's configuration.
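For the scheduled task itself, here is a quick sketch using the ScheduledTasks module (Windows 8 / Server 2012 or newer) – task name, schedule and account are only examples:

```powershell
# Example only - runs the backup script daily at 02:00 as SYSTEM with highest privileges.
$action  = New-ScheduledTaskAction -Execute "cmd.exe" `
            -Argument "/c c:\scripts\sqlbackup.cmd" `
            -WorkingDirectory "c:\scripts"
$trigger = New-ScheduledTaskTrigger -Daily -At "02:00"
Register-ScheduledTask -TaskName "SQL Express Backup" -Action $action `
    -Trigger $trigger -User "SYSTEM" -RunLevel Highest
```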

Below are the two scripts – I hope this helps some of you. The generated .BAK files can simply be restored in SQL via the GUI because they are native SQL backup files.

SQL Database backup monitoring

The following article was originally posted here by myself: https://kb.paessler.com/en/topic/79665-sql-database-backup-monitoring

SQL backups and their monitoring are among the most important things. We often talk about rather complex situations including transaction log files and various other stuff.

Monitoring those things used to cost us too many sensors with the standard scripts etc. and was not really effective.

In order to change this, here are two scripts that will be able to solve most of your issues – you will find them both at the end of this posting:

  • SQLBackupDestinationCheck.vbs
  • SQL_Database_Full_Backups.sql

SQLBackupDestinationCheck.vbs: This is a VBS script that will return XML content to Paessler/PRTG in multiple channels while using one sensor.

It expects three parameters:

  • go through first level sub folders: 0 (no) or 1 (yes)
  • file extension to obey – any other extension will be ignored – in most cases: “bak”
  • Path – should mostly be a UNC path

It will return those channels:

  • Total file count: count of all files with this extension in all folders checked
  • Total folder count: count of all folders checked
  • Oldest file found in days: oldest file – value gives back age in days
  • Newest file found in days: newest file – value gives back age in days
  • Lowest files in folder count found: lowest count of files that have been found in one folder
  • Highest files in folder count found: max. files that have been found in one folder

This needs some explanation:
The script checks a path for files with a certain extension. Let’s say you do SQL maintenance plans and use the extension .BAK to write those. You do a daily backup and keep the files for 3 days to make sure they end up on a tape; furthermore, you use sub-folders per database and you have a total of 5 databases on this system – now you will need to configure error limits per channel, e.g.:

  • Total file count: lower limit: 10 files – upper limit: 20 files – during the backup you might have up to 20 files
  • Total folder count: 5 folders upper and lower limit – more/less than 5 would mean something changed
  • Oldest file found in days: lower limit 2 days – upper limit 4 days – older than 4 would mean the cleanup does not work
  • Newest file found in days: lower limit 0 days – upper limit 2 days – nothing newer (date issues?) and nothing older as well
  • Lowest files in folder count found: lower limit: 2 – there should always be more than 2x .BAK files in any subfolder
  • Highest files in folder count found: upper limit: 4 – anything above again would mean some cleanup is not working right for one database

So – keep in mind – you can get fancier with WARNING limits and ERROR limits – the example above will help you understand what to do and should get you started. The script will save you quite a few sensors and still keep a pretty close watch on the file-system side of SQL backups – of course you could abuse it for something other than SQL backups as well – but this was my main intent for this script.
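For reference, an EXE/Script Advanced sensor simply has to print XML like the snippet below to standard output. This trimmed PowerShell sketch shows the format with a few of the channels – path and extension are examples; the VBS at the end of the post is the full sensor with all six channels and the three parameters:

```powershell
# Trimmed sketch of the idea - scan a path for backup files and print PRTG
# "EXE/Script Advanced" XML to stdout; the posted VBS is the full sensor.
param(
    [string]$Path      = "\\backupserver\sqlbackups",   # example UNC path
    [string]$Extension = "bak"
)

$files = Get-ChildItem -Path $Path -Recurse -Filter "*.$Extension" -File
$now   = Get-Date

$oldest = if ($files) { [int]($now - ($files | Sort-Object LastWriteTime)[0].LastWriteTime).TotalDays } else { 0 }
$newest = if ($files) { [int]($now - ($files | Sort-Object LastWriteTime -Descending)[0].LastWriteTime).TotalDays } else { 0 }

@"
<prtg>
  <result><channel>Total file count</channel><value>$($files.Count)</value></result>
  <result><channel>Oldest file found in days</channel><value>$oldest</value></result>
  <result><channel>Newest file found in days</channel><value>$newest</value></result>
</prtg>
"@
```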
SQL_Database_Full_Backups.sql: This file requests information about the backups from SQL itself. It might need SQL 2005 or newer to work – and yes, I did post this on another PRTG KB thread – but I wanted to have the complete solution in this one post.

The script will be executed against the SQL Server instance the databases reside on, against the master database. You need to specify a parameter that will be set as “@MaxHours” – this should be something like 26 hours, so the age of your SQL backups never exceeds 26 hours, giving the backup some time to run as well. Bigger databases might need more. If you do multiple full backups per day, set it to e.g. 2 hours or whatever your limit is.

You will get back three columns:

  • TotalAmountOfDatabases: total amount of databases on this server – this allows you not only to watch whether anyone created/deleted a database on the server, it also gives you a good baseline in general
  • RecentlyBackupUpCount: how many databases have been backed up recently – full backup – within the specified time window
  • NOTRecentlyBackupUpCount: how many have not been backed up in the same time window

RecentlyBackupUpCount and NOTRecentlyBackupUpCount should always add up to TotalAmountOfDatabases – but that’s not the point. More important is that you see both backed-up and not-backed-up databases – set your error limits for all three columns accordingly – upper and lower limit – and you will see that the alert fires if you add a database or keep the SQL Agent service stopped so it hops over a single backup and misses it…
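The query logic itself boils down to comparing sys.databases against recent full-backup rows in msdb.dbo.backupset. The posted .sql file (location below) is the real sensor; this is just a rough stand-alone sketch of that logic, run here via Invoke-Sqlcmd from the SqlServer module (instance name and @MaxHours value are examples):

```powershell
# Rough sketch of the query's logic - the posted SQL_Database_Full_Backups.sql is the real sensor.
# Requires the SqlServer module; instance and @MaxHours value are examples.
$query = @"
DECLARE @MaxHours int;
SET @MaxHours = 26;

SELECT
    (SELECT COUNT(*) FROM sys.databases WHERE name <> 'tempdb') AS TotalAmountOfDatabases,
    (SELECT COUNT(DISTINCT d.name)
       FROM sys.databases d
       JOIN msdb.dbo.backupset b ON b.database_name = d.name
      WHERE b.type = 'D'   -- 'D' = full backup
        AND b.backup_finish_date >= DATEADD(hour, -@MaxHours, GETDATE())
        AND d.name <> 'tempdb') AS RecentlyBackupUpCount,
    (SELECT COUNT(*) FROM sys.databases WHERE name <> 'tempdb')
    - (SELECT COUNT(DISTINCT d.name)
         FROM sys.databases d
         JOIN msdb.dbo.backupset b ON b.database_name = d.name
        WHERE b.type = 'D'
          AND b.backup_finish_date >= DATEADD(hour, -@MaxHours, GETDATE())
          AND d.name <> 'tempdb') AS NOTRecentlyBackupUpCount;
"@

# tempdb is excluded here because it cannot be backed up - the original file may handle this differently.
Invoke-Sqlcmd -ServerInstance ".\SQLEXPRESS" -Database master -Query $query
```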

Folder: C:\Program Files (x86)\PRTG Network Monitor\Custom Sensors\EXEXML
File: SQLBackupDestinationCheck.vbs

Folder: C:\Program Files (x86)\PRTG Network Monitor\Custom Sensors\sql\mssql
File: SQL_Database_Full_Backups.sql