Closing the doors – Standalone User Accounts

I’ve been involved with a lot of cyber security projects recently, and it got me thinking about all the protection available in a modern OS that people simply don’t enable. Having a mechanism to secure you and not enabling it is like having a door to your home that you leave open for all to walk through, day or night, whether you know them or not. Sounds daft, right…

In this “close the door” series I’ll be taking you through some easy-to-implement security measures to help protect yourself and your business against cyber attacks. Follow the steps to get protected!

In this episode we look at user accounts on standalone Windows PCs.



Steps in Summary:

  1. Set up the PC
  2. Create another user account and set its permissions to Administrator
  3. Logged in as the newly created account, change the original user account to Standard user
  4. Reboot
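The same steps can be scripted with the built-in LocalAccounts cmdlets (Windows 10 onwards). A sketch only – the account names "AdminUser" and "OriginalUser" are placeholders for your own:

```powershell
# Create the new administrator account (you'll be prompted for a password)
New-LocalUser -Name "AdminUser" -Password (Read-Host -AsSecureString "Password")

# Grant it administrator rights
Add-LocalGroupMember -Group "Administrators" -Member "AdminUser"

# Demote the original account to a standard user by removing it
# from the Administrators group (it remains in the Users group)
Remove-LocalGroupMember -Group "Administrators" -Member "OriginalUser"

# Reboot to pick up the change
Restart-Computer
```

Log in as the new admin account before running the demotion step, so you don’t lock yourself out mid-change.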



Powershell – Update AD HomeDirectories

PowerShell is great! Especially if you need to carry out bulk operations at scale. Here is a useful script I’ve just used to update (in this case, clear) the Active Directory HomeDirectory and HomeDrive settings for all users within an AD.

$users = Get-ADUser -Filter *

foreach ($user in $users) { Set-ADUser $user -Clear HomeDirectory,HomeDrive }

Breaking it down, the Get-ADUser cmdlet populates the $users variable with a list of users. We then jump into a loop which runs the Set-ADUser cmdlet against each user in turn until the end of the $users list is reached.
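If you want to set new home directories rather than clear them, the same loop pattern works. A sketch, assuming a hypothetical file share \\fileserver\home$ and drive letter H: – substitute your own:

```powershell
# Point every user's home directory at a per-user folder on the file server
$users = Get-ADUser -Filter *

foreach ($user in $users) {
    Set-ADUser $user `
        -HomeDirectory "\\fileserver\home$\$($user.SamAccountName)" `
        -HomeDrive "H:"
}
```

Consider narrowing the -Filter (or using -SearchBase to target an OU) before running anything like this against every account in the domain.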

As with everything, be careful before applying this and ensure you know what you’re doing first. If in doubt, let me know.

Windows Server 2016 Common Issues

So you’ve gone and installed Windows Server 2016 and almost instantly you’re receiving error messages in the event logs and service warnings in Server Manager… do not panic. Continue reading…

Note: this will be a continually evolving article; if you feel anything is missing, let me know so it can be added.


DistributedCOM – Event 10016

This one throws an error on almost every Windows Server 2016 installation I’ve done.

DCOM event 10016 – Windows Server 2016 DistributedCOM error


The processes mentioned don’t have permission to the DCOM components listed in the event. Despite this being logged as an “error” (normally something to be concerned about), in this case Microsoft’s advice is to ignore the event; there is no impact on the system.
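If you want to see how often the event is being raised before deciding to ignore it, a quick illustrative query:

```powershell
# List the ten most recent DistributedCOM 10016 events from the System log
Get-WinEvent -FilterHashtable @{
    LogName      = 'System'
    ProviderName = 'Microsoft-Windows-DistributedCOM'
    Id           = 10016
} -MaxEvents 10 | Format-Table TimeCreated, Message -AutoSize
```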


Desktop Experience

With the GUI installed, a lot of services that belong in a workstation OS (Windows 10) become enabled, including services for syncing emails, photos, etc. across applications. As it’s a server you don’t need these; they can be disabled using this script from TechNet.

Here is the TechNet guidance on services in Windows Server 2016 with Desktop Experience.
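As a flavour of what the script does, here is disabling a single workstation-oriented service by hand. MapsBroker (the Downloaded Maps Manager) is just one example of a service rarely needed on a server – check the TechNet guidance before disabling anything:

```powershell
# Stop the service now and prevent it starting at boot
Stop-Service -Name "MapsBroker" -ErrorAction SilentlyContinue
Set-Service -Name "MapsBroker" -StartupType Disabled
```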


CAPI2 Error 513

This is something that appears when VSS-related tasks are run and looks like this…


CAPI2 error event 513

The issue is caused because the VSS System Writer (the writer being triggered) doesn’t have permission to read some of the information needed to perform a full system backup – specifically, service accounts lack read access to the MSLLDP driver service. To resolve:

  1. Open Notepad
  2. Open an administrative Command Prompt
  3. Run the command “sc sdshow mslldp” in the cmd window
  4. Copy its output into Notepad
  5. Append the value (A;;CCLCSWLOCRRC;;;SU) to the end of the D: (DACL) portion of the output
  6. Copy the contents of Notepad to the clipboard
  7. Run the command “sc sdset mslldp <pasted contents>”, pasting the modified descriptor in place of <pasted contents>
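The same steps as a single illustrative PowerShell sketch. As above, check the output before applying anything – sdset replaces the whole security descriptor:

```powershell
# Read the current security descriptor for the MSLLDP service
$sd = (sc.exe sdshow mslldp | Where-Object { $_ }) -join ''

# Grant service accounts (SU) read access by inserting the ACE at the
# end of the DACL (i.e. before the SACL, if an S: section is present)
if ($sd -match 'S:') {
    $newSd = $sd -replace 'S:', '(A;;CCLCSWLOCRRC;;;SU)S:'
} else {
    $newSd = $sd + '(A;;CCLCSWLOCRRC;;;SU)'
}

# Apply the updated descriptor
sc.exe sdset mslldp $newSd
```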



NHS Ransomware Attack – Initial Report

12th May 2017 – the day the NHS floundered as it struggled to cope with a ransomware infection affecting around 40 sites. In some locations hospitals and medical centres were unable to cope, turning away patients, and good luck to anyone attempting to access medical records. Initial sources say that around 15:30 online resources were also affected.

You may be thinking this is some kind of state-sponsored attack; unfortunately it appears not. This ransomware attack looks to be the result of an opportunistic infection designed to encrypt files on machines and hold them to ransom. Anyone with an infected machine sees this…


Based on some reports, this is the result of a group piecing together known exploits to which older OSes are vulnerable.

With Windows XP (yes, XP) running many services, it’s not surprising that something so rudimentary has caused issues.

My main questions are: given that the NHS is so heavily reliant on its computer systems, how was such an oversight made with system protection, and why was there no immediate business continuity failover plan in place? No doubt NHS IT system admins are going to bear the brunt of this, but they shouldn’t; business continuity planning takes place at an organisational level, with IT departments assisting the business plans through technological means.

So, what options are available to the NHS at this moment?

  1. Pay up – pay the criminals (evidence suggests this has already happened at some sites), although there’s no guarantee that paying will result in being able to retrieve data.
  2. Restore and rebuild – restore to the most recent version of the data available and roll out new software images onto the hardware. Note: if any NHS system admins are reading this, for God’s sake DO NOT install Windows XP! Good luck though!

TechUG Recap – Birmingham

Today I attended my first TechUG (@TechUG on Twitter), a community where IT professionals can meet to discuss IT-related topics. The sessions I caught today in Birmingham included speakers from CommVault, Veeam, Zerto and Nimble Storage – all reputable vendors and all very willing to share technical details and stories in an informal atmosphere. Following a quick Twitter exchange with the organisers I was invited to speak on the recent containerisation of the Gardner Systems service desk with Docker… I think it went down well. People were making notes anyway… (TechUG blog)

Mr Fitz talking Docker TechUG November 2016.


I’m currently writing this blog post in an offline Google Doc due to poor wifi on this train. I’m impressed with the functionality of the app, given that it’s offline and all.

What did I take away? The main thing for me today was the Veeam session on the new features of Veeam 9.5, presented by Geoff Buckingham (Veeam Systems Engineer). If you’re already a Veeam user, upgrading is easy – download and install – which is nice.

  • Veeam Direct Restore into Microsoft Azure – brings the ability to manage the restore of a machine into Microsoft Azure from the Veeam console. Useful for migrations and DR. (I feel a blog post coming on for this.)
  • Backup I/O Control – monitors latency on production storage during the backup process, throttling to keep it beneath specified thresholds. Backups can be performed during business hours without impacting the business.
  • Veeam Backup for Office 365 – exactly what it says really: backup for Office 365, to an onsite or cloud destination. Handy.

Zerto gave a good presentation too, around ransomware and how their always-on replication can help recover from such an attack. It was agreed over lunch that whilst having a valid recovery point is vital, to protect the business sufficiently a layered approach to security should be taken – e.g. Varonis to produce early-warning alerts when the file system is being encrypted and Zerto for recovery, perhaps in conjunction with Sophos Intercept X.

One attendee I spoke with said: “I can come here or do a training course. My boss doesn’t like training as it costs money.” For me, training is for skilling up once you know what path you’re going to take; events like TechUG give you a view of what the industry is doing and help you decide where you want to go. I really enjoy these types of events – being able to discuss technology with like-minded individuals is really valuable, discovering new ways of doing things and how the industry is changing.

Massive thanks to Mike England and Brendan Higgins for their work in organising the event. If you’re interested in attending, check out the TechUG website for events near you.

Using Docker Containers

Using Docker containers… if you haven’t started with containers yet, there are two ways to get going:

  1. Docker (download and install info here)
  2. Windows Containers (follow my blog post for setup here)

Presuming you have Docker up and running, let’s run through the basics. Developing with Docker is fast(!!!); images are pulled from a central location (a registry), making it easier to ensure developers within your team have the same starting configuration. No more complaints of “well, it worked on my machine” or “it didn’t do that when I tested it” – if it runs a particular way in Docker, it will always run that way regardless of platform.

The basic steps of development:

  1. Pull an image from the registry
  2. Customise the container to suit
  3. Push the image back to the registry to allow others to pull it for dev / production
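The steps above can be sketched on the command line like so. The image and registry names are placeholders – substitute your own (microsoft/nanoserver is just an example Windows base image):

```shell
# 1. Pull a base image from the registry
docker pull microsoft/nanoserver

# 2. Customise it: run a container, make your changes,
#    then commit the result as a new image
docker run -it --name devbox microsoft/nanoserver cmd
docker commit devbox myregistry/myteam/devimage:1.0

# 3. Push the customised image back for the rest of the team
docker push myregistry/myteam/devimage:1.0
```

For anything beyond quick experiments, a Dockerfile plus docker build is the more repeatable way to do step 2, since the customisation is then captured as code.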

Stacking the Containers

If you haven’t been living under a rock recently you’ve no doubt heard a buzz around containers. Don’t get me wrong, it’s not exactly new technology – Docker has been doing it since 2013 – but with Microsoft having recently incorporated the technology into Windows Server 2016, and even added the capability to Windows 10, it will certainly see wider adoption.

If you’re like me you’re probably looking for something to help you get started… look no further.

Microsoft recently published this guide. One thing that’s not mentioned is the updates needed: you must have Windows update KB3194496 installed, otherwise things won’t work – run Windows Update. You’ll also need a processor with Intel VT-x (this feature is currently only available on Intel processors).

So with that in mind, all systems go! Fire up PowerShell as Administrator.

Enable-WindowsOptionalFeature -Online -FeatureName containers -All

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All

Reboot the machine and then set about getting docker installed.

Invoke-WebRequest "" -OutFile "$env:TEMP\" -UseBasicParsing

Expand the archive into the Program Files directory:
Expand-Archive -Path "$env:TEMP\" -DestinationPath $env:ProgramFiles
Add Docker to the Path so that its commands can be resolved. Run both of these commands:

# For quick use, does not require shell to be restarted.
$env:path += ";c:\program files\docker"

# For persistent use, will apply even after a reboot.
[Environment]::SetEnvironmentVariable("Path", $env:Path + ";C:\Program Files\Docker", [EnvironmentVariableTarget]::Machine)

Then set it up to start as a service. You’ll want to register Docker as a service using:
dockerd --register-service
Then start the service with:
Start-Service Docker
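With the service started, a quick sanity check confirms the client can talk to the daemon (the nanoserver image here is just an illustrative Windows base image):

```shell
# Confirm both the client and the server (daemon) respond
docker version

# Pull and run a throwaway test container
docker run --rm microsoft/nanoserver cmd /c echo Hello from a container
```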

Right, that’s it… Docker installed and ready to run, my crane-driving buddy! Check out this post (coming soon) for using containers.

Mr Fitz