This post has been republished via RSS; it originally appeared at: Core Infrastructure and Security Blog articles.
Episode Story Line:
A mild-mannered IT administrator is doing routine patching when things go south. At first it seems like a simple case of the wrong patch for the wrong system, but there is more to this story. Time to put our detective skills to use and find out what’s going on.
Any investigation of a patch installation failure should start with the basics, aka the Application/System/Security event logs. In this case, the normal logs simply restate the error message displayed on screen, so no help there…it looks like we’re already past the basics.
If you have ever performed in-depth troubleshooting of a patch installation, the CBS log (C:\Windows\Logs\CBS\CBS.log) is your go-to source and contains a wealth of information on the installation process; some would say too much information. But that’s why we’re here today: to help find the clues to solve the crime.
In this case, if you go directly to the end of the CBS log after attempting to install the patch, you should see the tail end of the process. We don’t have to look too far before seeing signs of a problem: ERROR_NOT_SAME_DEVICE…what’s that all about?
One trick I use when sifting through the large amount of data in the CBS.log is to search for keywords like “ERROR” or “FAIL”, but the clues we are searching for can often be found more quickly by searching for the term “CSI” (see where the title of this post comes from?). CSI stands for Component Servicing Infrastructure and is responsible for putting the patch files on the system. If you are getting errors due to permissions, or possibly from antivirus products, this is the category to look for. Don’t forget to search in the “Up” direction…gets me every time. Note the line below, found while searching the CSI messages. The reference to the volume serial number, as well as the name of the target file, which references a Group Policy administrative template, were instrumental fingerprints in this case.
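The log-mining trick above can be sketched as a small filter. This is purely an illustration, not part of the original investigation (the post searches the log interactively in a text editor), and find_csi_errors is a hypothetical helper name; it simply pulls out CSI lines that also mention an error or failure.

```python
import re

def find_csi_errors(log_text):
    """Return (line_number, line) pairs for CSI lines that mention an error.

    Plain "ERROR"/"FAIL" searches surface many false positives in CBS.log;
    narrowing to lines tagged with the CSI component gets to the clues faster.
    """
    hits = []
    for i, line in enumerate(log_text.splitlines(), start=1):
        if "CSI" in line and re.search(r"error|fail", line, re.IGNORECASE):
            hits.append((i, line.strip()))
    return hits
```

Running it over the tail of a CBS.log would surface the ERROR_NOT_SAME_DEVICE line without wading through the routine CSI perf-trace chatter.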
Now, let’s go take a look at that location to see if anything looks wonky. The first thing we notice is that there is a PolicyDefinitions_old folder, which appears to be a backup of the unchanged folder…at least our suspect is cautious. Notice the “Shortcut” icon on the new PolicyDefinitions folder. Things are starting to heat up…getting warmer.
To get more information on the folder, let’s go to PowerShell and take a look at the object with Get-Item C:\Windows\PolicyDefinitions | fl *. While the FullName field looks correct, the Target field references another volume entirely, and this is what is causing the patch installation to fail. After talking to the server owner, it came to light that a GPO management tool was installed on the system and the link was created to help the product manage the admin templates. The actual software had been installed on a different volume, hence the junction point targeting a different drive.
The fix for this situation was to remove the junction point and re-create the PolicyDefinitions folder so that the patch could install the new templates. Since the junction was a requirement, we then copied the new admin templates to the target folder and temporarily restored the junction point. We are actively working on a solution that does not trigger the different-volume problem.
Another tough issue put to bed, but I’m sure many more will follow.