The answer depends on the organization and the applications involved. Many organizations don't have the luxury of a formal QA department; in those cases, the administrators are the only people with the resources to fully test the system.
For patch testing, I generally recommend using different policies for end-user systems and servers, and you may also want different policies for different servers. For end-user systems, I recommend automatically pushing out patches. There is always some risk that a patch will cause issues, but the clear majority of the time you are greatly decreasing the risks on the most abundant systems out there.
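As a concrete illustration of the "push automatically to end users" policy, here is a minimal sketch for Debian/Ubuntu desktops using the stock unattended-upgrades mechanism (an assumption about the platform; a Windows shop would express the same split with WSUS/Group Policy instead). Server groups would leave these settings disabled and take patches through the QA-gated process described below.

```
// /etc/apt/apt.conf.d/20auto-upgrades -- end-user systems only (sketch)
// Refresh the package lists daily and apply security updates automatically.
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

Keeping the end-user and server settings in separate configuration baselines is what makes it possible to enforce the two different policies consistently.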
Where servers are concerned, QA departments can often test patches more thoroughly before they go live because they have standard procedures for verifying that the current patch doesn't break anything, as well as for regression testing. Support departments, by contrast, usually have numerous responsibilities and are frequently overworked. I doubt that most support departments have formal testing procedures, and even those that do have to interrupt their routine workflow to test a new patch. This is why I am very much in favor of scheduled patch releases by vendors: they help overworked support departments plan their time.
So all things being equal, if there are QA departments in place, they should be responsible for testing patches.