Over the last 16 years, Microsoft has come a long way in developing its management products. The first chapter in this story started with a solution called Systems Management Server (SMS). It was a love-it-or-hate-it product: very hard to configure and operate. The reason a [Configuration] management tool came first seems to be simply that there were more client computers than servers. Maintaining the configuration of 100 workstations very quickly became a hard task, while, following the usual 10:1 ratio of workstations to servers, taking care of 10 servers didn’t seem like a big deal.
Around the year 2000, managing a relatively high number of servers had just started to become a problem. Companies had grown their number of microcomputers considerably, and looking through multiple servers’ event logs was now a very hard task for the poor IT interns. At the same time, Microsoft came out with its Active Directory solution to make its mark on the enterprise industry and take the throne from Novell’s NetWare Directory Services (NDS). Although NT was already relatively popular, when AD came out a whole new set of possibilities was presented, and Microsoft guaranteed its place in the market. With Active Directory came group policies, domain-joined computers and servers, and many other features. A great number of new services (and servers) became available, and the number of server computers required started to grow (remember, no real enterprise-level virtualization solutions were available yet).
Along with that growth came the challenge of managing an ever-increasing number of computers. Microsoft had a relatively well-known workstation product, but lacked a server-side solution. That’s when, in the year 2000, Microsoft acquired the rights to NetIQ’s technology and created the first product in a line that still retains some features of that first ancestor. Microsoft Operations Manager (MOM) 2000 was the first of the Operations Management dynasty, culminating today in System Center 2012 R2 and its cloud relative, Operations Management Suite – Log Analytics.
The scenarios at the time were a bit simpler, though. Most of the essential components were already there: servers, network devices, workstations, and so on. However, the complexity of today’s applications, their multiple layers, and their exposure to the internet simply did not exist yet, and what did exist was not as deep or as interconnected as it is today.
The internet changed the way we do things, and server infrastructure had to evolve to allow for it. So did monitoring. The System Center suite (so named with the 2007 release of the products) included more web-related monitoring, as well as the ability to download management packs from a catalog. Companies started using Operations Manager to monitor computers over the internet, giving birth to a new way of monitoring infrastructures.
Around 2010, Microsoft created a service called System Center Advisor, the first to take advantage of the cloud to interact directly with your on-premises workloads and provide advice about your infrastructure and applications. That service evolved into what is today called Operations Management Suite – Log Analytics, where data is sent to the cloud to be analyzed, providing insight into issues and trends inside your infrastructure.
Currently, Microsoft continues to invest in the System Center platform. System Center 2016 will be out in the second quarter of 2016 and will bring some great improvements. The heaviest investment, though, is being directed toward cloud-based solutions, where virtually unlimited compute power and storage allow for more agile analysis and configuration, as well as a much lighter infrastructure for monitoring your systems.