April 11, 2013
By Christopher Hertz
I was excited to see in late March that Microsoft announced the commercial availability of Microsoft System Center Global Service Monitor, the Windows Azure-based service that measures web application performance from a user’s perspective. This service is now available to Microsoft customers with active Microsoft Software Assurance coverage for their System Center 2012 server management licenses.
What is really neat about Global Service Monitor (GSM) is that it extends the application monitoring capabilities in System Center 2012 beyond your organization’s network boundary. GSM leverages Windows Azure points of presence to provide visibility into end users’ experience of a web application from different geographic locations. Global Service Monitor reports on the availability, performance, and function of web applications by scheduling and executing synthetic transactions against the application from these Windows Azure points of presence. This creates a 360-degree view of the health of your web applications, and best of all, this data is paired with existing data in the familiar System Center 2012 Operations Manager console. It does require that you have the System Center 2012 Service Pack 1 Operations Manager component installed locally.
April 10, 2013
By Reed M. Wiedower
Another big announcement from MMS: Microsoft has released the 2013 version of the Microsoft Desktop Optimization Pack, which importantly contains Microsoft BitLocker Administration and Monitoring (MBAM) 2.0. We’ve been huge fans of MBAM since it was originally released, and the new features in version 2.0 make the upgrade incredibly compelling. MBAM 2.0 includes:
- Built-in support for System Center Configuration Manager 2007 and 2012
- A new self-service portal to allow staff to solve their common problems without needing assistance
- Easier provisioning with Windows 8
- Better compliance reporting with fewer false positives
MBAM allows organizations to rapidly enable BitLocker (full disk encryption) for endpoint devices (USB keys, tablets, laptops and desktops) and to get great reporting on whether those endpoints are fully protected. If you haven’t yet taken advantage of the security features built into the Windows 8 Enterprise operating system, they can significantly reduce the total cost of ownership of managing those devices, all while increasing security and reliability. You’ll no longer need to worry when a device is lost or stolen, as you can be sure that the data on it is reasonably protected against theft, backed by real-time reporting for your security team. Reach out to New Signature today to learn how quickly you can implement it.
By Peter Day
What is Data Deduplication?
Windows Server 2012 data deduplication (often shortened to “dedupe”) is a software-based technology that helps you use your storage space as efficiently as possible. No additional hardware is needed for deduplication to take place. The basic concept is that if you have multiple copies of the same document, you only need to store it once; in any file system, that could save a lot of space. This is also a back-end feature: employees will not notice that their documents are being deduped, and there is nothing they need to do differently for it to work.
How does Data Deduplication work?
The basic premise is that if you have five identical files, you could store just one copy on disk, with the other four actually being small pointers to the first copy rather than full files. This seems straightforward when you are talking about identical copies of the same file stored in different directories. However, a useful new feature of data deduplication in Windows Server 2012 is that it works on segments of files (called “chunks”). So if the first 25% of each of 10 files is identical, Server 2012 can deduplicate just that portion of the files. This makes data deduplication ideal for scenarios such as a disk holding an archive of Virtual Machine hard disks that all contain a version of the Windows operating system.
Because deduplicating documents that change frequently would be inefficient, by default Server 2012 waits until five days after the last change before it attempts to deduplicate a file. There is also an exclusion list, so you can specify certain folders or file types that should not be processed if you feel they would not benefit from deduplication.
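To make the chunk-and-pointer idea concrete, here is a minimal sketch in Python. It uses fixed-size chunks for simplicity (the Windows Server 2012 feature uses more sophisticated variable-size chunking), and all names and sizes here are purely illustrative:

```python
import hashlib

CHUNK_SIZE = 4  # tiny, for demonstration only

def dedupe(files):
    """Store each unique chunk once; files become lists of chunk-hash pointers."""
    chunk_store = {}  # hash -> chunk bytes, stored exactly once
    file_index = {}   # filename -> list of chunk hashes (the "pointers")
    for name, data in files.items():
        pointers = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            chunk_store.setdefault(digest, chunk)  # only stored the first time
            pointers.append(digest)
        file_index[name] = pointers
    return chunk_store, file_index

def rebuild(name, chunk_store, file_index):
    """Transparently reassemble a file from its chunk pointers."""
    return b"".join(chunk_store[d] for d in file_index[name])

# Two "files" that share an identical first portion, like VM disks
# that each contain a copy of the same operating system.
files = {
    "vm1.vhd": b"SAME" * 5 + b"ONLY" * 2 + b"1st!",
    "vm2.vhd": b"SAME" * 5 + b"DIFF" * 2 + b"2nd!",
}
store, index = dedupe(files)
```

Here the two files reference 16 chunks in total, but only 5 unique chunks are actually stored, and each file can still be rebuilt byte-for-byte from its pointers; that gap between references and stored chunks is where the space savings come from.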
What about the impact on the server?
The team at Microsoft has tested data deduplication under load and found that Windows Server 2012 can provide deduplication services to a file server without any noticeable loss of performance when end users open documents. The service is designed to run in the background and will automatically pause when server resources run low. You can also schedule when it runs.
What are the requirements for Data Deduplication?
First you’ll need Windows Server 2012. While you can run a deduplication analysis on a Server 2008 R2 file server, you can only use the deduplication service once you upgrade to Server 2012. You should also be aware that Server 2012 does not support data deduplication on system or boot drives.
Can I still use BitLocker?
Yes, you can still use BitLocker to provide whole-disk encryption on a deduplicated disk, since BitLocker sits beneath the deduplication software. However, if you have a regular disk with some individually encrypted files on it, those files will be ignored by the deduplication feature.
What is an ideal use of deduplication?
It is best suited to scenarios such as general file shares or software deployment shares. Because their data stores change so rapidly, this technology is definitely not recommended for database or email servers. It would also not be advisable to run it on a Virtual Machine host.
How do I get started?
As with any major change to your data, the first thing to do is make a verified backup of the drive you will be running data deduplication on. The second thing to ensure is that you continue backing up the data after deduplication: deduplication does not provide any protection against disk failure or other data loss. In fact, deduplication makes your storage more complex under the hood, so you especially want to be sure you are backing up your data.
How do I evaluate the potential benefits?
Before you commit to using the service, you can use the DDPEVAL.EXE tool to determine what your space savings might be. The tool can evaluate remote shares and mapped volumes; however, the deduplication software itself only works with volumes local to the Server 2012 machine with the deduplication feature enabled.
Are there any risks?
Any disk can suffer from issues such as physical failure or environmental damage. If a sector of the disk becomes corrupted and it stores a popular chunk of data then dozens or even hundreds of files could be affected if they all used that chunk. The data deduplication software has several features to lessen the impact of disk corruption. For example, it stores duplicate copies of very frequently used chunks, and would use the duplicate if the original chunk becomes corrupted.
Is the data portable?
Yes, each deduplicated volume is a self-contained unit: everything needed to access your data is stored on the drive. So you could back up a deduplicated volume on one server, restore it to another Server 2012 machine with the feature installed, and read the deduplicated volume on the second server.
Where can I find out more?
New Signature has a wealth of experience with Server 2012 and we would be happy to review your current data usage and help you plan a deduplication strategy. If you need help with those essential backups, we can help with that too, so please give us a call.
April 9, 2013
By Ralph Kyttle
It is now Tuesday evening in Las Vegas, and it has been another busy and eventful day. Yesterday, I described some of my reactions to the keynote presentation given by Brad Anderson, and I mentioned that Microsoft has announced many important updates to System Center and its cloud platform. Technology has always moved at a fast pace, but as cloud technology has become ever more prevalent in the IT landscape, updates and changes seem to arrive with increasing frequency, and MMS is no exception. While I am still trying to wrap my head around all of these changes, here is a list of some items I think deserve attention:
• Global Service Monitoring with System Center Operations Manager 2012 SP1: Monitor your web applications from the outside in using Microsoft’s global monitoring infrastructure! Free for all who use Operations Manager!
• Application Performance Monitoring and Visual Studio Web Testing: Improvements have been made to APM, and Visual Studio Web Tests can easily be created and loaded up to Global Service Monitoring to perform custom tests against your web applications
• Deploy a complete System Center solution using PowerShell
• System Center Advisor integration with Operations Manager 2012 SP1
• Automating the upgrade of System Center Configuration Manager 2012 to SP1 using Orchestrator
• Windows Azure Active Directory, as Christopher Hertz mentioned in his earlier blog post
In addition to learning about new features that have been released or are in the pipeline, I am spending a large portion of my days here in breakout sessions, learning new things about System Center from others in the IT field. One thing I would like to call out now is a little-noticed option within CMTrace, the SCCM log file viewer. The option, called “Merge selected files,” is found in the open file dialog within CMTrace and allows an administrator to select multiple log files and open them all at once in one CMTrace window. CMTrace automatically interleaves the entries by timestamp, so you get a clear picture of what is happening during a management event in your environment instead of opening log files one at a time and trying to piece together an often complex task.
As the news keeps rolling in, I will continue to provide more updates, and hopefully once things slow down a bit, I will spend some time on the items I listed above to go into more detail on my feelings on them.
If you couldn’t make it to MMS this year, there are still ways to connect and learn about the exciting new updates that Microsoft is making to its cloud platform. If you are on Twitter, follow the hashtag #mms2013, but be prepared for a never-ending supply of news and updates! Videos from the MMS sessions are also being posted on the MSDN Channel 9 website and are free for all to view!
By David Trejo
Hyper-V Replica (HVR), built into every copy of Windows Server 2012, has proven to be a fantastic DR solution for a variety of scenarios including recent high-profile natural disasters such as Hurricane Sandy. While HVR can be managed using Hyper-V Manager, or using a combination of System Center Virtual Machine Manager (SCVMM) and System Center Orchestrator 2012 SP1 Runbooks, administrators have requested full integration between the technologies. To provide this functionality the Windows Azure team has introduced Hyper-V Recovery Manager (currently in preview, along with Windows Azure Backup).
Windows Azure Hyper-V Recovery Manager can help you protect important services by coordinating the replication and recovery of System Center 2012 private clouds at a secondary location. To do this, Hyper-V Recovery Manager provides cloud-based management and coordination of SCVMM 2012 and Hyper-V Replica deployments across multiple private clouds. This is a very exciting new service, and New Signature is looking forward to seeing it hit general availability!
April 8, 2013
By Ralph Kyttle
Monday officially marked the start of MMS 2013, Microsoft’s Management Summit, which focuses on Windows Server, System Center and Cloud technologies. The conference is being held in Las Vegas at the Mandalay Bay Hotel. While there were a few teaser sessions on Sunday this year, Monday officially marked the start of the conference. Different from last year, the conference started with the keynote, which set the themes for the week, showcased some great demos and provided some interesting announcements around System Center technologies.
Like everything in Vegas, the keynote was big. Held in the Mandalay Bay event center arena, the event looked like a concert at first, with flashing lights, and a DJ on stage playing music for the crowd. Coinciding with the grandness of its introduction, the keynote displayed many important ideas about Cloud technology and how Microsoft envisions the future. The keynote was given by Brad Anderson, Corporate Vice President of Windows Server & System Center, along with the help of several Microsoft product managers leading the various demos.
Throughout the keynote, something that stuck with me is the idea that every business can benefit from cloud computing, as the “cloud” is not just the public cloud, but an IT model that can be applied to deliver technology within an organization. While this is a time of massive transformation in IT, Microsoft has created an all-encompassing way to introduce cloud computing into your environment with Windows Server 2012, System Center 2012, Microsoft SQL Server and Windows Azure. An important takeaway, stressed multiple times during the keynote, was the benefit that Microsoft provides through its cloud offerings, as Microsoft runs some of the largest public cloud platforms in the world today, including Windows Update, Xbox Live and Windows Azure. Microsoft is able to apply the knowledge it gains from running its many cloud services to the System Center line of products, enabling users of System Center to benefit from what has been learned from those public cloud infrastructures. It was said that the same engineers who built the public cloud environments have worked to develop the products used to build private clouds, so the knowledge transfers directly from those large-scale experiences.
Another important takeaway was consistency. Since both the public cloud and private cloud offerings are being designed in tandem, the consistency allows you to move from one cloud to another very easily. An example of this is the ability to use Windows Azure to add additional capacity to your environment and migrate virtual machines from your on-premises environment up to Microsoft’s public cloud. Another point of consistency was the importance of using templates and automation to enable both self-service and a reliable delivery method for deploying new services.
Microsoft has already announced many new exciting updates to its public and private cloud offerings at MMS. Look for some more updates as the week continues, as I will highlight some of the features I find most interesting, and which will be of most value to businesses looking to move forward with the cloud!
By Christopher Hertz
The Windows Azure Team has been busy today, releasing Windows Azure Backup Preview and also announcing Windows Azure Active Directory reached general availability and is now ready for use in production environments.
By way of quick background, Windows Azure Active Directory (a.k.a. Windows Azure AD) enables you to maintain a single identity that can be used to access applications running on Windows Azure or within your own data centers, by federating with your on-premises directory. In addition, you may already be using Windows Azure AD and not even know it. This is because Windows Azure AD is the authentication resource for Office 365, Windows Intune, and other Microsoft online services. If you’re using these services, you’re already using Windows Azure AD!
The best news of all is that Windows Azure AD, including the base directory, Tenant, User & Group Management, Single Sign-On, the Graph API, cloud application provisioning, Directory Synchronization and Directory Federation, is available at no charge. This is an enormous win because historically Microsoft has charged for Access Control based on the number of transactions, and now it is a free benefit of using Windows Azure. Thank you, Windows Azure team! It is important to note that Microsoft does intend to charge for certain additional capabilities, such as Azure AD Rights Management, which will be available as separately priced options.
There are a lot of big benefits with Windows Azure AD, but here are a few of the highlights:
- One identity, many applications. With single sign-on enabled, your existing corporate accounts can access resources within your company, plus access cloud applications seamlessly.
- Easy integration with Microsoft Office 365. Are you moving productivity to the cloud? No worries. Windows Azure AD interoperates with Office 365, so setup is quick, and it can federate users with existing directories if needed.
- New apps, a single identity. The cloud offers new ways to develop and deploy applications quickly. Now it’s easy to secure those applications, too. You can employ simple, standards-based developer interfaces that provide secure access and deliver single sign-on for your users.
By Christopher Hertz
New Signature is excited to be working with the new Windows Azure Backup preview and has already been putting this great new cloud service from Microsoft through its paces. So far the experience has been great, and we congratulate the Windows Azure Team on putting together a great new service.
For those unfamiliar, Windows Azure Backup protects important server data off-site with automated backup and restoration and best of all you can manage cloud backups from familiar backup tools in Windows Server 2012 or System Center SP1 Data Protection Manager. With incremental backups, only changes to files are transferred to the cloud. This helps ensure efficient use of storage, reduced bandwidth consumption, and point-in-time recovery of multiple versions of the data. Configurable data retention policies, data compression, and data transfer throttling offer you added flexibility and help boost efficiency. Authorized users can easily recover backups to any server.
As mentioned earlier, Windows Azure Backup is currently in preview, and as of April 9, 2013 here is the published pricing for Windows Azure Backup:
| Compressed Data Stored per month | Price (Preview) | Price (General Availability) |
| --- | --- | --- |
| First 5 GB | $0 | $0 |
| Each GB over the first 5 GB | $0.25 per GB per month | $0.50 per GB per month |
Windows Azure Backup is billed on the average daily amount of compressed data stored (in GB) that exceeds 5 GB over a monthly period. For example, if you consistently used 20 GB of storage for the first half of the month and none for the second half, your average daily amount of compressed data stored would be 10 GB for that month. Since the first 5 GB each month is included at no charge, your bill for that month would be $1.25 (5 GB x $0.25). The amount of storage for which you are billed is determined by the compression ratio and the number of backups retained. You are billed only for the data stored in the Backup service; you are not charged for bandwidth, storage, storage transactions, compute, or other resources associated with providing the service.
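The billing arithmetic above can be expressed as a short sketch (preview pricing; the function name is just for illustration):

```python
def azure_backup_bill(avg_daily_gb, price_per_gb=0.25, free_gb=5):
    """Monthly charge: average daily compressed GB stored, beyond the free 5 GB."""
    return max(avg_daily_gb - free_gb, 0) * price_per_gb

# 20 GB stored for the first 15 days of the month, nothing for the last 15 days
avg_daily = (20 * 15 + 0 * 15) / 30   # 10 GB average per day
bill = azure_backup_bill(avg_daily)   # 5 billable GB x $0.25 = $1.25
```

At general availability the per-GB rate doubles, which in this sketch is just `price_per_gb=0.50`.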
You can find current information on pricing online.
April 5, 2013
By Stephen Dobeck
Any Exchange administrator who has been around long enough has probably experienced an email outage due to a database, server, or site failure at some point in his or her career. When this type of disaster strikes, getting email up and running again is a top priority for you, your IT staff, and the rest of the company. In today’s always-on, always-connected society, email outages are one of the most noticeable types of IT failure. Working with Exchange databases (especially large databases that run into the hundreds of gigabytes) and recovering servers can be time-consuming, and most employees do not want to be stuck without email during the hours or days this work can take.
In this situation, one commonly used recovery method is a dial tone recovery, in which IT staff create a temporary mailbox database on the Exchange server. This allows employees to immediately send and receive new email, as well as access any old emails that were cached in Outlook. However, a dial tone recovery can only be used in the event of a single database failure, or if a company has multiple mailbox servers. Many small businesses that experience failures have only a single older server, which limits recovery options. What happens then?
New Signature has worked successfully with a number of small businesses in exactly the scenario described above. Luckily, by harnessing the power of the cloud, it is possible to perform a similar type of mitigation and recovery strategy by using Microsoft’s Office 365 hosted email service as a second server. In the event of a major server failure, employees stop being able to send and receive emails. As soon as IT determines that there is a major failure (i.e. something that cannot be recovered in a matter of hours), the high-level steps to mitigation are as follows:
- Sign up for an Office 365 account and purchase the necessary number of licenses
- Add in your company domain name that is used to send/receive email and verify ownership by creating the necessary DNS records (either TXT or MX records)
- Create email accounts for impacted employees and apply licenses
- Update MX records to redirect incoming email, update TXT records to allow verification of SPF checks, and update CNAME records to setup autodiscover functionality
- Distribute new login information to staff and assist with setup
The limiting factor in recovery speed is the propagation of DNS records, which both verify ownership and redirect incoming email. If you have a good DNS host, this recovery process can be fully completed in under an hour, allowing employees to send and receive new email with minimal disruption. If the cloud migration will be permanent, old mail can be migrated to Office 365 from the recovered mailbox database via .PST export or a specialized migration tool. If a new server is obtained instead, MX records can be redirected back to the on-premises server.
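To make the DNS step in the list above more concrete, here is a small sketch of the three kinds of records involved. The domain and all target hostnames are illustrative placeholders, not the actual values any hosted email tenant would use; your provider supplies the real targets:

```python
# Illustrative DNS cut-over for the recovery steps above.
# "contoso.com" and the *.hosted-service.example targets are placeholders.
records = [
    # MX: redirect incoming mail to the hosted service
    {"type": "MX", "name": "contoso.com",
     "value": "mail.hosted-service.example", "priority": 0},
    # TXT: publish an SPF record so receiving servers can verify outbound mail
    {"type": "TXT", "name": "contoso.com",
     "value": "v=spf1 include:spf.hosted-service.example -all"},
    # CNAME: point Outlook's Autodiscover lookups at the hosted endpoint
    {"type": "CNAME", "name": "autodiscover.contoso.com",
     "value": "autodiscover.hosted-service.example"},
]

def zone_lines(recs):
    """Render the records as simple zone-file-style lines for review."""
    lines = []
    for r in recs:
        prio = f" {r['priority']}" if "priority" in r else ""
        lines.append(f"{r['name']} IN {r['type']}{prio} {r['value']}")
    return lines
```

Having the full record set written out in advance, in whatever form your DNS host accepts, is what lets the cut-over finish quickly once a disaster is declared.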
A dial tone recovery takes pressure off IT staff during a catastrophic failure and allows them to focus on recovery efforts while leaving employees mostly functional. New Signature has experience in recovering from these types of failures, so please contact us if you need further assistance.
April 4, 2013
By Dan Fink
Continuing the Spreadsheet to SharePoint series (which started with this post, in case you missed it), we will add some further customization to the SharePoint list, to more closely match the features of the spreadsheet. This is a rather short post highlighting some display specific changes to the Warehouse Inventory. In the next post, we will modify the default forms, adding a little more panache.
The spreadsheet still displays data a little more clearly than our SharePoint list, so let’s add some formatting and totals to our Inventory list, making it easier to read and more closely matching the functionality of the spreadsheet. We can format columns in SharePoint as several different types. In most cases we want just a plain number, but for our Unit Cost and Inventory Value columns we want the values to show as currency. As you’ll recall from the first post in this series, we set Inventory Value as a calculated column, which prompted us to set the type when creating the column. It did not, however, do this for the Unit Cost column, so we will go through that now. Changing the type can be easily accomplished from within the SharePoint web interface. We will begin by navigating to our Warehouse Inventory list and using the List Tools ribbon to access the List Settings.
From here we can see a list of all the columns we created. Click the Unit Cost column, and simply change type to Currency. There are a few additional column settings which can be modified, however the defaults for these will meet our needs.
After this, we will replicate the Total Inventory Value, which was at the top left of the original spreadsheet. By default, it displays at the top of the column that is being totaled. You can change where this displays with some further customization, but that is outside the scope of this post. To add the total to the top of the Inventory Value column, we will have to modify the view of the Warehouse Inventory. To do this, go back to the list itself, and use the Modify View button in the List Tools.
From the Edit View page, we need to scroll way down near the bottom, expand the Totals section, and tell it to Sum the Inventory Value column. At this time, we can see that an additional feature we could enable is the Count total of the SKU column. This will add in the Inventory Items which was at the top right of the original spreadsheet. With the Totals set as shown below, we will now have these two features also matched from the spreadsheet, getting us closer to the original functionality.
With those changes made, we can now have a look at our inventory list as it currently stands, matching nearly all of the functionality of the spreadsheet; both are shown below. Obviously the SharePoint list could use a few more design improvements, which maybe we’ll tackle in a future post.