April 18, 2013
By Ralph Kyttle
System Center Configuration Manager 2012 SP1 (SCCM) introduces many important new features, one of which is the ability to manage Apple OS X computers. New in SP1 is the ability to deploy an SCCM agent onto Apple OS X computers with an Intel 64-bit chipset running Mac OS X 10.6, 10.7, or 10.8.
While not all features of SCCM are available yet for OS X devices, Microsoft has enabled some key features to get the ball rolling. At this time, these features include:
- Computer discovery
- Hardware inventory
- Software inventory
- Application deployment
- Configuration deployment and compliance
This feature set is a good starting point for managing OS X computers, but it is worth noting that there are some things to be aware of and plan for if you are looking to introduce OS X computers into your SCCM management infrastructure.
The first item to note is that client installation and management for Apple OS X computers in System Center Configuration Manager 2012 SP1 requires public key infrastructure (PKI) certificates. These certificates must be issued by a Microsoft Certificate Authority, so if you do not currently have a PKI solution deployed within your environment, this would be your first step towards enabling Mac management within SCCM.
Secondly, at this time there is no push install mechanism available for the Apple OS X client, so every OS X computer that will be managed by SCCM requires a manual install of the SCCM agent built for OS X. Because the OS X SCCM agent relies on a user certificate to authenticate to a management point or distribution point, the end user must be present during the client install: they will be prompted to specify domain credentials during the enrollment task. For more information on how to install clients on OS X computers in Configuration Manager, see the following TechNet article: How to Install Clients on Mac Computers in Configuration Manager.
In addition to the considerations that must be made to support the client rollout, it is important to be aware of a few differences between managing Windows and OS X devices. In the current release, OS X computers cannot take advantage of software deployments that are advertised as available; only required deployments are supported. So while you can provide a self-service portal where your Windows users can install software as needed, the same feature is not yet available for OS X. Also, while endpoint protection is now available for Mac computers, it does not integrate with the SCCM console at this time. So for now, you can use SCCM to deploy endpoint protection to an OS X client, but there is no integration built into the SCCM console to manage antivirus policies on OS X computers.
While there are these known differences, it is critical to note that Microsoft has placed importance on providing SCCM 2012 as a single pane of glass to manage configurations across the various devices that exist within an IT environment. I am excited to see the components that are currently enabled, and I expect to see improvements and additional features for OS X management as time goes on. If you would like to learn more about System Center Configuration Manager 2012 SP1 or how to set up Apple OS X computer management in your environment, contact New Signature to speak with one of our System Center experts who will be happy to provide further assistance!
April 17, 2013
WordPress Security: How to protect your site from recent brute force attacks and why your password shouldn’t be “password12345”
By Zach Azar
As Frederic Lardinois made very clear in his recent blog post on TechCrunch, personal and commercial WordPress sites are under attack. The attacker’s strategy is simple: keep guessing passwords until one opens the door. This is called a brute force attack: you simply keep trying until you get it right. Once a password is guessed correctly, the attacker has full access to the backend of a site, including the site’s architecture and the content it contains.
With a large network of computers at the attacker’s disposal, they can guess thousands of passwords from thousands of different IP addresses. This attack is not extremely technical. It is not “cracking the code” or performing incredibly elaborate hacks on the system. They just guess your username and password.
One reason WordPress is being targeted is that the attacker already knows half of your credentials. When a new WordPress site is created, the first user is given the username “admin.” Assuming you don’t change that name, the attacker already knows one piece of data (your username). Now they just need to guess the password.
This attack isn’t raising awareness that WordPress is faulty or that hackers are code geniuses; it’s reminding us that a major component of our defense against attackers trying to gain access to our websites consists of two strings of characters: your username and password. Thus, we need to take this defense seriously.
Using a strong password is crucial and simple. There are many forms of passwords that are considered strong. New Signature recommends using passphrases instead of passwords. What is the difference, you ask? Put simply, a passphrase is a sequence of words that forms a sentence with correct grammar, capitalization and punctuation. Passphrases deliver a number of benefits: (1) most sentences are naturally quite long, providing outstanding security against brute force attacks; (2) they are simple to create, reducing the burden when you have to change your password on a regular basis; and (3) they are very easy to remember, making it less likely that they will be written down and then compromised (or forgotten). For example, a passphrase could be: In 2013, I chose a new password for my website! This simple-to-type, easy-to-remember passphrase is 47 characters long and very hard to crack, which beats trying to remember the random jumble of letters and numbers that many people resort to with long passwords.
If you prefer, you can always use difficult passwords made of random characters and digits, but remember to make these passwords at least 16 characters long. There are sites that will create and customize these passwords for you securely. Passwords of this type are harder to remember, though, so you may want to use a tool like LastPass to store and encrypt your passwords.
With either a passphrase or a password, make sure not to use words or phrases that have a connection to your site (i.e. if your site is about horses, don’t use the word “horse” in your passphrase). Also, don’t use the same password for multiple websites. If your password is discovered, you don’t want the attacker to now have access to your online banking, email, calendars, etc.
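To see why length beats complexity, here is a rough back-of-the-envelope comparison of brute force search spaces. The alphabet sizes and guess rate below are illustrative assumptions, not measurements:

```python
import math

def search_space(alphabet_size: int, length: int) -> int:
    """Total number of candidate strings an attacker must try in the worst case."""
    return alphabet_size ** length

# Assumed alphabet sizes (illustrative): 94 printable ASCII characters for a
# random password; ~30 effective symbols per character for natural-language text.
random_16 = search_space(94, 16)      # 16-char random password
passphrase_47 = search_space(30, 47)  # 47-char passphrase

# At a generous, hypothetical one billion guesses per second:
rate = 1_000_000_000
def years_to_exhaust(n: int) -> float:
    return n / rate / (3600 * 24 * 365)

print(f"16-char random password: {math.log2(random_16):.0f} bits")
print(f"47-char passphrase:      {math.log2(passphrase_47):.0f} bits")
print(f"Years to exhaust passphrase space: {years_to_exhaust(passphrase_47):.2e}")
```

Even with these rough numbers, the passphrase’s search space dwarfs the random password’s, which is the whole point of favoring length.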
Another major deterrent for these types of attacks is removing the “admin” user. You can simply create another user, make them an “Administrator”, and give them a difficult-to-guess username. Don’t forget to give them a nickname and select to display their name publicly as their nickname. You can now delete the original “admin” user (making sure to attribute all posts created by the admin to your new user or to another user) and voila! All attempts at guessing the “admin” password are useless because the “admin” user doesn’t even exist! Plus, even if the attacker does come across your new administrative user’s name, your strong password will still provide a solid defense against guessing.
Congratulations! After following the steps above, your site is significantly stronger against the recent attackers and it only took a few minutes. Remember to update your site as soon as updates are available for WordPress, your plugins, and your theme. Of course if you had difficulty with the steps above, please feel free to contact New Signature for help.
These precautions are only the beginning to keeping your WordPress site secure. Talk with New Signature today if you would like to learn more about:
- Enabling custom two-step login authentication
- Securing administrative users and administrative access
- Utilizing and custom configuring powerful WordPress security plugins
- Securing direct communication channels to the server
- Examining and correcting file permissions for publicly available files
- Scanning entire sites for viruses and malicious code
April 16, 2013
By Reed M. Wiedower
Now that Azure IaaS has reached general availability, customers looking to move to online services such as Office 365, Dynamics CRM Online or Windows Intune always ask the same question: which identity service should I use? If you followed our earlier explanation of Azure AD, or the announcement this week that Azure AD had reached general availability, you might be tempted to conclude that Azure AD is the correct identity solution in all cases. For many customers, though, that decision would sacrifice the significant investments in on-premises AD that have made configuration and management much easier. And then there’s Active Directory Federation Services (ADFS), which can bring the power of federation to existing AD environments. Finally, with Azure IaaS, customers can now spin up virtual machines that are domain controllers, extending on-premises AD into the cloud natively. With so many choices, the question remains for many customers: which do I choose?
Definition-wise, we should begin by noting the following:
- Regular AD on-premises involves domain controllers (DCs) running inside a corporate network spanning one or many sites
- Azure VMs can run domain controllers, and if connected back via Azure Virtual Networks, can serve as extensions of your existing AD on-premises
- Azure AD, by contrast, cannot “link” to your AD infrastructure except through Active Directory Federation Services (ADFS)
- ADFS can run on virtual machines built within Azure IaaS, meaning that you can combine both DCs and ADFS into instances that are connected to your network
- Very complex organizations may want to implement Forefront Identity Manager (FIM) to help synchronize different line-of-business and system-of-record systems within your organization.
Whew, that’s a mouthful! Fortunately, New Signature has helped organizations of all sizes select the proper identity management solution. We’ve broken down our recommendations into a simple pair of matrices: the first walks through the most common best practices, while the second compares the solutions feature by feature to show which one best supports each need.
Identity by Organization Type:
| Size of Organization | Notes | Recommendation |
|---|---|---|
| < 25 | A new organization with no existing infrastructure | Skip on-premises AD and go straight to Azure AD; use Windows Intune for endpoint management |
| 25-100 | An existing organization with minimal infrastructure (2-3 DCs) | Use Azure AD for Online Services, and the new password sync components from Microsoft |
| 100-2000 | An existing organization with a single domain but multiple physical sites | Use on-premises AD coupled with ADFS running within an Azure VM for maximum uptime |
| 2000+ | An existing organization with multiple domains and extensive AD infrastructure | Use on-premises AD coupled with Forefront Identity Manager and ADFS spread across multiple sites |
Features by Identity Services:
| Feature | On-Premises Active Directory | Azure Active Directory | Azure VM running DC role | ADFS (either on-premises or in Azure VMs) | Notes |
|---|---|---|---|---|---|
| Single Sign On to Websites | Possible if using ADFS as well | Built-in | Possible if using ADFS as well | Built-in | SSO is a breeze with ADFS: we recommend running ADFS on Azure VMs to reduce site dependencies if one is not using Azure AD |
| Group Policy | Built-in | Not possible, yet | Built-in | N/A | If you want to use group policy, you’ll need to use regular AD or Azure VMs running a DC role. Alternatively, use an endpoint product such as Windows Intune to distribute policies. |
| High Availability | Possible if you add two DCs | Built-in | Possible if you add two DCs | Possible if you add multiple roles, to multiple sites | Other than ADFS, these services make it easy to add high availability. ADFS takes more of a lift, especially to span sites. |
| Multiple Domains or Forests | Built-in | N/A | Built-in | Supported | Organizations with multiple domains or forests may need FIM for ease of management |
| Support for Office 365, Dynamics CRM Online, Windows Intune | Use ADFS | Built-in | Use ADFS | Built-in | If you want plug-and-play access to Microsoft’s online systems, use Azure AD or spin up ADFS |
As you can see, there are a myriad of factors at play, but the larger perspective is simple: small organizations that haven’t made an investment in AD should use Azure AD, while larger, more complex organizations with multiple domains will want to leverage ADFS, and at the highest end of complexity, Forefront Identity Manager, to continue to get the best value for their management needs.
By New Signature
Last Wednesday night NFTE DC held its 16th Annual Dare to Dream Gala! It was a magical evening of stories of how entrepreneurship changes lives. Guests were touched by the words of our students, teachers and Locally Grown Honorees, who included Christopher Hertz of New Signature, and inspired by the talented young business people at the Youth Showcase. Under the leadership of Gala Chair Cal Simmons, the event brought together more than 800 people and raised over $440,000 to support the 1,100 students NFTE serves across the Washington region.
New Signature is proud to support NFTE as they help close the opportunity divide by helping to increase students’ entrepreneurial knowledge, with the ultimate outcomes of graduation, college attendance, business ownership and/or gainful employment.
By New Signature
PUBLISHED APRIL 16, 2013 on CRN
Microsoft Azure Cloud Service Challenges Amazon On Price, Reliability
By Rick Whiting
The general availability of Microsoft (NSDQ:MSFT)’s Windows Azure Infrastructure Services, which the company described as the final component of its cloud services lineup, puts the company in head-to-head competition with Amazon (NSDQ:AMZN) in the infrastructure-as-a-service market.
Microsoft also said it’s reducing the costs of its virtual machines and cloud services by 21 to 33 percent, promising to match Amazon Web Services (AWS) prices for cloud compute and storage services.
Microsoft partners that have been working with the cloud infrastructure services during its lengthy trial period say the move will help them offer customers lower-cost cloud application and development/testing services that promise higher reliability and uptime than on-premise IT.
“It really does allow you to be more agile as an organization,” said Reed Wiedower, CTO at New Signature, a Washington, D.C.-based solution provider that partners with Microsoft. As for the price cuts: “We’ve seen significant cost reductions across the board in the last one or two years with Azure,” he said.
While Microsoft is a player in the platform-as-a-service and software-as-a-service arenas, the general availability of Windows Azure Infrastructure Services puts Microsoft squarely in the IaaS market. The service allows businesses to move their Windows Server- and SQL Server-based virtual machines running on Microsoft Hyper-V — and the applications running on those VMs — to the cloud.
Microsoft has been providing the Azure IaaS service in preview mode since June. But, the general availability announcement means businesses can subscribe to the Azure IaaS service and get support and service-level agreements (SLAs) of 99.95 percent uptime.
“No one wants their VMs to fail. Effectively, I don’t have to worry that my virtual machines will go down,” Wiedower said, noting that New Signature is an Azure Circle partner and has been working closely with Microsoft on Azure development projects. “The Azure virtual machines are designed from the ground up to be fault tolerant. Microsoft has done a really good job in the last eight or nine months in detailing how their virtual machines work.”
There are currently more than 200,000 Windows Azure customers, according to Microsoft, and about 1.4 million virtual machines have been uploaded to Azure Infrastructure Services since the preview became available. But, Microsoft has a long way to catch up with AWS, which launched in 2006.
In a blog post, Bill Hilf, Microsoft general manager of Azure product management, said the new Azure service offers high-memory 28-GB/4-core and 56-GB/8-core virtual machine instances for running heavy-duty workloads. Also new are validated instances for SQL Server, SharePoint, BizTalk Server and Dynamics NAV, among other Microsoft software.
Wiedower said Microsoft’s new Azure service fits with increasing demands he’s seeing for cloud-based development and test services that can cost far less than on-premise test and development projects. And, the New Signature CTO said he’s also seeing more demand from businesses that want to run Microsoft Active Directory in the cloud.
By Reed M. Wiedower
It’s graduation time here at New Signature, and we’re happy to announce that Azure Infrastructure as a Service (IaaS), including Azure Virtual Machines, has graduated to general availability.
In addition to the ability to now spin up highly available virtual machines, the Azure team simultaneously announced many other new features including:
- Pre-built images for popular applications such as SQL Server 2012, SharePoint 2013 and BizTalk Server 2013 to speed provisioning
- Larger virtual machines, including machines with 28 GB and even 56 GB of memory
- Bigger operating system drives, now up to 127 GB in size
- Price drops of 21% to 33% across the entire Azure platform
There are a multitude of workloads that fit with the Azure Virtual Machine model. Many line-of-business applications are perfectly capable of running on Server 2008 or Server 2012, yet have never been migrated because doing so wouldn’t address the underlying dependencies on storage subsystems, networking or hardware. With Windows Azure, even single servers gain the ability to stretch storage and processing both within and across sites, allowing customers to virtualize these workloads, park them in Azure VMs, and be confident that networking, storage or hardware problems won’t impact their availability.
Another popular expense that is a great match for Azure VMs is the on-premises virtual machines organizations use for development and testing. In the past, organizations had to make large capital investments (either dedicated desktops for developers that rapidly lost value, or large virtual hosts that cost more yet were never utilized 24/7 enough to recover the investment) to meet the needs of their developers and system administrators. With Azure, instead of making a large cash outlay, organizations can invest a much smaller amount, let developers self-service their virtual machines as needed, and keep a tight rein on costs. If the organization decides that the money already invested would be better spent on storage, media services or even CDN, with Azure IaaS the VMs can be instantly frozen and the money used to fund those other priorities. By contrast, organizations that have spent large sums on virtual hosts, only to see them lightly utilized, have no way to recover the dollars already spent.
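The pay-for-what-you-use argument is easy to quantify. Here is a rough sketch of the arithmetic; every figure below is hypothetical, chosen only for illustration (check current Azure pricing for real rates):

```python
# Hypothetical figures for illustration only.
onprem_host_capex = 20_000  # assumed upfront cost of a dedicated virtual host ($)
vm_hourly_rate = 0.12       # assumed per-hour rate for one test VM ($)

# Dev/test VMs typically run only during working hours and can be
# deallocated (frozen) the rest of the time, stopping the meter.
hours_per_month = 8 * 22    # 8-hour days, ~22 working days
vms = 10

monthly_cloud_cost = vms * hours_per_month * vm_hourly_rate
print(f"Monthly cloud dev/test cost: ${monthly_cloud_cost:,.2f}")

# How many months of cloud usage the on-premises capital outlay would fund:
months_covered = onprem_host_capex / monthly_cloud_cost
print(f"The same capex would cover about {months_covered:.0f} months of cloud usage")
```

Under these assumed numbers, a single host’s capital cost funds years of metered dev/test capacity, and the cloud spend can be paused or redirected at any time, which the sunk capex cannot.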
Finally, the most popular usage of Azure Virtual Machines to date that we’ve seen has been a desire to fully move key infrastructure components into the cloud, including Azure AD, and ADFS running on an Azure Virtual Machine. We’ll detail later this week the steps for organizations looking to migrate directory services into the cloud, and Azure VMs are a big part of that move. As a customer informed me, “Why would I move Exchange and SharePoint to the cloud, yet keep ADFS on-premises?” With Azure Virtual Machines, there’s no need to keep key servers running on-premises.
Interested in moving infrastructure into the public cloud? Looking for cost-savings, greater uptime *and* better flexibility? Talk to New Signature today to see how Azure IaaS can help bring your organization into the cloud.
April 15, 2013
Microsoft’s “GeoFlow” for Excel 2013 Delivers 3D Big Data Visualization and Storytelling Built on Bing Maps
By Christopher Hertz
Last week I was thrilled to see that Microsoft announced the preview availability of project codename "GeoFlow" for Excel 2013. GeoFlow is an awesome addition to Excel 2013 that lets you plot geographic and temporal data visually, analyze that data in 3D, and create interactive "tours" to share with others. This further builds on the value of Excel 2013 as the most popular and accessible business intelligence tool available. GeoFlow adds to the existing self-service Business Intelligence capabilities in Excel 2013, such as Microsoft Data Explorer Preview and Power View, to help discover and visualize large amounts of data, from Twitter traffic to sales performance to population data in cities around the world. To get started today, download the Add-in for Excel 2013 with Office 365 ProPlus or Office Professional Plus 2013. With GeoFlow, you can:
- Map Data: Plot more than one million rows of data from an Excel workbook, including the Excel Data Model or PowerPivot, in 3D on Bing maps. Choose from columns, heat maps, and bubble visualizations.
- Discover Insights: Discover new insights by seeing your data in geographic space and seeing time-stamped data change over time. Annotate or compare data in a few clicks.
- Share Stories: Capture "scenes" and build cinematic, guided "tours" that can be shared broadly, engaging audiences like never before.
By Christopher Hertz
Many people are surprised to learn that the most popular business intelligence (BI) tool in the world is Microsoft Excel. Microsoft Excel has long been an essential tool for the modern worker, and with each successive release Microsoft has provided more efficient ways for users to access, understand and present their data. In Excel 2013, part of its latest Office suite upgrade, Microsoft has placed increased emphasis on BI functionality. If you aren’t familiar with business intelligence capabilities in Excel please review this page, and you can also learn about what’s new in Excel 2013 here. If you use Excel regularly, it is worth exploring the depth and breadth of capabilities available “out of the box” with Excel 2013. For example, if you haven’t created a PivotChart before, take a look at this quick walk-through on how to create one in Excel 2013. A PivotChart helps you see the big picture in a PivotTable, or in complex worksheet data that includes text and numbers with column headings. If you haven’t ever explored creating a PivotTable to analyze worksheet data, you can follow this tutorial.
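Conceptually, a PivotTable groups rows by one or more columns and aggregates a value column. The same reshaping can be sketched outside Excel in plain Python; the sales data below is made up purely for illustration:

```python
from collections import defaultdict

# Made-up worksheet data: (region, product, units sold)
rows = [
    ("East", "Widgets", 120), ("East", "Gadgets", 80),
    ("West", "Widgets", 95),  ("West", "Gadgets", 140),
    ("East", "Widgets", 60),
]

# Pivot: regions as row labels, products as column labels, summing units.
pivot = defaultdict(lambda: defaultdict(int))
for region, product, units in rows:
    pivot[region][product] += units

# Print the pivoted summary, one region per line.
for region in sorted(pivot):
    print(region, dict(pivot[region]))
```

In Excel the drag-and-drop PivotTable builder does this grouping and summing for you, and a PivotChart then plots the aggregated result.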
April 12, 2013
By Reed M. Wiedower
In this month’s update to the free, cloud-based System Center Advisor (SCA), Microsoft has tweaked the service to allow integration with System Center Operations Manager (SCOM), which can help your organization stay perfectly aligned with Microsoft best practices.
In the past, customers who were already using the power of SCOM to monitor their mission-critical servers were able to build configuration baselines on their own, but if they wanted to harness Microsoft’s best practices, they had to choose between the best practices power of SCA and the real-time monitoring functionality of SCOM. Now, with the new System Center Advisor Connector, existing SCOM shops can be alerted whenever servers drift away from Microsoft best practices, even when the only change is to Microsoft’s guidance itself! As we all know: it’s far better to be proactively alerted via SCA that an Exchange server has drifted from a solid configuration than to discover backpressure impacting mail delivery several days later, reactively, via SCOM.
Best of all, this means that shops with SCOM no longer need to install a separate gateway service to enable SCA, as the existing SCOM server can act as the gateway, forwarding requests between the management server and SCOM itself. That’s right: SCOM won’t *stop* SCA from collecting information, so if you’re used to using the SCA website, you can still continue to monitor the health of your network from anywhere with an internet connection, in real-time. Interested in learning more about this functionality? Reach out to New Signature so we can show you how SCA + SCOM can help drive down the costs of keeping your servers healthy.
April 11, 2013
By Christopher Hertz
I was excited to see in late March that Microsoft announced the commercial availability of Microsoft System Center Global Service Monitor–the Windows Azure-based service that provides web application performance measurement from a user’s perspective. This service is now available to Microsoft customers with active Microsoft Software Assurance coverage for their System Center 2012 server management licenses.
What is really neat about Global Service Monitor (GSM) is that it extends the application monitoring capabilities in System Center 2012 beyond your organization’s network boundary. GSM leverages Windows Azure points of presence to provide visibility into end users’ experience of a web application from different geographic locations. Global Service Monitor reports on the availability, performance, and function of web applications by scheduling and executing synthetic transactions against the application from these Windows Azure points of presence. This creates a 360-degree view of the health of your web applications, and best of all this data is paired with existing data in the familiar System Center 2012 Operations Manager console. It does require that you have the System Center 2012 Service Pack 1 Operations Manager component installed locally.
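A synthetic transaction of this kind boils down to issuing a scripted request from each location, timing it, and classifying the result. Here is a minimal sketch of that idea; the threshold, status names, and `probe` helper are illustrative assumptions, not GSM’s actual implementation:

```python
import time
from urllib.request import urlopen

def evaluate(status_code: int, elapsed_ms: float, slow_ms: float = 2000) -> str:
    """Classify one probe result the way a monitor might."""
    if status_code != 200:
        return "unavailable"
    if elapsed_ms > slow_ms:
        return "degraded"
    return "healthy"

def probe(url: str) -> str:
    """Run one synthetic transaction: fetch the page and time the round trip."""
    start = time.monotonic()
    with urlopen(url, timeout=10) as resp:
        status = resp.status
        resp.read()  # pull the full body, as a real user’s browser would
    elapsed_ms = (time.monotonic() - start) * 1000
    return evaluate(status, elapsed_ms)
```

GSM goes much further (multiple geographic points of presence, scheduling, and console integration), but each individual check reduces to this request-time-classify loop.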