Blog

  • June 10, 2013

    7 Search Engine Optimization tips for 2013

    Search Engine Optimization, or SEO, has always been a challenging undertaking for webmasters and content managers alike. There are technical considerations along with general content generation and publishing concerns. Both black-hat and white-hat strategies have had varying levels of success, and the ethical concerns underlying them have been debated over the last 15 years. However, there is not, and never has been, a single magic bullet. The best way to develop an SEO strategy is to think holistically.

    The metaphor that I have always used with clients is that their site should be like a houseplant: keep it in sunlight with good tagging and page descriptions, water it regularly with new content, and occasionally repot it with an aesthetic or structural refresh.

    These seven tips are a great way to start an SEO strategy for a site of any size, personal or enterprise, and should be the foundation for any more complex implementations in the future.

    Create great, shareable Content
    The key to unlocking your site’s SEO potential is to create great content. Nothing can supersede this in terms of overall site optimization. Experts recommend at least three posts a week, containing a mixture of text, images and video. The more this content is shared, the better your rankings will become.

    Play to your niche
    It is quite likely that your site has a specific user profile. Whether your users are business people or noncommercial visitors, one great practice is to play to your site’s strengths, especially with metatags and page descriptions. Deploy the specific terminology and descriptions that your users are likely to search for. If you happen to have well-known experts creating content for your site, make sure their names are being entered as tags. Also create links to sites and blogs operating in the same sector to further build the community at large while gaining credibility for your content.

    Leverage Social Media
    The top five social media networks are Facebook, Twitter, Google+, LinkedIn and Pinterest. For the best results you should join these networks and be as active as possible. Updating five different networks can be challenging, so simplify your life and use tools like Twitterfeed to have articles on your site automatically posted to your social media accounts through RSS feeds.

    Be Detail oriented
    Create solid keywords, page descriptions, page titles, tags and navigational links. Use your niche taxonomy terms within these SEO fields to reinforce those terms in the search engines’ term sets.

    Solicit Quality links
    Google’s content spiders place huge value on links from other sites to your content. Link building can be time- and effort-intensive, but it can reap huge rewards. You can find multiple techniques here, but contacting sites that you use or reference regularly and asking for links back is a winning strategy too.

    Optimize your video
    Everyone agrees that video is the fastest-growing facet of the internet. Well-produced, informative video can quickly help build a following for your content, and using YouTube, Vimeo and other sites can help drive traffic through channels you may never have considered. It helps, though, to have your video optimized for SEO; this article explains the details.

    Use Google’s tools
    Advertising and SEO are Google’s business and they want your site to succeed, so they provide a large array of free tips and tools to help.

    • Google Analytics
      Google Analytics is a very robust web app that lets you measure, to a dizzying level of detail, site traffic, visitor types, advertising ROI, content visibility and overall site popularity.
      www.google.com/analytics/

    These tools and tips are a great starting point for anyone interested in improving their site’s search engine rankings and placement. There are also hundreds more articles available online for readers that have exhausted these resources.

  • June 6, 2013

    Troubleshooting shell extensions in Microsoft Windows

    I came across an interesting problem the other day: my Windows 7 computer would periodically freeze without any obvious cause. Eventually I narrowed it down to the fact that explorer.exe hung whenever I right-clicked an object to bring up the context menu. The problem occurred both when right-clicking with the mouse and when using the context-menu key on the keyboard, so I knew it wasn’t a mouse driver issue. The next place I checked was the Windows registry, by starting regedit.exe and going to the following location:

    HKEY_CLASSES_ROOT\*\shellex\ContextMenuHandlers

    First I exported that registry key to a backup file so I could restore it afterwards.  Then I removed the folders under that registry key one by one, checking after each one whether a right-click still froze explorer.exe.  In the end I had removed them all and the problem still persisted, so I restored the copy of the registry key I had exported earlier.
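    If you would rather script the export step than use regedit’s menus, a rough sketch like the following should do it from an elevated PowerShell prompt (the backup file name and location are my own choices, not part of the original procedure):

    # Back up the context menu handlers key before removing anything (backup path assumed).
    reg export "HKCR\*\shellex\ContextMenuHandlers" "$env:USERPROFILE\Desktop\ContextMenuHandlers-backup.reg"
    # List the handlers registered under that key so you can work through them one by one.
    Get-ChildItem -LiteralPath "Registry::HKEY_CLASSES_ROOT\*\shellex\ContextMenuHandlers" | Select-Object PSChildName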

    After some searching I got lucky when I came across ShellExView by Nirsoft. The software is available for free download from http://www.nirsoft.net/utils/shexview.html and is a very handy utility. The viewer is a simple-looking program that shows you all of the shell extensions installed on your computer – there are probably more than you think.

    The basic approach here is to disable the extensions one by one until the problem is solved. However, that approach could take considerable time, so a more strategic approach will usually pay dividends. First I started with the entries of type “context menu”, as that seemed to be the source of the freezing, and I ignored the extensions published by Microsoft. Then I went through the half-dozen that were left, disabling them one at a time. Once I disabled the shell extension for the most recently installed piece of software the problem was resolved and I was able to right-click in Windows Explorer without it freezing.

    Hopefully this post will help you if you encounter similar issues.  New Signature has many years of experience with troubleshooting Windows systems, so if you need additional help please do give us a call.

     

  • June 4, 2013

    Exchange Online Mail Contact with Internal and External Email Address

    When adding a contact to Exchange Online, you may find the need to assign an email address under your tenancy’s domain that both internal and external senders can use. You would think that just setting the Alias would automatically make an SMTP address available, but this is not the case. These email addresses need to be set with PowerShell. An alternate method to achieve the same goal would be to make two contacts, one for the internal email address and one for the external, plus a hub transport rule to forward between them. The hub transport method will work, but it adds quite a bit of complexity and overhead for such a simple goal.

    Below is an example of this case, and the command required to set the contact’s email address under your existing domain.

    Scenario
    User John Doe works with The Client, but does not need a mailbox. John requires an email address of jdoe@theclient.com, which forwards to his john@doe.com external email address.

    Solution
    Create a new contact with the GUI, or via the New-MailContact cmdlet. Even though you set the Alias, the @theclient.com email address is not automatically created. You must use PowerShell to set the MailContact’s EmailAddresses.
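    If you want to script the whole thing, creating the contact might look like this (a sketch using the names from the scenario above; connect to Exchange Online PowerShell first):

    # Create the contact pointing at John's external address (names taken from the scenario above).
    New-MailContact -Name "John Doe" -Alias jdoe -ExternalEmailAddress john@doe.com

    Then set the internal address with the command below.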

    Set-MailContact -Identity jdoe -EmailAddresses SMTP:jdoe@theclient.com,john@doe.com

    Now any messages sent to jdoe@theclient.com will automatically be forwarded to john@doe.com. You can then modify additional settings, such as hiding the contact from the GAL.
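    For instance, hiding the contact from the address lists is a one-liner (again a sketch using the jdoe identity from the scenario):

    # Hide the contact from the Global Address List.
    Set-MailContact -Identity jdoe -HiddenFromAddressListsEnabled $true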

  • June 3, 2013

    Development at 50,000 feet: Azure Testing Improvements

    Developers often like to consider themselves capable of great feats of legendary skill, able to build airplanes while flying, even when hobbled by a lack of time, executive sponsorship, or the funds to go out and procure a third-party solution. After all, shouldn’t every developer be able to code a solution to her problem?

    There is one key area that stands out in opposition to this can-do attitude, namely, the resources needed for testing. Every developer I’ve met craves having additional machines to test on. No amount of memory is ever enough, no processor new (or quick) enough, no network bandwidth wide enough, no latency low enough! With the advent of virtualization, many system administrators were overwhelmed by requests for servers, switches and complex storage arrays merely to run testing workloads. All that equipment can get expensive fast, and the worst part is that as soon as it is procured it is already losing value.

    With Azure, the development and testing problem is turned on its head: instead of developers bugging sysadmins for resources, they can simply leverage their existing MSDN subscription and spin up a development machine, or two, or three, or twenty. Today’s announcement from Scott Guthrie covers several major changes that will help developers and system administrators alike utilize Windows Azure for testing, including:

    1. Virtual machines that are shut off no longer cost money (i.e., you don’t need to delete them to stop incurring charges)

    2. Billing is now done by the minute, rather than by the hour

    3. As developers get closer to their free limit, they’ll be alerted in an easy-to-see way

    4. The overall pricing on Azure is cheaper for MSDN developers

    The key takeaway is this: it’s now not only incredibly easy to spin up VMs in Azure, it’s also inexpensive to boot, and for many testing scenarios, completely free. Developers as well as sysadmins will be able to test to their hearts’ content, and then when they need to scale up, will be able to easily justify the investment. Best of all, if those development boxes aren’t needed for a project, your organization will be able to repurpose the dollars elsewhere. Good luck doing that with the racks of testing servers purchased in the recent past!

    Interested in moving your Windows and Linux server workloads into the cloud? New Signature, as one of a handful of Microsoft Azure Circle partners, can ensure you learn all the best practices for reducing your on-premises dev/test environment and gaining a truly elastic, robust experience.

  • Azure Virtual Networks Gain Steam

    One of the new changes mentioned by the Windows Azure team this morning has been a large expansion of supported networking scenarios to include products from Watchguard, F5 and Citrix. Previously, Microsoft supported many VPN endpoints as gateways, but only provided configuration scripts for a narrow range of Cisco and Juniper equipment, as well as the ability to use software VPN connections from Microsoft RRAS.

    With the new supported models added into the mix, Azure can now provide both dynamic and static routing, enabling businesses of all sizes to start taking advantage of Azure Infrastructure as a Service (IaaS) without requiring a switch in networking hardware that could delay implementation. Thus, the final hurdle for many organizations has come down, allowing all to move quickly to spin up Azure Virtual Machines in a matter of minutes, rather than days or weeks.

    Interested in seeing if your organization can move services to the cloud? Need a dedicated guide to show you the best practices along the journey? Reach out to New Signature today so that we can show you how best to set up a test implementation that can be moved into production with a minimal amount of work.

     

  • May 30, 2013

    Reclaiming space from the winsxs folder to alleviate disk space issues in Server 2008

    As a server administrator you have probably found yourself troubleshooting issues with shortage of disk space on Microsoft Windows Server 2008. Servers that were built years ago may now be filling up, with the system drive critically short on space. Ideally you want to have at least 10% free space on the drive where Windows is installed (the “system drive”) – if there is less than that then action is most likely needed.
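    A quick way to check where you stand is a one-off PowerShell snippet like the sketch below (it uses built-in cmdlets and assumes C: is the system drive):

    # Report free space on the system drive as gigabytes and a percentage (assumes C:).
    $drive = Get-PSDrive -Name C
    $percentFree = [math]::Round($drive.Free / ($drive.Used + $drive.Free) * 100, 1)
    "{0} GB free ({1}%)" -f [math]::Round($drive.Free / 1GB, 1), $percentFree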

    The first tool I reach for in situations like these is the excellent Tree Size utility from Jam Software (http://www.jam-software.com/treesize_free/).  The utility is less than 5 MB in size and can thus be installed on almost any server that is short of space.  The Tree Size utility will scan the drive you select and tell you which files and folders are taking up the most space.  You could start by looking for log files that you know are no longer needed or copies of ISOs that you used for installing software in the past – all these could be moved to a non-system drive on the same or a different server.

    However, servers that fill up tend to do so repeatedly, and eventually you will be looking at the \Windows folder and wondering whether there is anything in there you can excise. Right away you will spot the \Windows\winsxs folder, which we have found can easily be up to 15 GB in size.

    What is winsxs?

    The “winsxs” folder, also called Windows Side-by-Side or “the content store”, is located in the root of the \Windows system folder. The first thing to say is: don’t delete it! Or anything in it! At least not manually. We take a look below at how to recover some of the space. The winsxs folder is designed to replace the traditional installation media (e.g. a DVD), and it is what allows you to install extra features, software and roles on your Server 2008 machine without having to provide the installation media. However, the winsxs folder can be 10–15 GB and will only grow over time as updates and service packs are applied to the server.

    What can I do to shrink the winsxs folder?

    Server 2008 R2 introduced the DISM (Deployment Image Servicing and Management) utility, which you can use to recover space taken up by files that are no longer needed after a service pack is installed. Note, however, that this makes the service pack permanent: you will not be able to uninstall it in the future if you need to do so as part of a troubleshooting process. Here’s the command you would use at an elevated command prompt:

    C:\Windows\system32>dism /online /cleanup-image /spsuperseded

    If the server is running the original Server 2008 (non-R2) then you have two options depending on which service pack you have installed.  Note that both utilities are found in the \Windows\system32 folder and, like the DISM tool above, both will make the service pack permanent.

    • For Server 2008 SP1 use: vsp1cln.exe
    • For Server 2008 SP2 use: cmpcln.exe

    In both cases run the utility at an elevated command prompt and select Y to start the cleanup.
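    If you are not sure which service pack the server is running, a quick check from PowerShell (a sketch using WMI) will tell you:

    # Returns 1 for SP1, 2 for SP2, and so on.
    (Get-WmiObject Win32_OperatingSystem).ServicePackMajorVersion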

    In real-world examples we have seen the above utilities free up 2 GB or more of disk space, which is a great start towards getting your server running smoothly again.

    What else can I do?

    New Signature has many years of experience with managing and maintaining Windows Servers. We would be happy to help if you are having capacity, or other problems, with your server, so please do give us a call.

  • May 25, 2013

    New Signature wins 5 Awards in the 19th Annual Communicator Awards

    The New Signature team has earned five additional honors, with an Award of Excellence (Gold) for the Ask Herzl website and Awards of Distinction (Silver) for AAAS MemberCentral, Lowy Institute, Barrel of Jobs and Working America.

    The 19th Annual Communicator Awards is the largest and most competitive awards program honoring creative excellence among communications professionals, with over 6,000 entries received from across the US and around the world. The Communicator Awards are judged and overseen by the International Academy of the Visual Arts (IAVA), a 600+ member organization of leading professionals from various disciplines of the visual arts dedicated to embracing progress and the evolving nature of traditional and interactive media.

    “We are both excited and amazed by the quality of work received for the 19th Annual Communicator Awards.  This year’s class of entries is a true reflection of the progressive and innovative nature of marketing and communications,” noted Linda Day, executive director of the International Academy of the Visual Arts.  She added, “On behalf of the entire Academy I want to applaud this year’s Communicator Awards entrants and winners for their dedication to perfecting their craft as they continue to push the envelope of creativity.”

    For more on the 19th Annual Communicator Awards - http://www.communicatorawards.com/winners/

     

  • May 18, 2013

    Announcing AWS Management Pack for Microsoft System Center

    Microsoft System Center Operations Manager supports monitoring of multiple cloud environments, and this has been further extended with the release of the Amazon Web Services (AWS) Management Pack for Microsoft System Center. The AWS Management Pack enables you to view and monitor your AWS resources directly in the System Center Operations Manager console. This way, you can use a single, familiar console to monitor all your resources, whether they are on-premises or in the AWS cloud.

    The AWS Management Pack provides a consolidated view of AWS resources across regions and Availability Zones. It also has built-in integration with Amazon CloudWatch so that the metrics and alarms defined in Amazon CloudWatch surface as performance counters and alerts in Operations Manager. With the AWS Management Pack, you can gain a deep insight into the health and performance of your applications running within the Amazon EC2 instances. The diagram view generated by the management pack makes it easy to traverse between the application and the infrastructure hosting it, with just a few clicks.

    You can monitor the following AWS resources using the AWS Management Pack:

    • Amazon EC2 instances (Microsoft Windows and Linux)
    • Amazon Elastic Block Store (EBS) volumes
    • Elastic Load Balancing
    • AWS CloudFormation stacks
    • AWS Elastic Beanstalk applications

    All the default Amazon CloudWatch metrics for these resources—and any Amazon CloudWatch alarms associated with them—are surfaced as performance counters and alerts in Operations Manager.

    The AWS Management Pack is available for System Center 2012 and 2007 R2. To learn more you can watch the Amazon-produced video on the management pack or download the AWS Management Pack.

  • May 9, 2013

    Remote Connectivity Analyzer Helps Track Slow E-mail

    Over the last few months, Microsoft has been busy with the Remote Connectivity Analyzer (RCA). Originally launched as the Exchange Remote Connectivity Analyzer, it gave administrators an easy way to check that remote access to Exchange was working properly. The tools have evolved over time to become a one-stop shop for connectivity testing for key Microsoft communications applications. You can now test connectivity to Microsoft Exchange, Office 365, Lync and Lync Online all from a single portal.

    Recently Microsoft has expanded the analyzer toolkit to include tools for testing internal access as well. Back in late March, Microsoft released the Microsoft Connectivity Analyzer, which gives end-users and admins the ability to run many of the RCA tests from their own desktop. This is an invaluable tool when you are troubleshooting a problem and are not sure whether it is desktop-related. They also released the Lync Connectivity Analyzer to help test and diagnose Lync connectivity issues from the desktop, complete with full logging capabilities on the output. It’s also a great tool for making sure you’ve set everything up right to support the new Windows RT and Lync 2013 Mobile Apps.

    This week Microsoft put out a fantastic update to the tool to give administrators yet another weapon in their troubleshooting arsenal. The Message Analyzer is an SMTP header analysis tool that can help dissect message headers (from Outlook or any other email client) so you can find out where a message came from, what hops it took to get there (very useful when someone complains of email delays), and any anti-spam or other x-header rules that may have affected its delivery, all in a format that is much more readable than the plain text you normally see in a header.
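    If you just want a quick look at the hops yourself before reaching for the tool, a rough sketch like the one below works on a header you have copied from Outlook (File > Properties > Internet headers) into a text file. The file name is my own choice, and the snippet does not unwrap folded header lines, so only the first line of each hop is shown:

    # List the Received: lines from a saved header file, newest hop first (file name assumed).
    Get-Content .\headers.txt | Where-Object { $_ -match '^Received:' }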

    If you haven’t used the Remote Connectivity Analyzer recently, go check it out. It’s my first tool of choice whenever connectivity problems are reported and it should be yours too!

  • May 8, 2013

    Publishing safe, sanitized content using Microsoft Word and Adobe PDF documents

    It is widely known that Microsoft Word (and specifically the Word .doc file format) is used for the preparation of documents, reports, notes and other formal and informal materials across the commercial, governmental and public sectors.

    A second format, the PDF document, is also used pervasively in the business and government sectors to exchange files, publish to the web and for interactive content such as forms and multimedia.

    The standard procedure for many years, across sectors, has been to convert Word documents to PDF format not only for ease of distribution, but also to prevent further editing while adding a layer of additional security to the document.

    In many cases this process is enough; however, simply saving these documents to PDF format does not strictly guarantee security. Additional steps are required for the complete removal of possible sensitive, redacted or hidden information.

    From the National Security Agency:

    “Despite this common use of PDF documents, users who distribute these files often underestimate the possibility that they might contain hidden data or metadata. This document identifies the risks that can be associated with PDF documents and gives guidance that can help users reduce the unintentional release of sensitive information.”

    Word Document Sanitization Basic Procedure

    1. Create a copy of the document
    2. Turn off reviewing features and remove associated data
    3. Review and delete sensitive content
    4. Check redacted content and run document inspector
    5. Verify Acrobat conversion settings and convert

    See Page 12 of this recommended NSA Document for detailed procedures
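    Parts of steps 2 and 4 can also be scripted if you sanitize documents regularly. The snippet below is only a rough sketch using Word’s COM automation interface (the file path is hypothetical, and it is no substitute for the manual review the NSA procedure calls for):

    # Strip tracked changes and removable metadata from a copy of the document.
    $word = New-Object -ComObject Word.Application
    $word.Visible = $false
    $doc = $word.Documents.Open("C:\temp\report-copy.docx")
    $doc.AcceptAllRevisions()            # resolve all tracked changes
    $doc.RemoveDocumentInformation(99)   # 99 = wdRDIAll: remove all removable document information
    $doc.Save()
    $doc.Close()
    $word.Quit()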

    PDF Sanitization Basic Procedure

    1. Sanitize the Source File
    2. Configure Security Settings
    3. Run Preflight
    4. Run the PDF Optimizer
    5. Run the Examine Document Utility

    See Page 19 of this recommended NSA Document for detailed procedures

    Additional Documentation

    Redacting with Confidence: How to Safely Publish Sanitized Reports Converted From Word 2007 to PDF: http://www.fas.org/sgp/othergov/dod/nsa-redact.pdf

    Hidden Data and Metadata in Adobe PDF Files: Publication Risks and Countermeasures: http://www.nsa.gov/ia/_files/app/pdf_risks.pdf

    Remove tracked changes and comments from a document: http://office.microsoft.com/en-us/word-help/remove-tracked-changes-and-comments-from-a-document-HA101822263.aspx

    Tools

    Doc Scrubber: http://download.cnet.com/Doc-Scrubber/3000-2079_4-12599674.html

     

    When releasing information to the public in a Word or PDF document, make sure that only the intended content is presented. Word’s ‘Inspect Document’ feature and Adobe’s ‘Examine Document’ tool supplement document review, but are not intended to wholly replace the redaction process. The sanitization processes outlined in this post reduce the likelihood of including hidden data, metadata and redacted content in the final Word or PDF file.