Archive for the ‘Resources’ Category

A couple Google Webmaster Tools tips and tricks

Wednesday, March 25th, 2015

Too few webmasters move beyond Google Analytics, which is a pity. Google Webmaster Tools is all you really need to become a Google whisperer. Why spend all that time and money on MOZ and Woorank when you could hear it straight from the horse's mouth?

I have been using it a lot lately for SEO strategy work, and I have had some serious fun working through the tools that Google practically begs you to adopt. Adopting them makes things easier for Google, and therefore better for you, the site owner or manager.

This week, we’ll discuss International Targeting and HTML Improvements.

Location, location, location (International Targeting)

Be realistic. I know the Internet is global and you may well have a distant client on foreign shores, but it is worth getting as geographically specific as possible. Do it for Google.  There are two ways to signify your language and location to Google: hreflang tags and Country Targeting.  That said, if you happen to purchase a clever domain like, which is an Icelandic top-level domain, Google doesn't give you much of a choice.
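As an illustration, hreflang tags are plain link elements in the page's head. Everything below is a hypothetical sketch: the hostnames are placeholders, not real sites.

```html
<!-- Hypothetical example: declare that this page targets English-speaking
     US visitors, with an Icelandic-language alternate version. -->
<link rel="alternate" hreflang="en-us" href="http://example.com/" />
<link rel="alternate" hreflang="is" href="http://example.is/" />
```

Country Targeting, by contrast, is set per site inside Webmaster Tools itself, unless a country-coded domain has already made the decision for you.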


Whoops. I plan to shift my primary domain from to in order to be able to target the US in my Google Webmaster Tools. While I admire the fine, fair, high-cheekboned people of Reykjavik, I don’t know if they’re my ideal geographic target.

Google Just Wants to Help You Help It (HTML Improvements)

I feel like folks are spending time and money using both free and paid tools like MOZ and Woorank when Google is pretty explicit about what it wants and needs from you.

Google does care about Title and Description duplication and is happy to help you work through it.

Google is looking for Goldilocks titles and descriptions: content that is neither too short nor too long, but just right.
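As a rough sketch of what "just right" can look like (the lengths below are approximate rules of thumb, not official limits, and the title and description text is invented for illustration):

```html
<!-- Hypothetical example: a title of roughly 50-60 characters and a meta
     description of roughly 150-160 characters tend to display untruncated. -->
<title>Google Webmaster Tools Tips and Tricks | Example Site</title>
<meta name="description" content="Learn how to use International Targeting and HTML Improvements in Google Webmaster Tools to tell Google exactly what your site offers and who it is for.">
```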

Since Google explores your site using robots, spiders, and bots, its science is inexact.

Sometimes duplication isn't duplication at all but stems from canonical URLs, aliases, trailing slashes, and other URL variations.

Strangely enough, it's 2015 and Google still requires us to submit both our plain domain and our URL with a WWW subdomain.  So sometimes duplication isn't duplicate content at all but an issue with how your site's domain is set up.
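A minimal sketch of the canonical fix, assuming the www version is the one you want indexed (example.com and the path are placeholders):

```html
<!-- Hypothetical example: whichever variation a visitor lands on
     (trailing slash, query string, non-www), this tag tells Google
     which URL is the authoritative one. -->
<link rel="canonical" href="http://www.example.com/page/" />
```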

I have three more subjects to discuss next week: Google Only Speaks Structured Data (Sitemap.xml), You Can Lead a Google to Structure (Data Highlighter), and Are You Fast Enough? (PageSpeed Insights).  I look forward to continuing our exploration of Google Webmaster Tools.

Good luck! Go get 'em, Tiger!


New Web Features Added To Visual Studio 2013

Wednesday, August 20th, 2014

Microsoft recently announced Update 3 for the RTM version of Visual Studio 2013, which includes some new features and fixes some bugs. (more…)

Most Popular WordPress Plug-ins

Wednesday, June 19th, 2013

One of the main reasons I love WordPress as a content management system and blogging platform is the availability of free plug-ins. While WordPress is an awesome platform to manage a web site right out of the box, plug-ins enhance user experience and site performance. Some plug-ins are even designed to help secure your WordPress application from hackers.

Integrate Google+ Sign-In Into Your Google Drive Apps

Wednesday, March 6th, 2013

So, you've built a Google Drive app with the latest HTML5 technologies. Now you want to integrate Google+ sign-in into said app to provide a secure experience for your users. The Google Developer team has your back with a new tutorial:

ScriptDb Gets A Crash Course Video From Google

Thursday, February 21st, 2013

Google has uploaded a new Google Apps Script “Crash Course” video to its Developers YouTube channel. The video takes a “deep dive” into ScriptDb (a JavaScript object store built into Apps Script), shows examples, and discusses best practices for organizing data.

Google’s New Structured Data Testing Tool!

Wednesday, September 26th, 2012

Google has launched a new rich snippet testing tool called the Structured Data Testing Tool. It makes it much more convenient for webmasters to build structured-data-enabled sites, and rich snippets displayed in the testing tool now more closely match how they will show up in search results. (more…)

How Can I Embed Search Results On My Own Page?

Wednesday, March 30th, 2011

I’ve read and enjoyed your articles on adding Twitter / Google search boxes to a website. I am trying to figure out a way to have one search box on my site that displays results (in a multi-column horizontal table) for multiple sources, particularly Twitter / Google / YouTube. Any help would be appreciated. Thanks!


Dos and Don’ts of Creating a First-Class Home Page

Wednesday, March 16th, 2011

If you didn’t already know, your home page is the most important page of your site. Why? Because it is the most indexed page by Google and is the gateway to the rest of your site. It is therefore important that your home page has the right content to encourage visitors to click through to other pages of your site.

Your home page must let your visitors know that they're at the right place and that you offer what they want. To help you achieve this, I wanted to share a recent video from Success Works that outlines the dos and don'ts of creating content for your home page.


Making Your PSD File Ready For HTML Slicing

Wednesday, February 2nd, 2011

If you’re designing a custom landing page, chances are that you’re using a PSD to HTML chop shop, like my friends at PSD to HTML/CSS. The problem is that things you may obviously require won’t be obvious to the coders. Here’s a checklist of things to specify in your order to minimize revisions and save time. (more…)

How To Get Your Mobile Site Indexed

Wednesday, December 8th, 2010

So you've built a mobile version of your website to target iPhone, Android, and Blackberry users. Congratulations! This can do wonders for mobile traffic and subsequent conversions, especially when optimization includes local factors. The next step is to properly inform search engines of the existence and dimensions of the site:

  1. Create Webmaster accounts for Google, Yahoo!, and Bing
  2. Submit site URL to search engines
  3. Create and submit Mobile XML Sitemaps
  4. Create and submit Mobile Robots.txt files

If you don't already have webmaster accounts set up for your regular site, do it now! They're free, lend great insight into site performance, and flag any indexation issues.

From the webmaster accounts for your desktop site, you can submit your mobile URL by selecting "add a site" as you would any other site. If the mobile site lives at another sub-domain ( or TLD (, the site will have to be verified. However, if the mobile site lives in a subfolder of the regular company site (, verification is unnecessary and the mobile folder will still be segmented as a separate account.

XML sitemaps are an important feature to implement: they indicate all of a site's pages to search engines very efficiently. Since mobile sitemaps use a unique mobile tag in the XML code, mobile and desktop XML sitemaps should be kept distinct from one another.
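For reference, a minimal mobile sitemap looks like a standard one plus the mobile namespace and an empty mobile tag on each URL entry; m.example.com below is a placeholder hostname:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://m.example.com/</loc>
    <!-- marks this URL as serving mobile content -->
    <mobile:mobile/>
  </url>
</urlset>
```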

Sitemaps can be easily created using one of a slew of online automated XML sitemap generators. Although most online generators provide standard sitemaps free of charge, there is normally a fee for mobile XML sitemap creation.  We recommend the Sitemap Generator V3.0 from, which costs $19.99. This will allow you to set change frequency and priority levels of pages on your mobile site.

A reliable free alternative is to write the sitemap out by hand (realistic if only a relatively small number of pages need to be listed). is a great resource for understanding the XML Sitemap Protocol.

Once the mobile sitemap is complete, upload it to the root folder of your mobile site, test it with an online sitemap validator, and submit it to each webmaster account. Read more about Google's mobile XML sitemap standards.

Although it was customary a couple of years ago to block the desktop site in the mobile robots.txt file and vice versa, this is no longer necessary, since desktop URLs now normally show up in mobile search results. However, mobile robots.txt files should still be created to block any internal site-search results and private data, and these should also be submitted to each webmaster account.
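A minimal sketch of such a mobile robots.txt; the Disallow paths below are hypothetical and should match wherever your site-search results and private data actually live:

```
# Hypothetical mobile robots.txt, served from the root of the mobile site
User-agent: *
# block internal site-search result pages
Disallow: /search/
# block private data
Disallow: /private/
```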

As we move in the direction of web ubiquity, mobile search engines have become more inclined to list "desktop" URLs in search results, leaving it to site owners to implement user agent detection in order to redirect mobile users accordingly. In my next article, I will discuss the ins and outs of user agent detection as it relates to mobile search. In the meantime, set up your webmaster accounts and submit your mobile site!