Keyword Tracking with Caphyon’s Advanced Web Ranking

I’ve been honing my skills over the last few months optimizing our e-commerce sites for search engines. It’s been a long, arduous task, one that requires the utmost patience. When things go wrong, it seems like the world is ending. But, as one of my favorite books famously advises: Don’t Panic!

While getting up to speed on search engine optimization, I’ve been looking for a tool to help me track our rankings for the keywords we’re going after. I had been using a huge Excel spreadsheet to track rankings on a bi-weekly basis, but it was a pain in the butt to update. I found some online tools that would let you track a few keywords, but nothing all that awesome. That, and I really wanted a desktop application for this. Don’t ask me why.

So, I was reading YOUMoz the other day and came across a post about the author’s troubles with MSN Live Search. She mentioned she used Advanced Web Ranking to track keyword positioning, so I decided it was worth a look. I downloaded the 30-day trial version and installed it on my MacBook Pro.

Setting it up with the keywords I was interested in for my site was a little time-consuming, but I didn’t mind that. I gave it a run and got a great breakdown of the keywords we ranked well for and the keywords we didn’t. This is cool, as now I can see where I need to put some work in. But that wasn’t the coolest part. The following day, I ran it again, and I could compare that day’s results to the previous day’s. Kick. Ass. Even better, it stores your results for each day, so you can see how you did over the course of a month, a quarter, or a year.

I find this to be a huge time saver because I no longer have to manage my rankings by hand. All I have to do is add new keywords I want to track. When I want to update my positioning results, I just click a “Play” button and away the tool goes. I highly recommend this application to anyone tracking their SERPs in any of the major search engines.

Has Google Tweaked Their Search Algorithm?

In mid-to-late November, I noticed traffic to one of my e-commerce sites dropped dramatically. Some quick research in Google Analytics showed that our Google organic traffic had dropped off significantly. This came as a huge shock to us because we had been cruising along really, really well for some important key terms that drive quality traffic to our sites. It also equates to lost dollars, since we aren’t getting the visits that could turn into conversions.

After some research and conversations with other web developers I know, I came to the conclusion that nothing I had done to our site should have caused the drop-off in traffic. What’s worse is that we seemed to have dropped out of the Top 100 for key terms we had previously ranked high in the Top 10 for. Really. Big. Problem.

So what happened? My guess is Google tweaked their search algorithm a little. My reasoning comes from the Google tools I use on a daily basis to manage my sites. Take a look first at the image below, which shows our Google organic traffic from November 1, 2008 through December 4, 2008.

What you can see here is that we were flying along at around 1,800 to 2,300 visits a day. Good stuff. Then around November 11, things dropped sharply. I kind of panicked here, but didn’t change anything on the site. Then on the 14th, we were back up again. Way up. For the next 5 days, we were up around 2,600 visits a day, even peaking over 2,700 one day. Then things fell apart again. We were down slightly. Okay, only one day. Don’t panic.

Then I did panic. On November 20th, we were down to just over 1,000 visits. Ouch. Over the next two weeks, we went as low as around 850 visits. This was a problem. What happened? Based on all the research I had done on my own and with other developers, I hadn’t changed the site in any way that should have affected our rankings. So what gives? Take a look at the image below, which shows our Google crawl stats.

See the huge spike around the second week of November? That’s about when things started to hit the proverbial fan. This spike looks like a deep crawl of our site by Google. I’ve never seen Google crawl our site this much all at once. To me, this indicates changes were afoot over at Google. What changes, who knows. Only Google does. What I do hope is that our site comes back up in the rankings, as it’s a well-built site that adheres to Google’s guidelines for building quality sites.

The bright spot here is that we’re starting to come back for some of our key terms, and our organic traffic from Google is looking better. At the same time, our traffic from MSN and Yahoo has continued to improve, which is another indication that Google changed something. Also, Google Webmaster Tools and Google Analytics can give you great insight into what Google sees of your web site as well as how people find and use it. I highly recommend using both if you don’t already.

Google AdWords Now Has Device Platform Setting

I just set up a couple of new campaigns in Google AdWords this afternoon and, for the first time, noticed that you can now select a device platform for your ad campaigns. The choices are desktop or laptop computers, or mobile devices with a full web browser, such as Apple’s iPhone. Click the screen shot below to see these settings:

This is definitely interesting, because there are surely instances where you’d want to keep your ads from showing up on a mobile device. I’m considering turning mobile devices off for our campaigns because we run an e-commerce site, and it’s really unclear at this point whether people are into shopping online from their mobile devices. I have no doubt this will become widespread behavior in the future, but with full-browser mobile devices just coming to market, I have a feeling the crowd that would shop from a mobile device is still a small niche.

Speeding Up Your Web Site with YSlow for Firebug

I’m always looking for an edge over our competitors to make our e-commerce sites better from a usability standpoint. One of the easiest ways to improve the experience is to make sure your site is responsive when people visit it, no matter what kind of connection or computer they have. I decided to do some research on how to improve our sites’ download times and came across YSlow for Firebug.

YSlow is a Firefox extension that plugs into the Firebug extension. Any developer who doesn’t use Firebug is really missing out, so if you don’t have it, get it. Anyway, you can install YSlow right into Firefox and access it through Firebug.

Upon analyzing our site the first time, we received a score of 42 from YSlow, which is an F. Ouch. That didn’t make me feel all that great about our site. You can see screen shots of our initial scores here and here. We scored really low on all but four of the thirteen performance criteria. I decided to attack the easiest tasks first: Minify JS, Add an Expires header, and Gzip components.

I minified our JavaScript files using a utility called JSMin. It basically removes all whitespace and line returns from your files. It doesn’t compress the code all the way, but I wanted it to remain a little readable in case I needed to look at the code on the live site.

Next, I wanted to handle adding an Expires header. Since we use ASP.NET and C# for our web application, I was able to write an HttpHandler to do this for me. Even better, I was able to handle the Expires header and another YSlow item, ETags configuration, in the same snippet of code. For each request, our handler adds an empty ETag and an Expires header three days in the future. Both are used to determine when a cached copy of a web page needs to be refreshed: the ETag lets the browser check whether the copy it has cached still matches what the server would send, and the Expires header sets how long the page can be served from cache before it’s considered stale.
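I didn’t include our code in the post, but here’s a rough sketch of the idea. I’ve written it as an IHttpModule, which is a common way to stamp headers onto every response; the class name, hook point, and three-day window here are illustrative assumptions, not our actual production code.

```csharp
using System;
using System.Web;

// Rough sketch (my own, not the post's code): a module that adds an
// Expires header three days out and a blank ETag to every response.
public class CachingHeadersModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.PreSendRequestHeaders += OnPreSendRequestHeaders;
    }

    private void OnPreSendRequestHeaders(object sender, EventArgs e)
    {
        HttpResponse response = ((HttpApplication)sender).Response;

        // Let browsers cache the response for three days before re-requesting it.
        response.Cache.SetExpires(DateTime.Now.AddDays(3));
        response.Cache.SetCacheability(HttpCacheability.Public);

        // Send an empty ETag so the Expires header alone controls caching.
        response.AppendHeader("ETag", "");
    }

    public void Dispose() { }
}
```

You’d also register a module like this in web.config under httpModules (or the IIS 7 modules section) so it runs for every request.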

Lastly, I wanted to GZip all of our components. This just required configuring our IIS server. You can also do it directly within your .NET application, but I didn’t see the value in that since IIS could do it for us.
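For anyone curious about the in-application route, here’s a rough sketch (again my own, not code from our app) of what it might look like in Global.asax: wrap the response stream in a GZipStream whenever the browser says it accepts gzip.

```csharp
using System.IO.Compression;
using System.Web;

// Rough sketch of compressing responses from within the application itself.
public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, System.EventArgs e)
    {
        string acceptEncoding = Request.Headers["Accept-Encoding"] ?? "";

        if (acceptEncoding.ToLowerInvariant().Contains("gzip"))
        {
            // Wrap the output stream so everything written to the response
            // is gzip-compressed on the way out.
            Response.Filter = new GZipStream(Response.Filter, CompressionMode.Compress);
            Response.AppendHeader("Content-Encoding", "gzip");
        }
    }
}
```

Letting IIS handle compression keeps this logic out of the application entirely, which is why we went that way.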

After implementing these changes and a few other mundane ones, I ran YSlow again. Lo and behold, we’d gone from a score of 42 to a score of 76. Not bad! We’re now scoring a high C according to YSlow. From a usability standpoint, I could definitely tell the site responded much faster than it did when we were scoring a 42. For those of you who would like to see screen shots of the stats, you can see them here and here. Looking at the numbers, you can see that we cut the data downloaded from 413.1k to 234k, a reduction of roughly 43 percent.

I strongly recommend that anyone developing web applications take a look at YSlow. You might not be able to make changes for every point it flags, but even two or three changes should net you some great improvements in the performance of your site.

Google’s New Promote and Remove Feature

I’m not sure how many people have seen the two new Promote and Remove icons next to Google search results. Apparently it’s a new way to say, “Yes, this result was useful to me,” or “No, it wasn’t.” I’m guessing Google wants to see what actual human beings think of the results it’s displaying. Obviously the Google engine is just that, an engine built on machines and code, so it’s not 100% perfect. Capturing the human judgment of whether a search result is useful is something they’d definitely want.

But that got me thinking: how useful is it to them? I’d think people are eventually going to try to abuse this to push down results for their competitors and promote their own. Granted, that all depends on how much weight Google gives the information and what they do with it. Perhaps they just want to gather it and not let it affect how results are actually displayed. Or maybe they do. Who knows. I haven’t seen much information on it other than people discussing what they’re seeing in various forums.

Whatever the reason, it will be interesting to see what it means for SEO down the road. I for one will be keeping a close eye on anything I see about this.

Update: So apparently this is all part of Google’s SearchWiki. When you’re logged into your Google Account, you’ll see these nifty buttons, which let you customize your own search results. That doesn’t mean the index itself is affected. However, I’d still bet my lunch that Google eventually uses these human-“modded” results when weighing its own rankings. Hopefully that design keeps abuse to a minimum.

What’s Up with Irregular and Inconsistent Google Search Results?

I’ve noticed some wackiness (at least what I consider wacky) with Google search results lately. We’ve been working slowly but surely on improving our rankings for one of our sites. We haven’t been making any sweeping changes, but instead making small tweaks here and there to title tags and meta descriptions, adding relevant content to our pages, and getting our pages linked to from other relevant sites.

What I’ve noticed over the last week or so, though, is that a couple of times a week we’ll drop off the face of Google’s search results for one of our top terms. It’s not like we’re falling from #3 to #10 or from Page 1 to Page 2; we’re falling off the results map altogether. What’s even weirder is that a couple of days later, we’re back up to where we were before the “hiccup”.

We’ve also noticed that search results at any given time of day can vary greatly. We can show up ranked #2 or #3 for a top key term, then later in the day be at #9. Or we perform one search and we’re #3, then immediately search again and we’re #6. Sometimes I can search for a phrase and get one ranking while a co-worker doing the same search gets a completely different one. I’ve been trying to figure out why this happens, but I keep coming up empty.

I never expect to keep rankings forever, as the web changes almost constantly, but you’d think you’d get at least some consistency in search results, especially for a site that is fairly well built and adheres to what Google calls best practices. What I’m really confused by is the wholesale change to our rankings for certain keywords in one fell swoop. I’d expect to see rankings slip gradually, not disappear altogether.

It could very well be that all of this just reflects my incomplete understanding of how Google search results and rankings work. I’m not a complete newb to SEO, but I’m not an expert either. If anyone can enlighten me about what I’m seeing in our search results, I’d certainly be grateful.

ASP.NET HyperLink Control and HTML-Encoded Ampersands

I just ran into some odd behavior with the HyperLink control in ASP.NET. Per the W3C, you’re supposed to HTML-encode ampersands, using &amp; instead of a bare ‘&’ when building URLs in your HTML code. The reason is that a bare ‘&’ is assumed to start an entity reference. Most web browsers can recover from this type of error, but if you want your site to pass validation, you need to use &amp;.

So I hooked up all of our URLs to use this convention, especially where we write out URLs from our C# classes. What I found odd was that if I did this using a HyperLink control instead of an HtmlAnchor control, .NET would write the literal text &amp; into the URL instead of ‘&’. Naturally this broke our site, as query string references weren’t parsed properly. The fix was to use an HtmlAnchor instead.

I’m not really sure why .NET does this or whether there’s another workaround, but this solution worked for me. I’d be curious to know the reason behind the behavior, though.
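For what it’s worth, here’s a rough sketch of a repro based on the behavior described above. The control names, page name, and URL are made up, and my working guess (unverified) is that HyperLink HTML-encodes NavigateUrl when it renders the href attribute, so an already-encoded URL ends up encoded twice.

```csharp
// Hypothetical repro of the behavior described above. The markup side would be:
//   <asp:HyperLink ID="ProductHyperLink" runat="server">Products</asp:HyperLink>
//   <a id="ProductAnchor" runat="server">Products</a>
using System;
using System.Web.UI;

public partial class ProductsPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // URL with the ampersand already HTML-encoded, per the W3C guidance.
        string encodedUrl = "/products.aspx?cat=5&amp;sort=price";

        // The HyperLink control appears to encode NavigateUrl again on render,
        // so the browser ends up requesting a URL containing a literal "&amp;"
        // and the query string no longer parses.
        ProductHyperLink.NavigateUrl = encodedUrl;

        // The HtmlAnchor writes HRef out as-is, so this link keeps working.
        ProductAnchor.HRef = encodedUrl;
    }
}
```

If that guess is right, another workaround (which I haven’t tried) would be to hand the HyperLink control the raw ‘&’ and let it do the encoding itself; the HtmlAnchor swap is what worked for us.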

Google’s Clean Energy By 2030 Initiative

Right on the heels of my post about the wind turbine here in Worcester, MA, I just read a great article on Google’s Philanthropic Blog about what they’re doing to get our nation moving faster toward renewable and cleaner energy. It’s great to see such a large, household-name company take the initiative in helping us get to where we need to be. A lot of corporate and personal responsibility will be needed in the near future to get us off the fossil fuels that dominate our fragile economy. For the most part, I think Google is on the right path…

Wind Turbine Goes Online in Worcester, MA

This probably isn’t that interesting to most other people in the world, but it is to me, since I live in Worcester, MA and I love hearing about clean energy making progress. It’s something that just makes sense, and we’d be foolish not to embrace it. Anyway, Holy Name Central Catholic Junior Senior High School did the research and actually followed through on building a 262-foot-tall wind turbine on their campus here in the city. You can just barely see it pop above the tree line as you drive down 290 westbound through the city. I saw it the other day and couldn’t believe what I was seeing right in my own back yard. I’d be lying if I said I wasn’t excited to see it.

What I find particularly exciting about this is that Worcester is known for its weather. We get all sorts of weather patterns coming through because of the city’s central location in Massachusetts, and there always seems to be at least a slight breeze, with especially windy stretches in the spring, fall, and winter. So this type of renewable energy makes total sense for our city, and I’d like to see more of these pop up around it. They don’t look ugly if you ask me, and with the abundant wind we have here, using it to help power the city makes perfect sense.

Building Ecometry Shipping Stations Redux

I wrote about building an Ecometry shipping station on your own over a year ago, and a few people have since built their own using that guide, which is great. So when we integrated UPS and were given some new Dell computers as part of a UPS subsidy (which was really cool), I decided to build two more and share my experience again.

Everything went pretty much as it did last time, except that the new computers don’t have PS/2 ports, just USB, so our older scanners no longer work with the new hardware. The configuration is as follows:

  • Dell Optiplex 740 Desktop
  • Zebra S4M Direct Thermal Printer
  • Mettler Toledo PS60 Scale
  • Symbol LS2208 Barcode Scanner

I still had to change the settings on the COM1 port to work with the scale; the settings can be found in my original post here. I also had to set the scale’s protocol to Mettler Toledo, which you can easily do by following the instructions on the CD that comes with the scale. Thanks to Chuck on the Ecometry Google Group for that tip. You’ll also want to be sure the baud rate and stop bits settings on the scale match what you set on the COM port.
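If you want a quick way to confirm the port and the scale agree before firing up the shipping software, a tiny console program can help. This is purely a hypothetical sketch: the port settings and the weight-request command below are placeholders, so substitute the values from your own configuration and the scale’s manual.

```csharp
using System;
using System.IO.Ports;
using System.Threading;

// Hypothetical sanity check for the serial connection to the scale.
// All settings below are placeholders, not the values from my original post.
class ScalePortCheck
{
    static void Main()
    {
        // Baud rate, parity, data bits, and stop bits must match the scale's settings.
        using (SerialPort port = new SerialPort("COM1", 9600, Parity.Even, 7, StopBits.One))
        {
            port.Open();

            // Send whatever weight-request command your scale's protocol defines
            // (check the scale's manual), then give it a moment and read the reply.
            port.Write("W\r");
            Thread.Sleep(500);
            Console.WriteLine(port.ReadExisting());
        }
    }
}
```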

The Zebra S4M printer will work just fine with UPS-provided labels. If you don’t have those, get direct thermal labels; you don’t need a ribbon (and the printer isn’t configured for one from UPS anyway). Ecometry will tell you that only the Z4M printers work, but the S4M works just fine, which is great because it costs about half as much as a Z4M.

And remember, there are no PS/2 ports on these newer computers, so there’s no support for older scanners such as the PSC Powerscan PSSR-0000 or PSSR-1000, which just aren’t compatible with USB. You could perhaps get them to work with a PCI add-in card such as this one and some AT-to-PS/2 converters, but I didn’t want to spend a bunch of extra money just to hack the thing together. It seemed like a better idea to get all new hardware for these stations.

We’ve been using these new stations for a few days now and they’re working great. Feel free to drop me a line about building these. You can definitely save yourself a bunch of money building these on your own instead of going through Ecometry’s provider, Agilysys.