Replacing The Fans In My 17″ MacBook Pro

I’ve had my 17″ MacBook Pro Core Duo for 2 years now and I don’t have a single bad thing to say about it. However, starting probably 6 months ago, the left fan started to make a lot of noise. It sounded like the bearings in the fan itself were going. What was worse was that even though the left fan’s speed seemed right compared to the right fan’s, the MacBook got really, really hot on the left side. I put up with it for quite a while, but I finally gave in and decided I needed to do something about it.

Generally I’m not sketched out by taking a computer apart, but my MacBook was a little different. I didn’t want to ruin the case or anything like that, and I certainly didn’t want to fry any of the tiny components inside it. This computer is basically my lifeline to everything I do for development, both freelance and full time, so I couldn’t afford to kill it, as I don’t have a reliable backup for it (something I’m currently looking at resolving). So, even with the slight fear of messing up my MacBook, I trudged along.

I bought my new fans for $39.95 each plus S&H over at ifixit.com. Even better, they have two pretty good guides on replacing the fans in my MacBook. The left fan instructions are here and the right fan instructions are here. I was able to follow the right fan instructions to a T. The left fan instructions were a little different for me. First, I had to remove the AirPort card. That wasn’t too difficult, as it’s held down by a Torx screw. Also, the left speaker was actually screwed down in my MacBook. It didn’t appear to be in the one used in the guide, as the instructions said to just lift it up. I couldn’t do that and was glad I saw the screw and didn’t force anything (that’d have been bad!).

In all, it took about an hour to get the MacBook apart, the fans replaced, and the case put back together. I definitely took my time, as I didn’t want to mess up, and I was very careful not to touch anything I didn’t have to inside the computer. I’ve included some interesting photos I took as I did my repair, which are below. One thing I did that I highly recommend: as you take the screws out of the case and other parts of the MacBook, put them on a white piece of paper and label them, or do the same with some plastic baggies. A lot of the screws look the same, and you definitely want them to go back in the proper place.

If you have fan trouble in your MacBook and some technical ability, I definitely think you can do this repair yourself instead of paying Apple or another Apple repair shop to do it for you. I’d imagine you’d pay over $100 in labor plus the cost of parts to have them do it.

MacBook 17" Top Case Off MacBook Pro 17" Top Case Off Overhead View

MacBook Pro 17" Labeled Parts MacBook Pro 17" New Fans

MacBook Pro 17" Airport Card MacBook Pro 17" Airport Card Removed

MacBook Pro 17" Right Fan Removed MacBook Pro 17" Old Fans Removed

Apple MacBook Pro 17″ Core Duo “Swollen Battery”

I’ve been having some issues with power to my Apple MacBook Pro 17″ Core Duo laptop recently. There have been issues with the fans for quite some time, especially the left one, so I thought that might have been part of the problem. The left fan was especially crunchy. So I ordered two new fans for it and went through the replacement process on my own (details to come in an upcoming post). That didn’t do the trick.

Later on that night, my MacBook shut itself off again. Really annoying. I flipped it over to look at the battery meter and noticed that the edge of the battery was sticking up above the edge of the battery compartment. I popped it out and, lo and behold, it was starting to swell. You could definitely feel that some of the cells were expanding and thus pushing the “aluminum” plate off the battery. Having seen posts about this same problem online before, I did some hunting and found this article over at Apple.

I had already installed the Apple Battery Update 1.2, so I decided to roll the dice at my local Apple Store and see if I could get it replaced. My MacBook is out of warranty, so I was a little skeptical, but figured if it saved me the $129 for a new battery, it was worth a shot.

MacBook Pro Swollen Battery

So off to the Apple Store at the Natick Collection in Natick, MA I went. The Mac Genius I spoke to, Ray, was very, very nice, and as soon as I showed him the battery he said he’d be more than happy to provide a replacement at no charge even though the MacBook is out of warranty. Even better, the new battery carries its own 1-year warranty, so I should be set if I have any more problems.

MacBook Pro Swollen Battery

I think the model I had was A5389, serial number 6N7161H9WX4A. I’m not 100% sure when it was manufactured, but the date on it was 2006. Anyway, if you’re having similar issues with your MacBook, I definitely suggest that you go to the Apple Store and have them take a look. You might save yourself $129 in the process.

Migrating a pMachine Website to WordPress

Anyone who has used or is still using pMachine for their website and forum knows that it’s outdated and a pain to use. Today, there are so many alternatives for content management, blog management, and forum management that you can find the tool that works best for you. A few years back, EllisLab replaced pMachine with ExpressionEngine, which, from the little I’ve used it and heard from others using it, is way better than pMachine. But, being a WordPress fan, I wanted to use that instead of ExpressionEngine. That and WordPress is free, which is a plus. The WordPress developers have also come up with a great forum engine, called bbPress, so I decided that would be a great replacement for the pMachine forum.

So now I knew what software I wanted to use, but the first thing I had to figure out was how easy it would be to migrate our pMachine data over to WordPress. No good importer for pMachine comes with WordPress, so off to my good friend Google I went for answers. Pretty quickly I found two articles: one about migrating pMachine to WordPress 1.5 here, and one with some actual PHP code to import the pMachine blog here. So, I was off to a good start.

Since bbPress can integrate directly with WordPress, I needed to use WordPress version 2.5 so that the cookies between the two applications would work. Apparently WordPress changed their cookie implementation in newer versions, so until bbPress gets an update, we’re stuck with the old WordPress version. There aren’t any known security risks with WordPress 2.5, so we’ll be ok. I installed both applications and then went to importing my pMachine data.

Based on the PHP scripts I downloaded from the article mentioned above, I ended up writing a bunch of my own PHP scripts to import all of our user data, post data, comment data, and forum data into WordPress 2.5 and bbPress. I didn’t want to install WordPress 1.5 first, so I took the code snippets I needed and went to work.

It took the better part of a week to import and test everything and make sure all of the proper relationships between users, posts, and comments were set up correctly. I’m currently packaging the PHP scripts up to release to the world under some public license, so everyone else who was in my position will have a solution. I also want to get the final version of GhostDroppings up and running and make sure my users are happy with the migrated data, but so far everything looks good. So check back in a bit to download the code!

ASP.NET Site whitehouse.gov Reviewed

So I was over on Slashdot last night looking for something interesting to read and ran across this tidbit about the new whitehouse.gov site, which runs on ASP.NET. Honestly, I think the only reason this got mentioned on Slashdot is that yesterday was Inauguration Day. Any other day and it probably would have fallen through the cracks. Anyway, I decided to take a look at the pointers the author brought up to see if there was something I could learn. Most of the improvements I already knew about, but a couple were new to me. Two that I’d like to consider implementing in my sites are:

  • Remove the X-AspNet-Version header (saves about 30 bytes per request)
  • Use compressed jQuery from Google’s servers (lower latency and better performance)

The one issue with using jQuery from Google is that you’re tied to whatever versions they’re hosting. If a newer version comes out and you want to use it but Google doesn’t have it yet, you’re out of luck. Luckily I use jQuery 1.2.6, so this isn’t an issue for me.
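For reference, here’s roughly what both changes look like (a minimal sketch: the enableVersionHeader attribute on httpRuntime handles the header in ASP.NET 2.0 and up, and Google’s AJAX Libraries CDN hosts jQuery 1.2.6):

    <!-- Web.config: stop ASP.NET from emitting the X-AspNet-Version header -->
    <system.web>
      <httpRuntime enableVersionHeader="false" />
    </system.web>

    <!-- Page markup: pull jQuery 1.2.6 from Google instead of your own server -->
    <script type="text/javascript"
            src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>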

As for the author’s review, I thought it was pretty good. A real-world example is always a great illustration of what to do and what not to do. I’d have liked to see him suggest how to fix one of the issues he found, the ViewState issue, as that would have been useful for other developers making the same mistake.

ViewState is still necessary for an ASP.NET application, but you don’t have to pass the entire ViewState back to the client. It’s a waste of bandwidth (and can cause some nasty Base64 exceptions). One solution I use is to store the ViewState on the file system and only pass a file name back to the client for reference later on. That takes up a lot less bandwidth than a potentially huge ViewState. Other options are storing it in memory, in the database, or by some other means. We clean out our old ViewState files every 3-4 hours, as many of them aren’t needed after that point (I’m thinking this might make a good article in the future).
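To make that concrete, here’s a stripped-down sketch of the file-system approach, assuming a base page class your pages inherit from. The hidden field name and storage path are my own placeholders, not a standard:

    using System;
    using System.IO;
    using System.Web.UI;

    // Base page that stores ViewState on the server's file system and sends
    // the client only a small key instead of the full Base64 blob.
    public class FileSystemViewStatePage : Page
    {
        // Hypothetical storage location; use a path your app pool can write to.
        private static readonly string StateDir = @"C:\AppState\ViewState";

        protected override void SavePageStateToPersistenceMedium(object state)
        {
            string key = Guid.NewGuid().ToString("N");
            LosFormatter formatter = new LosFormatter();
            using (StreamWriter writer = new StreamWriter(Path.Combine(StateDir, key + ".vs")))
            {
                formatter.Serialize(writer, state);
            }
            // The client round-trips only this key in a hidden field.
            ClientScript.RegisterHiddenField("__VIEWSTATE_KEY", key);
        }

        protected override object LoadPageStateFromPersistenceMedium()
        {
            string key = Request.Form["__VIEWSTATE_KEY"];
            // Guard against tampering and path tricks before touching the disk.
            if (string.IsNullOrEmpty(key) || key.IndexOfAny(Path.GetInvalidFileNameChars()) >= 0)
                throw new InvalidOperationException("Missing or invalid ViewState key.");

            LosFormatter formatter = new LosFormatter();
            using (StreamReader reader = new StreamReader(Path.Combine(StateDir, key + ".vs")))
            {
                return formatter.Deserialize(reader);
            }
        }
    }

A scheduled task can then sweep the storage directory and delete files older than a few hours, which is essentially the 3-4 hour cleanup I mentioned above.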

Another example that kind of irked me was the author knocking the site’s use of meta keywords. Uhm, yeah, they might not be as relevant anymore for Search Engine Optimization, but it’s still not a bad thing to have them in there. Just keep it to 200-300 characters. Nothing too crazy.

One last thing he pointed out that I just didn’t agree with was knocking the server for running IIS 6.0. Now, correct me if I’m wrong, but IIS 7.0 is only supported on Windows Server 2008, right? Well, IT departments have budgets and all, so maybe a Windows Server 2008 license or the hardware to install it on wasn’t available. I know in my case the budget doesn’t always allow for the latest and greatest, even if I want to use it. So knocking the development team for using IIS 6.0 seems a little over the top if you ask me.

This entire site could definitely use some improvements, which the article nicely points out. To go one step further, I suggest any web developer install the YSlow add-on for Firebug in Firefox. It came in handy for me when I was trying to optimize my sites. The whitehouse.gov site’s YSlow score is a 51 (an F), which is horrible. When I started using YSlow, I noticed some of my sites had a similar score, which appalled me, so I went to work fixing things pronto. By implementing what changes I could from YSlow’s suggestions, I got us up to a 78, a high C. Some changes you can’t make (like putting JavaScript at the bottom of pages or using a Content Delivery Network) due to how your application works, and making them just for a minor improvement is more trouble than it’s worth. However, there isn’t any excuse for an ASP.NET site that scores so low. Those folks over at whitehouse.gov definitely need to clean things up!

URL Rewriting in ASP.NET – ISAPI Rewrite vs. UrlRewritingNet

Seeing your web site rank well in the major search engines (Google, Yahoo, MSN) is something every web developer strives for, and making sure your site’s pages are search engine friendly is a huge part of that effort. With ASP.NET, pages typically end in the .aspx extension. While that isn’t bad for SEO, most ASP.NET pages are also dynamic, and a dynamic page rendered from a bunch of query string variables doesn’t get you anywhere with SEO, especially if you use a lot of them. So for a while now, we’ve used ISAPI Rewrite to rewrite our dynamic URLs into something a lot more user friendly. This ISAPI extension isn’t free though. It costs $99 to purchase a license. Not that bad, right?

Well, recently I discovered an open source solution called UrlRewritingNet. It was originally developed in 2006 for the 2.0 framework, though the developers claim it will work all the way up to the 3.5 framework. I’m not sure how much further development is being done, as the last release was in August of 2006. It is open source, though, so theoretically you can download the source and make modifications yourself if you need to.

UrlRewritingNet integrates directly into your ASP.NET web application as an HttpModule. With each incoming request, the module checks whether the URL requested is a rewritten URL. This isn’t too different from ISAPI Rewrite, except that ISAPI Rewrite handles the rewrite higher up the stack, in IIS itself, rather than in the web application. Using one solution over the other shouldn’t make much difference from a performance standpoint, but tests would need to be run to prove it. That’s a task for another time, however.
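For anyone who hasn’t seen how the HttpModule side of this works, here’s a bare-bones illustration of the idea. This is my own sketch with a made-up URL pattern, not UrlRewritingNet’s actual code (its rules live in Web.config, as I get into below):

    using System;
    using System.Text.RegularExpressions;
    using System.Web;

    // Minimal rewriting module: maps a friendly URL like /products/42.html
    // onto the real dynamic page before ASP.NET handles the request.
    // Registered with an <add> entry under httpModules in Web.config.
    public class SimpleRewriteModule : IHttpModule
    {
        private static readonly Regex ProductUrl =
            new Regex(@"^/products/(\d+)\.html$", RegexOptions.IgnoreCase | RegexOptions.Compiled);

        public void Init(HttpApplication app)
        {
            app.BeginRequest += delegate(object sender, EventArgs e)
            {
                HttpContext context = ((HttpApplication)sender).Context;
                Match m = ProductUrl.Match(context.Request.Path);
                if (m.Success)
                {
                    // The browser keeps the friendly URL; the server executes the .aspx.
                    context.RewritePath("~/product.aspx?id=" + m.Groups[1].Value);
                }
            };
        }

        public void Dispose() { }
    }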

The one drawback, for me anyway, is that UrlRewritingNet requires all of the rewrites to be defined in the Web.config file. I’m not a huge fan of that. Web.config gets cluttered up enough as it is, so the more rewrites you need, the bigger the file is going to get. With ISAPI Rewrite, all of the rewrites are stored in a separate file called httpd.ini, much like rewrites in Apache. On the plus side, it looks like you can extend UrlRewritingNet by developing your own rewrite rule providers, so if you need some functionality that isn’t provided, you can hook up what you need yourself.

UrlRewritingNet looks pretty promising and I hope I have some time to check it out and see if it is a good substitute for ISAPI Rewrite.

Linksys WRT54G as a Wireless Repeater Using dd-wrt

Recently, I discovered I had a weak wireless signal in certain areas of my home. My first instinct was to check out Best Buy or some other online shop for a wireless repeater, only to find that I’d be spending a minimum of $90 to get one. Yuck. Double yuck. I wanted a cheaper solution. Yeah, I might have found something cheaper on eBay, but I was also impatient and didn’t want to wait.

I had this Linksys wireless router kicking around and wondered if someone knew how to turn it into a wireless repeater. After all, isn’t the technology behind a wireless router and a repeater basically the same? After a quick Google search, I found there was a solution: the dd-wrt Linux-based firmware. I was familiar with it already, having used it on another router after reading this article on turning your $60 router into a $600 router. So I decided to give it a shot.

Pretty much all of the instructions I needed I found here. Basically you have to:

  • Disable the firewall
  • Choose the network you want to repeat
  • Create a virtual network to connect to

All said and done it took me about 20 minutes to set it all up, mostly because I didn’t want to brick my router, even if it only cost $50 and I wasn’t using it anymore. I’m not a fan of breaking stuff. Anyway, now I have great signal strength no matter where I am in my house, which is key. If you’re having a similar problem, I highly recommend trying this out, especially if you have the hardware lying around unused.

Minimizing Downtime When Deploying ASP.NET Web Applications

One of the annoying things I find with managing our ASP.NET web applications is that I need to “shut down” our sites when deploying a new version of our assemblies. It’s not that I’m worried about visitors seeing our maintenance page for 20-30 minutes, but when search engines spider our site during this downtime, they can’t find pages they knew about before. This is especially annoying when I analyze my sites in Google’s Webmaster Tools and see a bunch of HTTP errors because Google couldn’t reach a page while the site was unavailable. Since Google puts a high regard on site availability and quality when determining rankings, I’d like to avoid this.
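One mitigation I’ve read about, though I haven’t rolled it out yet so treat this as a sketch, is to have the maintenance page answer with HTTP 503 and a Retry-After header. Spiders then see the outage as temporary instead of recording dead pages:

    using System;
    using System.Web.UI;

    // Maintenance page code-behind: tell crawlers the downtime is temporary
    // instead of letting them record hard errors for every page they try.
    public partial class Maintenance : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            Response.StatusCode = 503;
            Response.StatusDescription = "Service Unavailable";
            // Hint at when to come back, in seconds (30 minutes here).
            Response.AppendHeader("Retry-After", "1800");
        }
    }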

Deploying the site is actually very simple. We use the XCOPY method of pushing up web forms, controls, and assemblies. But if you just start overwriting files on the live site, users get errors or inconsistent pages. And if any database changes need to be made, the old code may not function properly once the schema changes but before the new site is up. Any of these problems would feed the Google issue I mentioned above as well. Not only that, but any developer worth his/her paycheck tests their changes in the live environment anyway. So just tossing up the new version is no good.

I’ve been considering a few different solutions to this, but I haven’t come up with something that solves the issue completely. At some point, I still have to take down the site.

One solution I thought of is to set up a staging server to deploy my changes to and run my tests there. Once I’m satisfied with the results, I could push to the live site, minimizing the downtime. I figure the maximum downtime with this approach would be 5-10 minutes, depending on whether I had to make any database changes. Not a perfect solution, but better than the 20-30 minutes I’m experiencing now.

Another solution I thought of was to set up a web farm. I could deploy the new version to one web server, make sure everything is good to go, then deploy it to the second web server. Users could still visit the site because at least one server in the farm could handle incoming requests. But this wouldn’t work as well if database changes needed to be made; the site itself would still have to come down.

So right now, solution #1 appears to be the best approach and the easiest to implement. Maybe I’m making a bigger deal of this than I need to, but I think everyone wants to minimize their site’s downtime. The one reason I’m holding back on changing my approach is that I don’t know how much weight Google or other search engines give to a site being unreachable for a short period of time. Regardless, I’m curious what solutions other developers and web teams use to deploy their ASP.NET applications while minimizing downtime. I’m positive there is a solution to my problem; I just haven’t thought of it yet. If anyone has something that works for them, please chime in here!

Nextopia Search Integration

We’re on our third search integration in the last few years. We used to use SLI Systems for search. They have a phenomenal service, but it’s a little more expensive than our small company can afford. So, last year, we decided to switch to the Google Mini. We had heard good feedback from some other sites that had implemented it and decided to go for it. The cost for us was only $3,000 for the Google Mini device, the $200-a-month co-location fee we pay, and some development time.

In the end, the Google Mini just didn’t give us consistent results for how we were using search. We were able to implement a pretty cool solution that let you refine results by price and category, something I hadn’t seen anywhere else. But not being able to get a result for something as simple as an item number was just too frustrating. I tried a lot of different techniques with the Mini to get the results to come back properly, but at the end of the day, it just didn’t cut it. I’m not sure whether its dependence on spidering the site was what gave us issues, but we had to start looking at other solutions.

My boss sent me an article that compared three different search solutions for readers. After reading it, we decided to give Nextopia a shot. SLI was one of the other choices, and as much as I liked their service, we still couldn’t afford it. The reasoning for trying Nextopia was that it provided features similar to SLI’s at a way more appealing price. You can get search for as little as $995 a year. We’re paying a bit more than that for our three sites that get decent traffic, but it’s still affordable for our small company.

Nextopia provides you with a search page and template, but it didn’t fit the look and feel of our site. Since I wasn’t happy with the code they wrote for it either, it was more work to modify their version than to roll my own. Luckily, the code we had for our SLI and Google Mini implementations was written well enough that I was able to abstract it even further (it took a fair bit of refactoring) and roll a Nextopia version of our pre-existing search pages. What’s nice is that we now have a C# library with three search integrations and the ability to roll new ones any time we want.
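To give you an idea of the shape of that library, here’s a hypothetical version of the abstraction. The names here are mine for illustration, not our actual code:

    using System.Collections.Generic;

    // Common contract that each engine (SLI, Google Mini, Nextopia) implements.
    public interface ISearchProvider
    {
        SearchResults Search(string query, int page, int pageSize);
    }

    public class SearchResults
    {
        public int TotalCount;
        public List<SearchHit> Hits = new List<SearchHit>();
    }

    public class SearchHit
    {
        public string ItemNumber;
        public string Title;
        public string Url;
    }

    // The Nextopia implementation slots in beside the other two. The search
    // pages only ever talk to ISearchProvider, so swapping engines is cheap.
    public class NextopiaSearchProvider : ISearchProvider
    {
        public SearchResults Search(string query, int page, int pageSize)
        {
            // Call Nextopia's hosted service and map its response here (omitted).
            throw new System.NotImplementedException();
        }
    }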

The integration process took about a day’s worth of development and testing, which isn’t bad for integrating a 3rd-party search solution. Granted, the refactoring of our search library took more time, but the actual Nextopia integration took a day. We’re really happy with the results it provides and how quickly it provides them. This definitely gives our users the robust search they need to find our products reliably and quickly. I’d say the only downside is that I can’t automatically send Nextopia a feed of our products; I have to update it manually on a regular basis, which is kind of annoying but not the end of the world. Anyway, if you’re looking for an e-commerce search solution, I highly recommend Nextopia.

Happy New Year Wishes For Everyone

Happy New Year wishes to all my friends and family out there. Same to the few people who actually read my blog. Thanks for coming by. The posts were a little few and far between in 2008. Here’s hoping that 2009 brings me a little more time to blog about things I’ve learned and things that I’m interested in. Stay tuned and I’ll see everyone in 2009!

Does Teixeira Equal A Championship For the Yankees?

So I guess it’s official that the Mark Teixeira sweepstakes are over. The NY Yankees are getting him for 8 years and $180 million, giving them the four highest-paid players in all of baseball. Rumors had Teixeira landing with my Boston Red Sox, but you can never count the Yankees out of any deal that lets them spend a ton of money. Anyway, congrats, I guess, to all you Yankees fans out there. But remember this: the Sox still have more championships this decade than you do, and just ’cause you’ve spent all this money doesn’t mean you can beat out the Sox, or even the Tampa Bay Rays, for the AL East.

Oh yeah, I almost forgot. Do you think the Yankees will ever learn that they can’t just buy a championship? I’m not sure they ever will.