Nokia and TXT Spam

Last year I bought my 4th Nokia phone in a row, an N97 on contract from Optus. What a mistake that was. The phone would drop every second call, and the user experience was less than I expected from Nokia. Telstra allows customers in the bush to test drive a handset for a few days to make sure it works where they need it; Optus, on the other hand, will sell you the handset but offers a "Coverage Satisfaction Guarantee". After about 2 weeks I bailed out of the contract with Optus through the CSG, returning the handset to Optus and eventually not having to make any repayments on it.

When I first turned on the phone it gave me the option of activating My Nokia tips and special offers, or something like that. I thought I would turn it on and see what tips and offers I could get. It turned out the tips weren't very useful and there were no offers, let alone anything I would call special. When I returned the phone I completely forgot about the My Nokia txts. I was to discover Nokia hadn't forgotten about me: about twice a month I continue to receive messages from My Nokia.

Back in September I got fed up with receiving the text messages, which contained no option for opting out, so I filed a complaint with ACMA. Six weeks later I was advised by ACMA that they had contacted Nokia on my behalf and asked them to unsubscribe me. I assumed this would be the end of the matter. The messages continued, so I contacted ACMA again. Two weeks later I was told again that Nokia had been told by ACMA to unsubscribe me. The following day another text arrived. In early January I received the following response from ACMA.

Thank-you for your email, I have tried to unsubscribe you from receiving messages from My Nokia.

I received the following from a Nokia Email Support Executive on the 16 December 2009.

Thank you for contacting Nokia Careline. I have searched our records using the phone numbers that you have provided and I find that all of them are not present in our system which means that they have not contacted us even once. We will not be able to unsubscribe an account without the direct consent from the owner of the account. If the customer is having difficulty to unsubscribe from the service, they should contact us first so that we can assist them. There are several ways to unsubscribe from the My Nokia service.

These are as follows:

1. Using a PC, Login to My Nokia and click on Edit my Details

2. Using the phone, open the My Nokia icon and select "Unsubscribe"

3. Click the link at the bottom of the email message sent from My Nokia

Should you have any questions regarding our product or if we can be of any assistance, please feel free to contact one of our friendly Technical Support Executives on 1-300 366 733 between the hours of 8am and 8pm, AEST, seven days a week. For online assistance, please visit ‘ASK Nokia’ at our website

Have you attempted the above to unsubscribe from this service?

As I explained to ACMA, I have never installed any Nokia software on my PC, so option 1 is out. As I no longer have the handset, option 2 is out. It is a txt, not an email, so option 3 wouldn't work either. So I decided to call Nokia.

I called Nokia on the number listed above. After 15 minutes or so on hold, I got to speak to someone in a call centre on the subcontinent. The line was appalling; to make matters worse, the guy I was dealing with seemed to be the work experience kid on his first day. I would talk to him for a minute or 2, then be put on hold for 5 minutes or more while he put me "on hold for a minute while I check something". I don't think there was a sentence I didn't have to repeat. In the end he terminated the call when I lost it after being asked to spell "Nokia" for him a 3rd time. Almost an hour of my time wasted.

I called Nokia back. This time I was kept on hold for around 20 minutes. As soon as the call was answered I demanded to speak to a supervisor. After further time on hold, I got to speak to a supervisor. First he tried to tell me the messages were coming from Optus, not Nokia, and that I needed to contact them. Next I was told to use the My Nokia menu option, which I explained I didn't have. Finally he suggested that he could log in to the My Nokia website and unsubscribe me - finally I was getting somewhere! Then I was asked for my password. I explained I didn't have one: "that's OK sir, you can go to the website and sign up for one". It was clear, after almost another hour lost, that this was going nowhere, so I cut my losses.

After getting off the phone I looked at how much information Nokia wanted so I could sign up for My Nokia. There was no way I was going to give any company that much information just to stop them spamming me - they have the identity theft jackpot questions all there.

On Wednesday I phoned the person at ACMA who was handling my complaint. They claim Nokia is complying with the letter of the law, as these are not unsolicited commercial messages, but rather factual service messages from a company I have a relationship with. Apparently you can spam people in Australia if the messages are factual. As these are factual messages, Nokia isn't even required to offer an opt out. Although such actions may be legal, I don't think they are a good way to build customer loyalty and confidence in a brand.

I'm not happy with this situation. Based on some quick math, I have spent 4 to 5 hours chasing this, which is time I wasn't billing clients. This means I am pretty much down the cost of a new phone outright. As things stand now, I don't feel like recommending Nokia to family, friends or clients; instead I am more likely to tell this story and discuss the lack of customer service. I am now also very unlikely to buy the N900 I have been admiring on Amazon, let alone attend the Forum Nokia Developer Conference 2010. Instead I am likely to import a Nexus One or some other open phone.

I hope someone reading this works for Nokia or has a contact there who can resolve this. If anything happens I will post an update.

Below are some of the pearls of wisdom I've received from Nokia:

Tip: Automatically adding location information to your pictures means you'll never forget a place. In camera mode, select Options > Settings > Show GPS info

Tip: Find out if a surface is flat by using your device as a spirit level. Download the free Level Touch app by visiting

Tip: Share your favourite places with Nokia Maps. When viewing a map, select a saved place, press Send, and then choose your preferred sending method.

Tip: Use the self-timer to make sure you don't get left out of the next family portrait. In camera mode select the Capture settings icon and select Self Timer.

Tip: Listen to music in stereo sound and manage your calls with the Nokia Stereo Headset WH-500. Visit [...]

Update 15-Feb-2010 @ 13:00AEDT I emailed Tracy Postill, Corporate Communications Manager at Nokia Australia, a link to my post. She raised the issue with Nokia Care, who called me on Friday evening and told me that they had tried some things, but it would take 2 weeks or so before they knew whether it had worked. I sent a follow up email to Tracy asking why it was so difficult to unsubscribe from My Nokia. I am still waiting on a response from Tracy.

Packaging Doctrine for Debian and Ubuntu

I have been indoctrinated into the "everything on production machines should be packaged" school of thought. Rather than bang on about that, I intend to keep this post relatively short and announce that I have created Debian (and Ubuntu) packages for Doctrine, the ORM for PHP.

The packaging is rather basic. It gets installed just like any other Debianised PEAR package: the files go in /usr/share/php, the package.xml and any documentation go into /usr/share/doc/<package>, and the tests are stored as examples in /usr/share/doc/<package>/examples. The generated package will be called php-doctrine_1.2.1-1_all.deb (or similar), to comply with the Debian convention of naming all PEAR packages php-<pear-package-name>_<version>_<architecture>.deb. I have only packaged 1.2.1, but the files can easily be adapted for other versions; some of the packaging is designed to be version agnostic anyway.
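The naming convention can be sketched as a tiny shell function; pear_deb_name is an illustrative helper of mine, not part of any packaging tool:

```shell
#!/bin/sh
# Sketch: compose a Debianised PEAR package filename from its parts,
# following the php-<pear-package-name>_<version>_<architecture>.deb
# convention. pear_deb_name is a made-up helper for illustration.
pear_deb_name() {
    pkg="$1"      # PEAR package name, lower cased
    version="$2"  # upstream version plus Debian revision, e.g. 1.2.1-1
    arch="$3"     # "all" for architecture independent PHP code
    echo "php-${pkg}_${version}_${arch}.deb"
}

pear_deb_name doctrine 1.2.1-1 all  # -> php-doctrine_1.2.1-1_all.deb
```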

To create your own Doctrine deb, follow these steps:

  • Create a directory, such as ~/packaging/php-doctrine-1.2.1
  • Change into the new directory
  • Download my debian/ tarball and extract it in your php-doctrine-1.2.1 directory
  • Download the PEAR package tarball from the project website and extract it in your php-doctrine-1.2.1 directory
  • If you don't already have a standard Debian build environment setup, set one up by running sudo apt-get install build-essential
  • To build the package run dpkg-buildpackage -k<your-gpg-key-id> -rfakeroot. If you don't have a GPG key, drop the "-k<your-gpg-key-id>" from the command
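The steps above can be sketched as a single script. The tarball URLs below are placeholders (I don't want to hard code locations that may move), so the download and build steps are commented out - substitute the real locations before running them:

```shell
#!/bin/sh
# Sketch of the Doctrine deb build steps. The wget URLs are
# placeholders (assumptions), so those steps and the build itself
# are commented out - fill in the real locations first.
set -e

PKGDIR="$HOME/packaging/php-doctrine-1.2.1"
mkdir -p "$PKGDIR"
cd "$PKGDIR"

# Fetch and unpack the debian/ tarball and the upstream PEAR tarball:
# wget -O - http://example.com/php-doctrine-debian.tar.gz | tar xz
# wget -O - http://example.com/Doctrine-1.2.1.tgz | tar xz

# Install a standard build environment if you don't have one:
# sudo apt-get install build-essential

# Build the package, signed if you have a GPG key:
# dpkg-buildpackage -k<your-gpg-key-id> -rfakeroot
# ...or unsigned:
# dpkg-buildpackage -rfakeroot

echo "build directory ready: $PKGDIR"
```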

Now you should have a shiny new Doctrine deb. I think the best way to deploy it is using apt and a private package repository.

Update: @micahg pointed me to a Doctrine ITP for Debian. Hopefully Federico's work will mean I no longer need to maintain my own packaging of Doctrine.

Howto Setup a Private Package Repository with reprepro and nginx

As the number of servers I am responsible for grows, I have been trying to eliminate all non packaged software in production. Although Ubuntu and Debian have massive software repositories, there are some things which just aren't available yet or are internal meta packages. Once the packages are built they need to be deployed to servers. The simplest way to do this is to run a private apt repository. There are a few options for building an apt repository, but the most popular and simplest seems to be reprepro. I used Sander Marechal and Lionel Porcheron's reprepro howtos as a basis for getting my repository up and running.

nginx is a lightweight http server (and reverse proxy). It performs very well serving static files, which is perfect for a package repository. I also wanted to minimise the memory footprint of the server, which made nginx appealing.

To install the packages we need, run the following command:

$ sudo apt-get install reprepro nginx 

Then it is time to configure reprepro. First we create our directory structure:

$ sudo mkdir -p /srv/reprepro/ubuntu/{conf,dists,incoming,indices,logs,pool,project,tmp}
$ cd /srv/reprepro/ubuntu/
$ sudo chown -R `whoami` . # changes the repository owner to the current user

Now we need to create some configuration files in the conf/ directory we just made. First conf/distributions, which describes the repository:


Origin: Your Name
Label: Your repository name
Codename: karmic
Architectures: i386 amd64 source
Components: main
Description: Description of repository you are creating


Then conf/options:


basedir .

If you have a package ready to load, change into /srv/reprepro/ubuntu and add it using the following command:

$ reprepro includedeb karmic /path/to/my-package_0.1-1.deb
# change /path/to/my-package_0.1-1.deb to the path to your package

Once reprepro is set up and you have some packages loaded, you need to make it so you can serve the files over http. I run an internal dns zone called "internal", and so the package server will be configured to respond to packages.internal. You may need to change the server_name value to match your own environment. Create a file called /etc/nginx/sites-available/packages.internal.conf with the following content:

server {
  listen 80;
  server_name packages.internal;

  access_log /var/log/nginx/packages-access.log;
  error_log /var/log/nginx/packages-error.log;

  location / {
    root /srv/reprepro;
    index index.html;
  }

  location ~ /(.*)/conf {
    deny all;
  }

  location ~ /(.*)/db {
    deny all;
  }
}
Next we need to increase the server_names_hash_bucket_size. Create a file in /etc/nginx/conf.d/ (for example server_names_hash_bucket_size.conf - the name just needs to end in .conf) which should just contain the following line:

server_names_hash_bucket_size 64;

Note: Many sites advocate sticking this value in the http section of the /etc/nginx/nginx.conf config file, but in Debian and Ubuntu everything in /etc/nginx/conf.d/ is already included in the http section. I think my method is cleaner for upgrading and clearly delineates the stock and custom configuration.

To enable and activate the new virtual host run the following commands:

$ cd /etc/nginx/sites-enabled
$ sudo ln -s ../sites-available/packages.internal.conf .
$ sudo service nginx reload

You should get some output that looks like this:

Reloading nginx configuration: the configuration file /etc/nginx/nginx.conf syntax is ok
configuration file /etc/nginx/nginx.conf test is successful

Now you can add the new repository to your machines. I recommend creating a file called /etc/apt/sources.list.d/packages.internal.list (any name ending in .list will do) and putting the following line in the file:

deb http://packages.internal/ubuntu karmic main

To make the machine aware of the new repository and associated packages, simply run:

$ sudo apt-get update

That's it. Now you have a lightweight package repository with a lightweight webserver - perfect for running in a virtual machine. Depending on your setup you could probably get away with using 256MB of RAM and a few GB of disk.

Packaging Drush and Dependencies for Debian

Lately I have been trying to avoid non packaged software being installed on production servers. The main reason for this is to make it easier to apply updates. It also makes it easier to deploy new servers with meta packages when everything is pre packaged.

One tool which I am using a lot on production servers is Drupal's command line tool, drush. Drush is awesome; it makes managing Drupal sites so much easier, especially when it comes to applying updates. Drush is packaged for Debian testing, unstable and lenny backports by Antoine Beaupré (aka anarcat), and will be available in universe for Ubuntu lucid. Drush depends on PEAR's Console_Table module and includes some code which automagically installs the dependency from PEAR CVS. The Debianised package includes the PEAR class in the package, which is handy, but if you are building your own debs from CVS or the nightly tarballs, the dependency isn't included. The auto installer only works if it can write to /path/to/drush/includes, which in these cases means calling drush as root; otherwise it spews a few errors about not being able to write the file and then dies.

A more packaging friendly approach would be to build a Debian package for PEAR's Console_Table and have that as a dependency of the drush package in Debian. The problem with this approach is that drush currently only looks in /path/to/drush/includes for the PEAR class. I have submitted a patch which first checks if Console_Table has been installed via the PEAR installer (or another package management tool). Combine this with the Debian source package I have created for Console_Table (see the file attached at the bottom of the post), and you can have a modular, apt managed instance of drush without having to duplicate code.

I have discussed this approach with anarcat, he is supportive and hopefully it will be the approach adopted for drush 3.0.

Update The drush patch has been committed and should be included in 3.0alpha2.

Upcoming Book Reviews

Packt Publishing seem to have liked my review of Drupal 6 JavaScript and jQuery, so much so that they have asked me to review another title. On my return from LCA and Drupal South in New Zealand, a copy of the second edition of AJAX and PHP was waiting for me at the post office. I'll be reading and reviewing the book during February.

I will cover LCA and Drupal South in other blog posts once I have some time to sit down and reflect on the events. For now I will just gloat about winning a spot prize at Drupal South. I walked away with Emma Jane Hogbin and Konstantin Käfer's book, Front End Drupal. I've wanted to buy this title for a while, but shipping from the US made it a bit too pricey even with the strong Australian Dollar. I hope to start reading it in a few weeks, with a review to follow shortly after.

Got a book for me to review? I only read books in dead tree format as I mostly read when I want to get away from the screen. Feel free to contact me to discuss it further.

Ads don't Belong on your Business Site

Back in the late 90s there was a range of free website hosting options - geocities, angelfire and tripod are the big 3 I remember straight off the top of my head. The business model was pretty simple: you got a free site, albeit with a pretty crappy url, and the host got to inject ads into the page. The first site I ever built was hosted by tripod and is still up; I have forgotten the login details, so it hasn't been updated for 11 years.

Of the 3 stars of this business model, angelfire and tripod are still offering an ad supported version along with ad free, fee for service upgrades, but geocities is dead. Today the business model has evolved: you can get a free but ad supported blog, email service (see Gmail, Yahoo or Microsoft) or project hosting (see sourceforge, xp-dev or CodePlex), along with many other online services. For personal stuff I think this is fine, and the same goes for small not for profit organisations. On the other hand, if you run a business and want to appear professional, profitable and "up with technology", then you don't want your email address or website sitting on a free ad supported service. It could be worse: you could be using an email address or hosting supplied by your ISP. In the case of email, you can use Google Apps for domains and still look professional.

It is different if you are solely providing free (as in beer) content, such as video, news or a professional blog. This is a clear business model - funding free content via advertising has been used by print news, radio and television for decades. I also think it is fine for community based free/open source software projects to use ads to get some additional revenue. It is different if you are a profit making business.

Not only does running ads on your site look unprofessional, you could be promoting the competition. Google targets its ads based on the content of the page. For example, if you are a small shop and you have a page listing the types of products you offer, Google is likely to serve up ads on that page for those products. Do you really want an ad from a competitor showing up with "Cheap [item], free next day delivery"? How many high volume paying customers will you lose for that extra few dollars a month in AdSense revenue?

I find it more shocking on large corporate sites. Yes, they attract a lot of eyeballs, but I pay my phone company enough money, and they make large enough profits, that I shouldn't have to be subjected to ads on their corporate home page. It makes them look cheap. The same goes for smaller businesses.

In the case of a business blog, it should be part of your business website. If people find your blog and they like what they see, they are likely to click around your site to find out more about you. If you have your blog on blogger they are only likely to find other blog posts, and if you have a link to your business website, they will probably stop clicking once they hit your business site as it is completely different from your blog. If on the other hand it's all nicely integrated, your readers are able to move from your blog posts to your business content seamlessly - and so are more likely to become a customer, rather than another bounced visitor.

What are your options? Many hosts offer one click installers for setting up Drupal, WordPress or other content management systems. With a bit of help from an online tutorial or 2, you should be able to get Drupal up and running with a basic site and a theme from contrib. Sure, it will look a bit cheap, but no worse than something on Blogger. If you were to host it with Dreamhost it is going to cost you around 120USD/130AUD for one year. Even if you have to pay someone to help you set up your CMS site, it will probably cost you less than 500USD in the first year for a basic setup. A basic setup will allow your business to project a professional image to the world. Add a professionally designed custom theme and site build for another 1850USD/2000AUD or so and you are set for a few years. Of course you will spend more if you want someone to help with designing your information architecture, help with SEO, produce or proof content, or suggest images etc. The investment is likely to pay for itself over that time in increased sales.

Updating all of your Drupal Sites at Once - aka Lazy Person's Aegir

Aegir is an excellent way to manage multi site Drupal instances, but sometimes it can be a bit too heavy. For example, if you have a handful of sites it can be overkill to deploy Aegir. If there is an urgent security fix and you have a lot of sites (I am talking 100s if not 1000s) to patch, waiting for Aegir to migrate and verify all of your sites can be a little too slow.

For these situations I have a little script which I use to do the heavy lifting. I keep it in ~/bin/update-all-sites and it has a single purpose: to update all of my Drupal instances with a single command. Just like Aegir, my script leverages drush, but unlike Aegir there is no parachute, so if something breaks during the upgrade you get to keep all of the pieces. If you use this script, I would recommend always backing up all of your databases first - just in case.

I keep my "platforms" in svn, so before running the script I run a svn switch or svn update depending on how major the update is. If you are using git or bzr, you would do something similar first. If you aren't using any form of version control - I feel sorry for your clients.

So here is the code, it should be pretty self explanatory - if not ask questions via the comments.

#!/bin/sh
# Update all drupal sites at once using drush - aka lazy person's aegir
#
# Written by Dave Hall
# Copyright (c) 2009 Dave Hall Consulting
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
# Alternatively you may use and/or distribute it under the terms
# of the CC-BY-SA license

# Change this to point to your instance of drush if drush isn't in your path
DRUSH_CMD="drush"

if [ $# != 1 ]; then
    SCRIPT="`basename $0`"
    echo "Usage: $SCRIPT path-to-drupal-install"
    exit 1;
fi

SITES_PATH="$1"

# Remember where we started. Don't use PWD for this - the shell
# updates $PWD on every cd, so "cd $PWD" at the end would be a no-op.
ORIG_PWD=$(pwd)

cd "$SITES_PATH/sites";
for site in `find ./ -maxdepth 1 -type d | cut -d/ -f2 | egrep -v '(.git|.bzr|.svn|all|^$)'`; do
    if [ -f "${site}/settings.php" ]; then
        echo updating $site
        $DRUSH_CMD updatedb -y -l $site
    fi
done

# Lets go back to where we started
cd "$ORIG_PWD"

OK, so my script isn't anywhere near as awesome as Aegir, but if you are lazy (or in a hurry) it can come in handy. Most of the time you will probably still want to use Aegir.


Make sure you make the script executable (hint: run chmod +x /path/to/update-all-sites)

If you don't have drush in your path, I would recommend you add it, but if you can't then change DRUSH_CMD="drush" to point to your instance of drush - such as DRUSH_CMD="/opt/drush/drush".

Thanks to Peter Lieverdink (aka cafuego) for suggesting the improved regex.
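To see what the find/egrep pipeline from the script actually selects, here is a throwaway dry run against a scratch sites/ tree - the directory names are invented for illustration, and it only echoes rather than calling drush:

```shell
#!/bin/sh
# Dry run of the site discovery pipeline from update-all-sites against
# a scratch tree - no drush involved, it only prints what it would do.
tmp=$(mktemp -d)
mkdir -p "$tmp/sites/all" "$tmp/sites/.svn" "$tmp/sites/default" "$tmp/sites/example.com"
touch "$tmp/sites/default/settings.php" "$tmp/sites/example.com/settings.php"

cd "$tmp/sites"
found=""
for site in `find ./ -maxdepth 1 -type d | cut -d/ -f2 | egrep -v '(.git|.bzr|.svn|all|^$)' | sort`; do
    if [ -f "${site}/settings.php" ]; then
        echo "would update $site"
        found="$found $site"
    fi
done
# all, .svn and the bare top level entry are filtered out, leaving
# default and example.com.

cd /
rm -rf "$tmp"
```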

DRBD on Ubuntu Karmic

Ubuntu 9.10 (aka karmic koala) has a frustrating packaging bug. Even though the stock server kernel includes the DRBD module, the drbd8-utils package depends on drbd8-source. drbd8-source uses DKMS to build the drbd module to match the installed kernel/s. As I stated in the bug report (lp:474660), I "really don't like having build-essential installed on production net facing servers, and where possible any production servers".

Aside from personal opinion on whether the module should be bundled or not, the fact is that it is bundled, and so there is no need for the dependency on drbd8-source. As a workaround I have added a meta package to provide drbd8-source, so I don't need to install build-essential and build the module every time a new kernel is installed.

After a quick test it is working well. Here is the control file I used to make it all happen.

Package: dhc-drbd8-source-hack
Version: 0.1
Section: meta
Priority: optional
Architecture: all
Provides: drbd8-source
Maintainer: Dave Hall <>
Description: Package to hack around drbd8-source dependency for drbd8-utils

If you are unsure how to use the control file above, see my recent blog post on building meta packages for ubuntu and debian.
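As a convenience, here is a sketch that writes the control file above to disk, ready to build with equivs-build from the equivs package (one common way to turn a control file into a deb); the temp directory handling is just illustrative:

```shell
#!/bin/sh
# Sketch: save the control file above, ready for equivs-build.
# Building needs the equivs package installed, so that step is
# commented out here.
set -e
dir=$(mktemp -d)
cat > "$dir/control" <<'EOF'
Package: dhc-drbd8-source-hack
Version: 0.1
Section: meta
Priority: optional
Architecture: all
Provides: drbd8-source
Maintainer: Dave Hall <>
Description: Package to hack around drbd8-source dependency for drbd8-utils
EOF

# Build the deb (requires: sudo apt-get install equivs):
# equivs-build "$dir/control"

grep '^Provides:' "$dir/control"  # prints: Provides: drbd8-source
```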

Setting up a private package repository is outside the scope of this post. If you want to set one up, I would recommend Sander Marechal's slightly dated howto - Setting up and managing an APT repository with reprepro. With a few changes I found it worked well.

If you have your own repository running, you can simply run:

sudo apt-get install dhc-drbd8-source-hack drbd8-utils

If you don't, you can run the following commands:

sudo dpkg -i /path/to/dhc-drbd8-source-hack*.deb && sudo apt-get install drbd8-utils

Either way you should now have drbd8-utils installed on ubuntu karmic without having to install the redundant drbd8-source package.

To take it a step further, you could build a meta package to install both drbd8 packages and allow you to have a potentially smoother upgrade to lucid. The meta package would contain the following line:

Depends: dhc-drbd8-source-hack, drbd8-utils

This is similar to what I now have in my HA server meta package.

<?php print t('hello world'); ?>

My blog is now syndicated on Planet Drupal. I am very excited about this - thanks Simon.

For the last 8 years or so I have been running my own IT consulting business, focusing on free/open source software and web application development. My clients have ranged from micro businesses up to well known geek brands like SGI. Until recently I led the phpGroupWare project.

My Drupal profile doesn't really give much of a hint about my involvement with Drupal. My biggest regret is not signing up for a d.o account sooner. I forget when I started using Drupal 4.7, but I liked it straight away. It was the first CMS which worked the way I thought a CMS should work.

Over time I have learned how to get Drupal to do what I want it to do. Due to the massive range of contrib modules I haven't got my hands very dirty hacking on Drupal - yet.

This year I have been involved in a major Drupal project which involves hosting around 2100 sites. Aegir has made a lot of this painless, especially with our 3,000 line install profile. Over the Christmas period I hope to find the time to blog about the setup, parts of it are pretty crazy.

I'll get around to upgrading my site to Drupal 6 one of these days when I get some time, that should coincide with a visual and content refresh. Feel free to check out some of my older Drupal related posts.

Drupal 6 JavaScript and jQuery

I have just finished reading Matt Butcher's latest book, Drupal 6 JavaScript and jQuery, published by Packt Publishing - ISBN 978-1-847196-16-3. It is a good read. It is one of those books that arrived at the right time and left me inspired.

I have always leaned towards Yahoo's YUI toolkit when I need an Ajax framework, while the rest of the time I just bash out a bit of JS to get the job done. The more I use Drupal, the more I have been wanting to find time to get into jQuery. This book has got me motivated to play with jQuery - especially in combination with Drupal.

The book is logically structured and flows well from chapter to chapter. I find Matt's writing style easy to read, he even brought a smile to my face a few times. Matt assumes a basic knowledge of JS and Drupal, but he also provides links so the reader is able to get additional information if their knowledge is lacking. However, a couple of times Matt seemed to switch quite abruptly from assuming a good level of knowledge on a particular topic to explaining what seemed to me to be basic or simple concepts in great detail.

In the first chapter, entitled Drupal and JavaScript, Matt covers the basics of Drupal, its relationship with JavaScript and recommends some essential items for any serious Drupal developer's toolbox. This chapter provides a nice introduction of what is to come in the rest of the book and allows the reader to become acquainted with Matt's style.

Working with JavaScript in Drupal covers the basics of the Drupal coding standards and why sticking to the standard is important. It then moves on to a quick overview of Drupal's theme engine, PHPTemplate, and integrating JS with Drupal themes. I felt that the development practices part of this chapter could have been expanded a bit more and turned into its own chapter. Understanding the basics of theming is critical for being able to follow the rest of the book, and again I think this half of the chapter could have been developed into a separate chapter. Regardless of how the chapter was arranged, the content is well written and provides solid and practical examples.

In jQuery: Do More with Drupal, Matt gives a detailed overview of jQuery and how it is used in Drupal. Although the code sample has limited real world usefulness, it provides the reader with a very clear idea of the power of jQuery and how easy it is to use with Drupal. By the end of this chapter I was left feeling like I wanted to get my hands dirty with jQuery, unfortunately it was after 1am and I had to work the next day.

In Chapter 4, we move onto Drupal's Behaviors, which is covered in great detail. Behaviors are a key part of Drupal's JS implementation and essentially provide an events based hooks system in JavaScript. Once again Matt spends a lot of time explaining this feature, how it works, how to use it and where to learn more. Matt's description of this feature had me thinking "OMG, Drupal behaviours are awesome" throughout the chapter.

Lost in Translation is the name of a good movie starring Scarlett Johansson and Bill Murray, which I enjoyed watching a few years ago - oh, and Lost in Translations is also the fifth chapter of the book. I suspect that I am like many English speaking Drupal developers in that I use the basics of the Drupal translation engine, but pay very little attention to how it works as my target audience is English speaking like me. Not only does Matt explain how Drupal's translation system works in both PHP and JavaScript, he makes it clear why all Drupal developers should understand and use the system - regardless of their native/target language/s.

The JavaScript Theming chapter was a bit of a surprise for me. I was expecting Drupal to have a JS equivalent to PHPTemplate, and for this chapter to outline it and provide some code samples. Instead I learned that Drupal has a very simple, and easy to use, JS theming system. Matt spends some time discussing best practice for theming content in JS and goes on to provide the code for his own simple yet powerful jQuery based theming engine for Drupal.

In AJAX and Drupal Web Services, we learn about JSON, XML and XHR in the context of Drupal. Once again Matt demonstrates the ease of using Drupal and jQuery for quickly building powerful functionality.

Chapter 8 is entitled, Building a Module, and covers the basics of building a JS enabled module for Drupal. Matt also discusses when JS belongs in a theme and when it should be part of a module. The cross promotion of his other book Learning Drupal 6 Module Development ramps up a couple of notches in this chapter. I found the plugs a bit irritating (especially as I own a copy of the book), but overall the chapter is loaded with useful information.

The final chapter, Integrating and Extending, leaves the reader with a solid understanding of what can be done to make jQuery even more useful. This chapter provides a nice motivational finish to the book.

At the start of each chapter Matt recaps what has been covered and outlines where the chapter is heading which makes it easy to get back into the book after putting it down for a few days.

This book is definitely not for the copy and paste coder, nor the developer who just wants ready made solutions they can quickly hack into an existing project. Some may disagree, but I think this is a real positive of this book. Matt uses the examples to illustrate certain concepts or features which he wants the reader to understand. I found the examples got me thinking about what I wanted to use JS and jQuery for in my Drupal sites. Although some of the code samples run to several pages, Matt then spends a lot of time explaining what is happening in bite sized chunks, which makes it easy to understand. I also appreciated the links to documentation so I could get the information I'd need to write my own code for my projects.

One thing which always annoys me about Packt books is the glossy ink they use. In some lighting conditions it is too shiny, which makes it annoying to read, especially with a bed side lamp. On the positive side, the paper is solid and easy to turn.

Sprinkled through the book is some cross promotion of other Packt titles, which I have no issue with, it is a good opportunity to try to grab some additional sales. In a couple of the later chapters it becomes a bit too much. I think once or twice per chapter is reasonable.

I really enjoyed reading Drupal 6 JavaScript and jQuery. It is easy to read, and the chapters are a size which lends itself to being read in a session. I think any Drupal developer who wants to get into using JS in their sites/projects would benefit from reading this book. I finished it feeling like I wanted to start doing some hacking. I plan to update this site in the next few months, and now jQuery enabled effects are on the requirements list. I hope I can bump into Matt Butcher at a DrupalCon or somewhere else in my travels so I can buy him a beer to thank him for putting together a quality book.

Disclaimer: Packt Publishing gave me a dead tree copy of this book to review and keep. I'm glad they gave me a good title to review.