
$100 Drupal Site Series: Part 1 - Is it Possible?

In late October Gdzine posed the question "$100 CMS web site feasible? What do you think?" on LinkedIn, and the question was also posted on groups.drupal.org. These posts led to lengthy discussion threads. Some people accused Gdzine of trolling and others claimed it wasn't possible, but a few of us argued that a Drupal site could be built for $100.

Over the next week or so I'll be blogging about how I would go about delivering $100 Drupal sites. All prices are in United States dollars. I won't be providing a complete blueprint, but there should be enough information to help get you started.

I have some experience building large numbers of production sites using Drupal for a small price per site. In 2009 I built, deployed and managed 2086 sites for a European client. For most of this year I have been offering training and consulting around some of the tools and techniques this blog series will cover.

Is it Feasible?

Several companies already offer low cost Drupal site solutions. Acquia's Drupal Gardens, with its freemium model, is probably the best known cheap Drupal site building service. Wedful offers a Drupal based site for couples getting married for only 95USD for the first year, then 25USD every following year, including hosting. Spoon Media's Pagebuild service offers a customised Drupal platform for 30USD per month. I am sure there are others operating in the same space. wordpress.com offers a similar service using WordPress. I have no idea how financially viable these businesses are, but I think it is safe to assume that they've done some research and planning to get to this point.

These services all rely on making their money on turnover rather than margin. Most consultants, myself included, make our money by charging a good hourly rate, but we only get paid for the time we work. These services instead invest up front and then wait for the long tail revenue. For example, if you invest 50,000USD up front into building the service, you have to sell 500 sites just to break even.

Target Market

In order to make these services viable, you have to target a particular market segment. Customers for a $100 site are likely to be "mum and dad" businesses. They don't have a lot of money to spend and are unlikely to have a lot of knowledge about the web. A lot of these customers probably think "the internet" is that blue e on their desktop, or Facebook. I know of several small businesses who think that Yellow Pages advertising is not giving the return on investment they want, but who can't afford 1500-2000USD for a decent quality brochure site. These are the people this service should be seeking to attract.

Why Bother?

Most Drupal developers aspire to work for switched on clients who want high quality sites. Working with well known brands is always a bonus. No one is really going to be interested in hearing about how you built the site for "Joe and Jo's Diner". There is still a lot of problem solving involved in building a service like this, but the problems are very different to those found in large scale site builds. Also, many of the people seeking a $100 site are likely to be high needs clients who undervalue the skills involved in building a site.

What's Next?

All posts in this series will be tagged with "100 drupal site". In my next post I will cover what I think you need in terms of infrastructure and resources to make something like this work.

I have proposed a session for DrupalCon Chicago on this topic, please consider voting for it.

Kicking Javascript to the Footer in Drupal 8?

As a platform, Drupal has excellent javascript support. Drupal 7 will ship with jQuery 1.4.2 and jQuery UI 1.8, which will make it even easier to build rich user interactions with Drupal.

Drupal supports aggregating javascript files to reduce the number of network connections a browser must open to load a page. It is common practice for Drupal themes to put the <script> tag in the <head> section of the page. Unfortunately this has a performance impact, as all browsers will stop processing the page and start loading and processing the referenced javascript file. For this reason, both Yahoo! and Microsoft recommend placing all javascript just before the closing </body> tag in a page so it is loaded and processed after the content.

Making this change in Drupal is a pretty straightforward process, and it is already possible in Drupal 6 and 7. My site places the $scripts variable at the end of the page. Unfortunately some modules rely on javascript being in the <head>, and some even place <script>s in the body to allow inline function calls.
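As a rough sketch (variable names as per Drupal 6's default template; the surrounding markup is illustrative), the tail of such a page.tpl.php looks like this:

```php
  <?php print $content; ?>
  <?php // Moved from <head>: load and run javascript after the content. ?>
  <?php print $scripts; ?>
  <?php // $closure must remain the last thing before </body>. ?>
  <?php print $closure; ?>
  </body>
</html>
```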

It is too late to implement this change in Drupal 7, but the transition can start now. Documentation can be updated to tell theme developers that they can place the $scripts variable at the end of the page, just above the $closure variable. The module development guide can be updated to strongly recommend against relying on 'header' as the value of the 'scope' element of the $options array passed to drupal_add_js(), which puts the javascript in the header, and against placing any inline javascript code in themes or modules. In Drupal 8 the scope element of the $options array can then be dropped.

If theme and module developers adopt this best practice approach for their Drupal 7 releases there should be minimal transition work for this change in the version 8 release cycle.

I am hoping to discuss this at the Core Developers Summit at DrupalCon Copenhagen later this month.

Travelling, Speaking, Scaling and Aegiring

The next couple of months are going to be a crazy ride. I will be visiting at least 7 countries and speaking on 8 or more days in a 5 week period. The talks will be focused on Drupal and Aegir. My schedule is below.

Horizontally Scaling Drupal - Melbourne

On 7 August I'll be running a 1 day workshop around the theme of horizontally scaling Drupal. The content is built on the knowledge I developed building, deploying and managing around 2100 sites for a client. This event has very limited capacity and has almost sold out.

DrupalCon - Denmark

Denmark is hosting the European leg of DrupalCon this year. I will be attending the full conference. I won't be presenting, but I will be getting involved with some of the BoFs. I had a ball at DrupalCon San Francisco earlier in the year.

Efficiently Managing Many Drupal Sites - Slovakia

After spending a couple of days recovering from DrupalCon, I'll be teaming up with the crew at Sven Creative in Bratislava to run a 2 day intensive workshop on horizontally scaling Drupal and development workflows. For more information check out the workshop website.

Free Software Balkans - Albania

On the weekend of 11-12 September, the inaugural Free Software Balkans Conference will be held at the University of Vlore, Albania. I'll be there speaking about Drupal and Aegir. In addition to this I will be running half day build your first Drupal site workshops around the country. The dates and locations for the workshops are still being finalised.

OSI Days - India

On my way back to Australia I will be taking a side trip to Chennai, via Delhi, for OSI Days 2010, Asia's largest open source conference. I will be presenting sessions on Aegir and Drupal. This looks like it will be a huge event.

Other Events

I've launched a new site workshops.davehall.com.au to list my training and speaking engagements. As dates are locked in I'll be adding them to the site.

If you would like to meet with me while I'm on the road, add me to your tripit network, follow me on identi.ca or twitter or add me to your network on LinkedIn.

Multi Core Apache Solr on Ubuntu 10.04 for Drupal with Auto Provisioning

Apache Solr is an excellent full text search engine based on Lucene. Solr is increasingly being used in the Drupal community for search, and I use it on a lot of my projects. Recently Steve Edwards at Drupal Connect blogged about setting up a multi core Solr server on Ubuntu 9.10 (aka Karmic). Ubuntu 10.04LTS was released a couple of months ago and it makes the process a bit easier, as Apache Solr 1.4 has been packaged. An additional advantage of using 10.04LTS is that it is supported until April 2015, whereas support for 9.10 ends in 10 months - April 2011.

As an added bonus, by following this howto you will be able to auto provision Solr cores just by calling the right URL.

In this tutorial I will be using Jetty rather than Tomcat, which some tutorials recommend, as Jetty performs well and generally uses fewer resources.

Install Solr and Jetty

Installing Jetty and Solr requires a single command:

$ sudo apt-get install solr-jetty openjdk-6-jdk

This will pull down Solr and all of its dependencies, which can be a lot if you have a very stripped down base server.

Configuring Jetty

Configuring Jetty is very straightforward. First we back up the existing /etc/default/jetty file like so:

sudo cp -a /etc/default/jetty /etc/default/jetty.bak

Then simply change your /etc/default/jetty to match the following (the key changes are the NO_START, VERBOSE and JETTY_HOST lines):

# Defaults for jetty see /etc/init.d/jetty for more

# change to 0 to allow Jetty to start
NO_START=0
#NO_START=1

# change to 'no' or uncomment to use the default setting in /etc/default/rcS 
VERBOSE=yes

# Run Jetty as this user ID (default: jetty)
# Set this to an empty string to prevent Jetty from starting automatically
#JETTY_USER=jetty

# Listen to connections from this network host (leave empty to accept all connections)
#Uncomment to restrict access to localhost
#JETTY_HOST=$(uname -n)
JETTY_HOST=solr.example.com

# The network port used by Jetty
#JETTY_PORT=8080

# Timeout in seconds for the shutdown of all webapps
#JETTY_SHUTDOWN=30

# Additional arguments to pass to Jetty    
#JETTY_ARGS=

# Extra options to pass to the JVM         
#JAVA_OPTIONS="-Xmx256m -Djava.awt.headless=true"

# Home of Java installation.
#JAVA_HOME=

# The first existing directory is used for JAVA_HOME (if JAVA_HOME is not
# defined in /etc/default/jetty). Should contain a list of space separated directories.
#JDK_DIRS="/usr/lib/jvm/default-java /usr/lib/jvm/java-6-sun"

# Java compiler to use for translating JavaServer Pages (JSPs). You can use all
# compilers that are accepted by Ant's build.compiler property.
#JSP_COMPILER=jikes

# Jetty uses a directory to store temporary files like unpacked webapps
#JETTY_TMP=/var/cache/jetty

# Jetty uses a config file to setup its boot classpath
#JETTY_START_CONFIG=/etc/jetty/start.config

# Default for number of days to keep old log files in /var/log/jetty/
#LOGFILE_DAYS=14

If you don't include the JETTY_HOST entry, Jetty will only bind to the local loopback interface, which is all you need if your Drupal web server is running on the same machine. If you set JETTY_HOST, make sure you configure your firewall to restrict access to the Solr server.
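For example, iptables rules along these lines restrict the Solr port (a sketch only; 192.168.1.0/24 stands in for your web server network):

```
# Allow the web heads to reach Jetty/Solr, drop everyone else.
iptables -A INPUT -p tcp --dport 8080 -s 192.168.1.0/24 -j ACCEPT
iptables -A INPUT -p tcp --dport 8080 -j DROP
```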

Configuring Solr

I am assuming you have already installed the Apache Solr module for Drupal somewhere. If you haven't, do that now, as you will need some config files which ship with it.

First we enable the multicore support in Solr by creating a file called /usr/share/solr/solr.xml with the following contents:

<solr persistent="true" sharedLib="lib">
 <cores adminPath="/admin/cores" shareSchema="true" adminHandler="au.com.davehall.solr.plugins.SolrCoreAdminHandler">
 </cores>
</solr>

You need to make sure the file is owned by the jetty user if you want it to be dynamically updated; otherwise change persistent="true" to persistent="false", don't include the adminHandler attribute and don't run the commands below. Also, if you want to auto provision cores, you will need to download the jar file attached to this post and drop it into the /usr/share/solr/lib directory (which you'll need to create).

sudo chown jetty:jetty /usr/share/solr
sudo chown jetty:jetty /usr/share/solr/solr.xml
sudo chmod 640 /usr/share/solr/solr.xml
sudo mkdir /usr/share/solr/cores
sudo chown jetty:jetty /usr/share/solr/cores

To keep your configuration centralised, symlink the file from /usr/share/solr into /etc/solr. Don't do it the other way around; Solr will ignore the symlink.

sudo ln -s /usr/share/solr/solr.xml /etc/solr/

Solr needs to be configured for Drupal. First we back up the existing config files, just in case, like so:

sudo mv /etc/solr/conf/schema.xml /etc/solr/conf/schema.orig.xml
sudo mv /etc/solr/conf/solrconfig.xml /etc/solr/conf/solrconfig.orig.xml

Now we copy the Drupal Solr config files from where you installed the module:

sudo cp /path/to/drupal-install/sites/all/modules/contrib/apachesolr/{schema,solrconfig}.xml /etc/solr/conf/

Solr needs the path to exist for each core's data files, so we create them with the following commands:

sudo mkdir -p /var/lib/solr/cores/{,subdomain_}example_com/{data,conf}
sudo chown -R jetty:jetty /var/lib/solr/cores/{,subdomain_}example_com

Each of the cores needs its own configuration files. We could implement some hacks to use a common set of configuration files, but that would make life more difficult if we ever have to migrate some of the cores. Just copy the common configuration for all the cores:

sudo bash -c 'for core in /var/lib/solr/cores/*; do cp -a /etc/solr/conf/ $core/; done'

If everything is configured correctly, we should just be able to start Jetty like so:

sudo /etc/init.d/jetty start

If you visit http://solr.example.com:8080/solr/admin/cores?action=STATUS you should get some xml that looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
<response>
	<lst name="responseHeader">
		<int name="status">0</int>
		<int name="QTime">0</int>
	</lst>
	<lst name="status"/>
</response>

If you get the above output, everything is working properly.

If you enabled auto provisioning of Solr cores, you should now be able to create your first core. Point your browser at http://solr.example.com:8080/solr/admin/cores?action=CREATE&name=test1&i... If it works you should get output similar to the following:

<?xml version="1.0" encoding="UTF-8"?>
<response>
	<lst name="responseHeader">
		<int name="status">0</int>
		<int name="QTime">1561</int>
	</lst>
	<str name="core">test1</str>
	<str name="saved">/usr/share/solr/solr.xml</str>
</response>

I would recommend using identifiable names for your cores. For davehall.com.au I would call the core "davehall_com_au", so I can easily find it later on.
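Following that naming scheme, deriving the core name and provisioning it is easy to script. A small sketch (the curl call is commented out, and the host name is the example one from earlier in the post):

```shell
#!/bin/sh
# Turn a site domain into an identifiable Solr core name: dots -> underscores.
domain="davehall.com.au"
core=$(printf '%s' "$domain" | tr '.' '_')
echo "$core"   # prints davehall_com_au

# Then hit the auto provisioning handler (sketch; other CREATE
# parameters as per the URL earlier in the post):
# curl "http://solr.example.com:8080/solr/admin/cores?action=CREATE&name=${core}"
```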

Security Note: As anyone who can access your server can now provision Solr cores, make sure you restrict access to port 8080 to trusted IP addresses only.

For more information on the commands available, refer to the Solr Core Admin API documentation on the Solr wiki.

Next in this series will be how to use this auto provisioning setup to allow Aegir to provision Solr cores as sites are created.

Site Refresh

Our site hasn't changed very much over the last 4 years, but the business has changed a lot. The biggest change was the (uneventful and long overdue) upgrade to Drupal 6 a few months ago.

During the last week or so the site has been updated and refocused. The major changes include:

This also signals our return to regular blogging. There are a few posts in the pipeline. There should be a good mix of Drupal and sysadmin posts in the coming weeks.

As always, feedback is welcome.

eBook Review: Theming Drupal: A First Timer’s Guide

My experience theming Drupal, like most of my coding skills, has been developed by digging up useful resources online and some trial and error. I have an interest in graphic design, but never really studied it. I can turn out sites which look good, but my "designs" don't have the polish of a professionally designed site. I own quite a few (dead tree) books on development and project management. Generally I like to read when I am sick of sitting in front of a screen, so the only ebooks I consider reading are short ones.

Emma Jane Hogbin offered her Drupal theming ebook, Theming Drupal: A First Timer's Guide, to her mailing list subscribers for free. I am not a big fan of vendor mailing lists; most of the time I scan the messages and hit delete before reaching the bottom. In Emma's case, rumour has it that her list is really worthwhile - especially if you are a designer interested in theming Drupal. Emma also offered free copies of her ebook to those who begged, so I subscribed and I begged.

The first thing I noticed about the book was the ducks on the front cover; I'm a sucker for cute animal pics. The ebook is derived from Emma's training courses and the book she coauthored with Konstantin Kaefer, Front End Drupal. Readers are assumed to have some experience with HTML, CSS and PHP. The book is pitched at designers and programmers who want to get into building themes for Drupal.

The reader is walked through building a complete Drupal theme. The writing is detailed and includes loads of references for obtaining additional information. It covers building a page theme, content type specific theming and the various base themes available for Drupal. The book is a very useful resource for anyone working on a Drupal theme.

Although I have themed quite a few Drupal sites, Emma's guide taught me a few things. The book is a good read for anyone who wants to improve their knowledge of Drupal theming. Now to finish reading Front End Drupal ...

First Impressions Motorola Dext and Drupal Editor for Android

Today I purchased a Motorola Dext (aka Cliq) from Optus. Overall I like it. It feels more polished than the Nokia N97 which I bought last year. The range of apps is good. Even though the phone only ships with Android 1.6, 2.1 for the Dext is due in Q3 2010.

The apps seem to run nice and fast, and the responsive touch screen is bright and clear. I have yet to try to make a call on it from home, but the 3G data seems as fast as my Telstra 3G service, so the signal should be ok.

The keyboard is very functional, albeit cramped for my fat thumbs. The home screen is a little cluttered for my liking too, but it won't take much to clean that up. I will miss my Funambol sync, which is only available for Android 2.x.

I started writing this post using the Drupal Editor for Android app, which is pretty nice. The GPL app uses XML-RPC and Drupal core's Blog API module. Overall it feels like a stripped down version of Bilbo/Blogilo. Drupal Editor is an example of an app which does one thing and does it simply but well. The only thing I haven't liked about it happened when originally writing this post: I bumped the save button and published an incomplete and poorly written post. Next time I will untick the publish checkbox until I am really ready to publish.

I would still like a HTC Desire, but Telstra is only offering them on a $65 plan with no value. The Nokia N900 was off my list, due to the USB port of death and Nokia's spam policies. The Nexus One was on the list too, but a local warranty was a consideration.

Solr Replication, Load Balancing, haproxy and Drupal

I use Apache Solr for search on several projects, including a few using Drupal. Solr has built in support for replication and load balancing. Unfortunately the load balancing is done on the client side and works best with a persistent connection, which doesn't make a lot of sense for PHP based web apps. In the case of Drupal, there has been a long discussion on a patch in the issue queue to enable Solr's native load balancing, but things seem to have stalled.

In one instance I have Solr replicating from the master to a slave, with the plan to add additional slaves if the load justifies it. In order to get Drupal to write to the master and read from either node, I needed a proxy or load balancer. In my case the best lightweight HTTP load balancer that would easily run on the web heads was haproxy. I could have run Varnish in front of Solr and had it do the load balancing, but that seemed like overkill at this stage.

Now when an update request hits haproxy it is directed to the master, while read requests are balanced between the 2 nodes. To get this setup running on Ubuntu 9.10 with haproxy 1.3.18, I used the following /etc/haproxy/haproxy.cfg on each of the web heads:

global
    log 127.0.0.1   local0
    log 127.0.0.1   local1 notice
    maxconn 4096
    nbproc 4
    user haproxy
    group haproxy
    daemon

defaults
    log     global
    mode    http
    option  httplog
    option  dontlognull
    retries 3
    maxconn 2000
    balance roundrobin
    stats enable
    stats uri /haproxy?stats

frontend solr_lb
    bind localhost:8080
    acl master_methods method POST DELETE PUT
    use_backend master_backend if master_methods
    default_backend read_backends

backend master_backend
    server solr-a 192.168.201.161:8080 weight 1 maxconn 512 check

backend slave_backend
    server solr-b 192.168.201.162:8080 weight 1 maxconn 512 check

backend read_backends
    server solr-a 192.168.201.161:8080 weight 1 maxconn 512 check
    server solr-b 192.168.201.162:8080 weight 1 maxconn 512 check

To ensure the configuration is working properly, run the following on each of the web heads:

wget http://localhost:8080/solr -O -

If you get a connection refused message, haproxy may not be running. If you get a 503 error, make sure Solr (under Jetty or Tomcat) is running on the Solr nodes. If you get some html output which mentions Solr, then it should be working properly.

For Drupal's apachesolr module to use this configuration, simply set the hostname to localhost and the port to 8080 in the module configuration page. Rebuild your search index and you should be right to go.

If you have a lot of index updates you could consider making the master write only and having 2 read only slaves; just change the IP addresses to point to the right hosts.
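That change is just a matter of repointing the backends. A sketch, assuming a second slave at the hypothetical address 192.168.201.163:

```
backend master_backend
    # writes only ever go to the master
    server solr-a 192.168.201.161:8080 weight 1 maxconn 512 check

backend read_backends
    # reads balanced across the two slaves
    server solr-b 192.168.201.162:8080 weight 1 maxconn 512 check
    server solr-c 192.168.201.163:8080 weight 1 maxconn 512 check
```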

For more information on Solr replication refer to the Solr wiki, and for more information on configuring haproxy refer to the manual. Thanks to Joe William and his blog post on load balancing CouchDB using haproxy, which helped me get the configuration I needed once I had decided what I wanted.

Check Drupal Module Status Using Bash

When you run a lot of Drupal sites it can be annoying to keep track of all of the modules contained in a platform and ensure all of them are up to date. One option is to set up a dummy site with all the modules installed and email notifications enabled. This is OK, but then you need to remember to enable the additional modules every time you add something to your platform.

I wanted to be able to check the status of all of the modules in a given platform from the command line. I started scratching the itch by writing a simple shell script that uses the Drupal updates server to check the status of all the modules. I kept polishing it until I was happy with it. Some bits of it are a little ugly, but that is mostly due to the limitations of bash. If I had to rewrite it I would use PHP or some other language which understands arrays/lists and has HTTP client and XML libraries.

The script supports excluding modules using an extended grep regular expression pattern and nominating a major version of Drupal. When there is a version mismatch it is shown in bold red, while modules where the versions match are shown in green. The script filters out all dev and alpha releases; after all, it is designed for checking production sites. Adding support for per module update servers should be pretty easy to do, but I don't have any modules to test this with.
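At its heart the script just pulls the release history feed for each module and scrapes out the newest version. A minimal sketch of that step (the module name, core version and sample XML are illustrative; the real script fetches the feed with wget and handles exclusions and colour output):

```shell
#!/bin/sh
# Release histories live at updates.drupal.org/release-history/<module>/<core>
module="views"
core="6.x"
url="http://updates.drupal.org/release-history/${module}/${core}"

# Real script: xml=$(wget -q -O - "$url")
# Sample response, trimmed to the element we care about:
xml='<project><title>Views</title><release><version>6.x-2.11</version></release></project>'

# Extract the first <version> element - the latest release in the feed.
latest=$(printf '%s' "$xml" | sed -n 's|.*<version>\([^<]*\)</version>.*|\1|p' | head -n 1)
echo "$latest"   # prints 6.x-2.11
```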

To use the script, download it and save it somewhere handy, such as ~/bin/check-module-status.sh, then make it executable:

chmod +x ~/bin/check-module-status.sh

Now run it against a platform and wait for the output:

~/bin/check-module-status.sh /path/to/drupal

Packaging Drush and Dependencies for Debian

Lately I have been trying to avoid installing non packaged software on production servers. The main reason for this is to make it easier to apply updates. It also makes it easier to deploy new servers using meta packages when everything is pre packaged.

One tool which I am using a lot on production servers is Drupal's command line tool, drush. Drush is awesome; it makes managing Drupal sites so much easier, especially when it comes to applying updates. Drush is packaged for Debian testing, unstable and lenny backports by Antoine Beaupré (aka anarcat) and will be available in universe for Ubuntu lucid. Drush depends on PEAR's Console_Table module and includes some code which automagically installs the dependency from PEAR CVS. The Debianised package includes the PEAR class, which is handy, but if you are building your own debs from CVS or the nightly tarballs, the dependency isn't included. The auto installer only works if it can write to /path/to/drush/includes, which in these cases means calling drush as root; otherwise it spews a few errors about not being able to write the file and then dies.

A more packaging friendly approach would be to build a Debian package for PEAR's Console_Table and have it as a dependency of the drush package in Debian. The problem with this approach is that drush currently only looks in /path/to/drush/includes for the PEAR class. I have submitted a patch which first checks whether Console_Table has been installed via the PEAR installer (or another package management tool). Combine this with the Debian source package I have created for Console_Table (see the file attached at the bottom of the post) and you can have a modular, apt managed instance of drush, without having to duplicate code.

I have discussed this approach with anarcat, he is supportive and hopefully it will be the approach adopted for drush 3.0.

Update: The drush patch has been committed and should be included in 3.0alpha2.