Blogs

Western Digital to fix Licensing?

Over the last few months I've been corresponding with Dennis Ulrich of Western Digital (WDC) about my concerns with the EULA for the My Book World Edition (MBWE) and their obligations under the GPL. To say it has been a drawn-out process is an understatement.

It has taken some time to get WDC to understand the situation. There have been confusing messages about the status of the EULA, the GPL and which license covers which pieces of code. The bottom line is that currently users must check the header of each file to ascertain which license applies to it, even though the downloads are marked as GPL.

Although WDC is moving slowly, they do seem to be committed to making the situation clearer in the next iteration of their MBWE product line. Based on a recent phone call with Dennis, the legal and engineering teams are working together to ensure the various licenses are complied with and that their software engineers are aware of their obligations.

The next version of the MBWE is likely to ship with a revised EULA that properly informs users of their rights under the GPL. This text is still being developed. At the same time it is still unclear if WDC will backport the changes to their existing NAS products. As it is a simple string change, it shouldn't be too hard for them to dedicate a few resources to auditing the code and updating the strings.

It is unfortunate that WDC appears to have little interest in developing the MBWE product range as a hacker-friendly FOSS product. It appears the licensing fix WDC implements will simply delineate the FOSS and non-free components of the MBWE firmware more clearly. Clarity is always an important legal consideration, but it doesn't help foster a community, and WDC seems to have no real interest in building a hacker community around their products. This is a disappointing decision by the company.

Many manufacturers of embedded devices only start releasing source for their firmware after being caught violating the GPL. WDC is to be commended for complying with the requirements of the GPL from the start. Although there is no legal requirement for them to make the web GUI code and other non-free components available, WDC already does. It would be disappointing if they chose to take a backward step and stopped distributing parts of the firmware.

It is possible, and perfectly legal, for WDC to stop distributing the source for the proprietary components. At the same time, it would not take much effort for them to release the whole platform under the GPL or another FOSS-friendly license. WDC is already required to do a code drop every time they release a new version of their hardware or firmware. I suspect it would be faster and easier to push all the code to a public git repository than to pick through it and dump selected components as tarballs on a website.

WDC already has a support team dealing with customer bug reports. Maintaining a mailing list, a bug tracker, a wiki and maybe a public source code repository somewhere like Gitorious is likely to take less than one full-time employee. The benefit to WDC would be great.

Not only is the hacker community likely to contribute bug fixes and propose or even develop new features, it can also help increase sales. I'm sure the goodwill generated by a switch to a truly open approach for the MBWE product line would outweigh the cost of the additional resources required.

Let's hope Western Digital's fix is a FOSS-friendly one. I will post more news as things progress.

Hello Planet Ubuntu Users and Ubuntu Universe

One of my new (financial) year resolutions was to increase the readership of my blog. As part of this plan I have been trying to get my blog syndicated on relevant planets. The latest on my list has been Planet Ubuntu Users and the associated Ubuntu Universe. About 24 hours ago my blog was added to both aggregators. Thanks Tiago!

I run Ubuntu on my Dell D830 laptop (my primary machine). I run various flavours and versions of Ubuntu on desktops, servers and VMs both for my business and clients.

In 2009 I converted the local community-run internet cafe to Ubuntu. When they were running XP there were problems almost weekly; since the switch, the only real issue they have had was a failed automatic update for Firefox last week.

As for off-the-clock Linux time, my kids associate Tux with computers, while my partner is more than happy running LTS releases on her laptop. The only desktop OS my mother-in-law has used is Ubuntu Linux.

I have been active in the Australian Ubuntu LoCo for quite a few years.

My Ubuntu-related posts are generally technical howtos. Most of them are system administration related, with a few tips on home-brew Ubuntu packaging and web app development.

The untagged feed of this blog contains a good mix of geeky stuff about whatever we are working on at DHC, rants about annoying things and the occasional thing from real life. I hope there is something there that you find useful. Happy reading.

Hello Slicehost Planet

Hi, I'm Dave and it has been 10 days since I ran up a new slice.

I have been using Slicehost for over two years to host various VMs. I run a Free / Open Source focused IT consulting and programming business based in Victoria, Australia.

This blog contains a good mix of geeky stuff about whatever we are working on at DHC, rants about annoying things / companies, along with the occasional thing from "real life". A lot of what gets blogged about is running on Slicehost VMs, or at least has been tested on them.

This blog is now being syndicated on the Slicehost Planet. The future of the Slicehost Planet is unclear; support have suggested it may suffer the same fate as USA-193, while others insist it is staying. For now at least they are adding new blogs, so long as they are hosted on their VMs. If you have a blog running on a slice, email support at Slicehost and ask them to add your feed to the planet.

Multi-Core Apache Solr on Ubuntu 10.04 for Drupal with Auto Provisioning

Apache Solr is an excellent full-text search engine based on Lucene. Solr is increasingly being used in the Drupal community for search, and I use it for a lot of my projects. Recently Steve Edwards at Drupal Connect blogged about setting up a multi-core Solr server on Ubuntu 9.10 (aka Karmic). Ubuntu 10.04 LTS was released a couple of months ago and it makes the process a bit easier, as Apache Solr 1.4 has been packaged. An additional advantage of using 10.04 LTS is that it is supported until April 2015, whereas support for 9.10 ends in 10 months, in April 2011.

As an added bonus, in this howto you will be able to auto provision Solr cores just by calling the right URL.

In this tutorial I will be using Jetty rather than Tomcat, which some tutorials recommend, as Jetty performs well and generally uses fewer resources.

Install Solr and Jetty

Installing Jetty and Solr just requires a single command:

$ sudo apt-get install solr-jetty openjdk-6-jdk

This will pull down Solr and all of its dependencies, which can be a lot if you have a very stripped-down base server.

Configuring Jetty

Configuring Jetty is very straightforward. First we back up the existing /etc/default/jetty file like so:

sudo cp -a /etc/default/jetty /etc/default/jetty.bak

Then simply change your /etc/default/jetty to match the listing below; the key changes are NO_START and JETTY_HOST:

# Defaults for jetty see /etc/init.d/jetty for more

# change to 0 to allow Jetty to start
NO_START=0
#NO_START=1

# change to 'no' or uncomment to use the default setting in /etc/default/rcS 
VERBOSE=yes

# Run Jetty as this user ID (default: jetty)
# Set this to an empty string to prevent Jetty from starting automatically
#JETTY_USER=jetty

# Listen to connections from this network host (leave empty to accept all connections)
#Uncomment to restrict access to localhost
#JETTY_HOST=$(uname -n)
JETTY_HOST=solr.example.com

# The network port used by Jetty
#JETTY_PORT=8080

# Timeout in seconds for the shutdown of all webapps
#JETTY_SHUTDOWN=30

# Additional arguments to pass to Jetty    
#JETTY_ARGS=

# Extra options to pass to the JVM         
#JAVA_OPTIONS="-Xmx256m -Djava.awt.headless=true"

# Home of Java installation.
#JAVA_HOME=

# The first existing directory is used for JAVA_HOME (if JAVA_HOME is not
# defined in /etc/default/jetty). Should contain a list of space separated directories.
#JDK_DIRS="/usr/lib/jvm/default-java /usr/lib/jvm/java-6-sun"

# Java compiler to use for translating JavaServer Pages (JSPs). You can use all
# compilers that are accepted by Ant's build.compiler property.
#JSP_COMPILER=jikes

# Jetty uses a directory to store temporary files like unpacked webapps
#JETTY_TMP=/var/cache/jetty

# Jetty uses a config file to setup its boot classpath
#JETTY_START_CONFIG=/etc/jetty/start.config

# Default for number of days to keep old log files in /var/log/jetty/
#LOGFILE_DAYS=14

If you don't include the JETTY_HOST entry, Jetty will only bind to the local loopback interface, which is all you need if your Drupal web server is running on the same machine. If you do set JETTY_HOST, make sure you configure your firewall to restrict access to the Solr server.

Configuring Solr

I am assuming you have already installed the Apache Solr module for Drupal somewhere. If you haven't, do that now, as you will need some config files which ship with it.
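
If you manage your sites with drush, downloading the module is a one-liner. This is just a convenience sketch; it assumes drush is installed, and by default the module will land under sites/all/modules rather than a contrib subdirectory, so adjust the path in the copy command later in this howto to match where the module actually ends up.

cd /path/to/drupal-install
drush dl apachesolr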

First we enable the multicore support in Solr by creating a file called /usr/share/solr/solr.xml with the following contents:

<solr persistent="true" sharedLib="lib">
 <cores adminPath="/admin/cores" shareSchema="true" adminHandler="au.com.davehall.solr.plugins.SolrCoreAdminHandler">
 </cores>
</solr>

You need to make sure the file is owned by the jetty user if you want it to be dynamically updated. Otherwise, change persistent="true" to persistent="false", leave out the adminHandler attribute and skip the commands below. If you want to auto provision cores, you will also need to download the jar file attached to this post and drop it into the /usr/share/solr/lib directory (which you'll need to create).

sudo chown jetty:jetty /usr/share/solr
sudo chown jetty:jetty /usr/share/solr/solr.xml
sudo chmod 640 /usr/share/solr/solr.xml
sudo mkdir /usr/share/solr/cores
sudo chown jetty:jetty /usr/share/solr/cores
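
If you are using the auto provisioning handler, you also need to create the lib directory mentioned above and drop the jar from this post into it. The jar path and filename below are only placeholders; use wherever you saved the attached file and whatever it is actually called.

sudo mkdir /usr/share/solr/lib
sudo cp /path/to/SolrCoreAdminHandler.jar /usr/share/solr/lib/  # placeholder path and filename
sudo chown -R jetty:jetty /usr/share/solr/lib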

To keep your configuration centralised, symlink the file from /usr/share/solr to /etc/solr. Don't do it the other way around; Solr will ignore the symlink.

sudo ln -s /usr/share/solr/solr.xml /etc/solr/

Solr needs to be configured for Drupal. First we back up the existing config files, just in case, like so:

sudo mv /etc/solr/conf/schema.xml /etc/solr/conf/schema.orig.xml
sudo mv /etc/solr/conf/solrconfig.xml /etc/solr/conf/solrconfig.orig.xml

Now we copy the Drupal Solr config files from where you installed the module:

sudo cp /path/to/drupal-install/sites/all/modules/contrib/apachesolr/{schema,solrconfig}.xml /etc/solr/conf/

Solr needs the path to exist for each core's data files, so we create them with the following commands:

sudo mkdir -p /var/lib/solr/cores/{,subdomain_}example_com/{data,conf}
sudo chown -R jetty:jetty /var/lib/solr/cores/{,subdomain_}example_com

Each of the cores needs its own configuration files. We could implement some hacks to use a common set of configuration files, but that would make life more difficult if we ever have to migrate some of the cores. Just copy the common configuration into each of the cores:

sudo bash -c 'for core in /var/lib/solr/cores/*; do cp -a /etc/solr/conf/ $core/; done'

If everything is configured correctly, we should just be able to start Jetty like so:

sudo /etc/init.d/jetty start
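
Before testing in a browser, it can be worth confirming that Jetty is listening on the address you expect, particularly if you set JETTY_HOST earlier. A quick check (the exact output will vary):

sudo netstat -plnt | grep 8080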

If you visit http://solr.example.com:8080/solr/admin/cores?action=STATUS you should get some XML that looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
<response>
	<lst name="responseHeader">
		<int name="status">0</int>
		<int name="QTime">0</int>
	</lst>
	<lst name="status"/>
</response>

If you get the above output, everything is working properly.
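
If you are working on a headless server, you can run the same check from the command line with curl (assuming curl is installed):

curl "http://solr.example.com:8080/solr/admin/cores?action=STATUS"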

If you enabled auto provisioning of Solr cores, you should now be able to create your first core. Point your browser at http://solr.example.com:8080/solr/admin/cores?action=CREATE&name=test1&i... If it works, you should get output similar to the following:

<?xml version="1.0" encoding="UTF-8"?>
<response>
	<lst name="responseHeader">
		<int name="status">0</int>
		<int name="QTime">1561</int>
	</lst>
	<str name="core">test1</str>
	<str name="saved">/usr/share/solr/solr.xml</str>
</response>

I would recommend using identifiable names for your cores, so for davehall.com.au I would call the core "davehall_com_au" so I can easily find it later on.

Security Note: As anyone who can access your server can now provision Solr cores, make sure you restrict access to port 8080 so that only trusted IP addresses are allowed.
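
One way to do this on Ubuntu is with ufw. The rules below are only a sketch - 192.0.2.10 is a placeholder for your Drupal web server's address and the SSH rule assumes you manage the box remotely - so adjust them to suit your environment.

sudo ufw allow 22/tcp                                      # keep SSH access before enabling the firewall
sudo ufw allow from 192.0.2.10 to any port 8080 proto tcp  # allow the trusted web server to reach Solr
sudo ufw deny 8080/tcp                                     # block everyone else from Solr
sudo ufw enable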

For more information on the commands available, refer to the Solr Core Admin API documentation on the Solr wiki.

Next in this series will be a post on how to use this auto provisioning setup to allow Aegir to provision Solr cores as sites are created.

Site Refresh

Our site hasn't changed very much over the last 4 years, but the business has changed a lot. The biggest change was the (uneventful and long overdue) upgrade to Drupal 6 a few months ago.

During the last week or so the site has been updated and refocused. The major changes include:

This also signals our return to regular blogging. There are a few posts in the pipeline, and there should be a good mix of Drupal and sysadmin posts in the coming weeks.

As always, feedback is welcome.

eBook Review: Theming Drupal: A First Timer’s Guide

My experience theming Drupal, like most of my coding skills, has been developed by digging up useful resources online and some trial and error. I have an interest in graphic design, but never really studied it. I can turn out sites which look good, but my "designs" don't have the polish of a professionally designed site. I own quite a few (dead tree) books on development and project management. Generally I like to read when I am sick of sitting in front of a screen, so the only ebooks I consider reading are short ones.

Emma Jane Hogbin offered her Drupal theming ebook, Theming Drupal: A First Timer’s Guide, to her mailing list subscribers for free. I am not a big fan of vendor mailing lists; most of the time I scan the messages and hit delete before reaching the bottom. In Emma's case, rumour has it that her list really is worthwhile - especially if you are a designer interested in theming Drupal. Emma also offered free copies of her ebook to those who begged, so I subscribed and I begged.

The first thing I noticed about the book was the ducks on the front cover; I'm a sucker for cute animal pics. The ebook is derived from Emma's training courses and the book she coauthored with Konstantin Kaefer, Front End Drupal. Readers are assumed to have some experience with HTML, CSS and PHP. The book is pitched at designers and programmers who want to get into building themes for Drupal.

The reader is walked through building a complete Drupal theme. The writing is detailed and includes loads of references for obtaining additional information. It covers building a page theme, content-type-specific theming and the various base themes available for Drupal. The book is a very useful resource for anyone working on a Drupal theme.

Although I have themed quite a few Drupal sites, Emma's guide taught me a few things. The book is a good read for anyone who wants to improve their knowledge of Drupal theming. Now to finish reading Front End Drupal ...

Flight Report MEL > SYD > SFO on United

To get to DrupalCon, I flew with United on UA840, then on to UA870, on Wednesday. I went with them for two reasons: they were cheap and I would earn miles on Thai. I was a little disappointed that my budget didn't stretch to Air New Zealand, as I was looking forward to flying with them again after an excellent experience in January. In the end, though, I was really impressed with United.

At check-in I used the business counter, one of the benefits of Gold status. The agent was really friendly and answered my questions about security requirements when flying to the US. When I asked about being moved up to Economy Plus using my Thai Gold status the agent checked and gave me an aisle seat. I was looking forward to the extra 5 inches of leg room.

Next was off to the Air NZ lounge for some pancakes for breakfast - I love that machine. The Air NZ staff were friendly as always.

Boarding was delayed by 10 minutes or so, but staff kept people updated. Take-off was really delayed, with not much explanation. The snack was ok - pretzels and a juice - but don't expect much more on such a short flight.

In Sydney I had a light lunch in the Air NZ lounge, then it was off to boarding for San Francisco. There was a queue for economy boarding, while the premium queue was empty. One of the benefits of Star Alliance Gold status is supposed to be priority boarding, but it seems United only offers this to their own elites.

The inflight entertainment on United's 747s is awful; they only offer a shared screen and the radio options are really limited. Good thing I packed some books and my laptop. Dinner was pretty bad: beans, peas, corn and stale mushrooms in a tasteless sauce that was supposed to be a curry, served with rice. The baked beans for breakfast were ok. Through trial and error I have learnt that AVML (Asian/Indian Vegetarian) meals are usually the best vego option, but I will be changing my selection for my return flight.

What really impressed me was the staff. Like Air NZ's flight attendants, they seem like real people; they engage with the passengers and treat them as individuals. My ability to open the economy red wine bottles became a bit of a running joke with one flight attendant. After chatting with staff about Napa Valley reds, my glasses of wine started to come from the front of the plane, which was very nice. Tom, the economy purser, was happy to have a chat with me about United, which inspired me to write this post. When my laptop battery died the flight attendants offered to let me charge it in business class, but unfortunately I had checked my AU > US power adaptor so I couldn't take them up on it. I didn't really sleep on the flight, but thanks to the good service I enjoyed just about every minute of it. The spare seat next to me also gave me some extra room to spread out, which is always handy in economy.

United's planes might be old and the entertainment just as dated, but the staff make up for that. I hope the people who went out of their way to make my flight as enjoyable and comfortable as possible get the recognition they deserve, even if they did bend the rules a little.

First Impressions Motorola Dext and Drupal Editor for Android

Today I purchased a Motorola Dext (aka Cliq) from Optus. Overall I like it. It feels more polished than the Nokia N97 I bought last year, and the range of apps is good. Even though the phone only ships with Android 1.6, version 2.1 for the Dext is due in Q3 2010.

The apps run nice and fast, and the responsive touch screen is bright and clear. I am yet to make a call on it from home, but the 3G data seems as fast as my Telstra 3G service, so the signal should be ok.

The keyboard is very functional, albeit cramped for my fat thumbs. The home screen is a little cluttered for my liking too, but it won't take much to clean that up. I will miss my Funambol sync, which is only available for Android 2.x.

I started writing this post using the Drupal Editor for Android app, which is pretty nice. The GPL app uses XML-RPC and Drupal core's Blog API module. Overall it feels like a stripped down version of Bilbo/Blogilo. Drupal Editor is an example of an app which does one thing, and does it simply but well. The only thing I haven't liked about it happened while originally writing this post: I bumped the save button and published an incomplete and poorly written draft. Next time I will untick the publish checkbox until I am ready to really publish it.

I would still like an HTC Desire, but Telstra is only offering them on a $65 plan with no value. The Nokia N900 was off my list due to the USB port of death and Nokia's spam policies. The Nexus One was on the list too, but a local warranty was a consideration.

ACMA Investigates Nokia for SMS Spam

The ongoing saga of Nokia's txt spam continues.

The bad news is that I received another txt from Nokia today. This comes after being told by Nokia that I would no longer receive any txts from them. The message reads:

Tip: Use less battery power and help conserve energy with a few helpful tips from Nokia. Visit http://environment.nokia.mobi to learn more.

I have an energy saving tip for Nokia: stop sending txt messages I don't want, then I won't waste energy trying to make them stop.

Now for the good news. A little while ago ACMA told me that they were preparing to launch a formal investigation into Nokia's SMS messages under the Spam Act. It's now official. Late last Friday I received the following email from the ACMA Investigator handling the matter:

Dear Mr Hall

I write with reference to your complaint #XXX concerning allegations of breaches of the Spam Act 2003 (Spam Act). The Australian Communications and Media Authority (ACMA) has now commenced an investigation into Nokia Australia Pty Ltd about potential contraventions of the Spam Act.

During the course of this investigation, the ACMA may require you to provide more information about any dealings you have had with Nokia Australia Pty Ltd and potentially complete a witness statement. The Anti-Spam Team will contact you in due course if this is required. The Anti-Spam Team does not provide your personal information to the business apart from the electronic account information (mobile telephone number) you have already provided.

As I am sure you can appreciate, the ACMA is not able to disclose details of the investigation with you, but will advise you when an outcome has been determined. On behalf of the ACMA, I appreciate your assistance in this investigation and thank you for your cooperation.

Information about the Spam Act is available on our website at www.spam.acma.gov.au. Please contact us if you have any queries.

I wonder how much energy Nokia will put into defending itself to ACMA.

Watch this space for more news.

X Mail Headers on identi.ca

A while ago I submitted a patch for StatusNet, the code that powers identi.ca and other microblogging platforms. The patch added X headers to email notification messages, to make it easier to filter them. The headers look something like this:

X-StatusNet-MessageType: subscribe
X-StatusNet-TargetUser: skwashd
X-StatusNet-SourceUser: username
X-StatusNet-Domain: identi.ca

The patch was included in the 0.9.1 release of StatusNet, and is now running on identi.ca. I think this is the highest traffic site running any of my code, and I am pretty excited about it.

This is one of the reasons why I love free software cloud services. Instead of just being a passive consumer of the service, you can actively contribute to its development.

You can run it too by downloading StatusNet 0.9.1. Enjoy.

Now to setup some mail filters in Zimbra ...