Empowered to say ‘No’

Seth Godin writes in his post License to stall about business-to-business sales. He explains that most of the people you encounter are empowered only to stall, to say ‘No’, because such people are easier to train and safer.

The reason this system evolved is straightforward: the ‘Yes’ people are rare in a typical organization because they hold the responsibility and power, so they are busy and need to be protected.

I totally see how this works, but I immediately thought that this paradigm was equally relevant in the area of Customer Service.

How often is it that – unless we have a very trivial request – the first person we speak to will be able to help us right away? “I’ll just put you through to so-and-so”, or “I need to talk to my manager/supervisor. Can I call you back?” are par for the course.

I am becoming convinced that for any given company, having quality Customer Service is more important than the quality of their product or service. For a large established company, quality Customer Service seems to me to be more important than sales or acquiring new customers. And yet it seems that the larger the company, the poorer the customer service. They have huge outsourced, off-shore, standardised systems that have you pushing every number on the keypad before you get through to the wrong department, which can’t help you and can’t transfer you to the appropriate person.

These systems are set up to be defensive – the goal is to reduce cost, which means getting rid of the customer as soon as possible. That is not customer service.

Oh, and customer service also includes website usability, information architecture and the quality of your search tool; read Lance Wiggs’ experience with Dell.

I’ve still not heard back from Dell about the dead graphics adapter in my Inspiron notebook, either. But that’s another post.

Posted in Commentary | Leave a comment

Doug Bowman is an outstanding web designer

…who was, until recently, Visual Design Lead at Google.

He now works for Twitter.

It saddens me to see so many people knocking Doug for the less-than-inspiring design work found in many of Google’s products. Take this Gawker / Valleywag article, particularly the comments, for example.

If any of these deriders knew of Doug and the standard and quality of the work he produces, they’d be holding their tongues.

It helps to know the full story, so here it is from the horse’s mouth. It also helps to know a bit of background from others, like Joe Clark, and to get a picture of the sort of people who run the show at Google, such as Marissa Mayer.

It’s surprising Doug stuck it out at Google for as long as he did. With such a vast array of products, and what appears to be just one visual designer amongst a multitude of engineers and mathematicians, it must have been an incredibly trying experience.

All the best, Doug, for your move to Twitter! I hope they can really let you go to work, and I can’t wait to see the outcome!

Posted in Commentary, Design | 3 Comments

Making 2 million a year from blogging. Without ads.

Read it here.

Posted in Commentary | Leave a comment

SunSpider JavaScript benchmarking

I’ve had a blog post in draft for ages, but this isn’t it. That draft has extensive, half-finished tables of stats and was getting a bit out of hand, so here’s a very abridged version.

The post looked at benchmarking the JavaScript performance of various browsers. Here is a sneak preview:

Browser: total time (and tolerance)
Firefox 3 (v3.0.4): 3772.6ms +/- 1.3%
Google Chrome (v0.4.154.29): 1808.0ms +/- 6.6%
Opera (v10.00 alpha): 4696.4ms +/- 3.6%
IE6 (v6.0.2900.5512 or something): 47651.0ms +/- 15.1%
Safari (v3.1.2): 4260.8ms +/- 5.1%

Of course these stats mean nothing without some explanation, so here goes.
I’m using the SunSpider JavaScript benchmark. You should read what it’s actually benchmarking: it exercises core JavaScript only (raytracing, string operations and the like), with no DOM or browser API tests.
I ran these tests under Windows XP, on a Dell Inspiron 9300.
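
To give a rough feel for the flavour of test being timed, here’s a toy example of my own (it’s not part of SunSpider, just an illustration): a snippet that times a pure-JavaScript string-building loop, with no DOM work inside the timed section.

    <script type="text/javascript">
      // illustrative micro-benchmark only; SunSpider's own tests are more involved
      var start = new Date().getTime();
      var s = '';
      for (var i = 0; i < 20000; i++) {
        s += String.fromCharCode(97 + (i % 26)); // pure string work, no DOM access
      }
      var elapsed = new Date().getTime() - start;
      document.title = 'string-build: ' + elapsed + 'ms';
    </script>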

My draft post was, and still is, going to include the browsers on my MacBook Pro as well, plus results from MooTools’ SlickSpeed JavaScript framework benchmark tool.

The results are interesting, especially with IE6 thrown into the mix. The separation between the old and new generations of browsers is clear.
Opera 10.00 alpha was only released yesterday, running on a new rendering engine, Presto 2.2, with Opera’s Futhark JavaScript engine, so I expected it to be a bit faster than it is; Safari, FF3, and Chrome are all quicker.

Google Chrome is considerably faster than the others.

I do most of my work on the Mac, and so use Firefox with its extensive array of invaluable extensions, particularly Joe Hewitt’s Firebug, and Chris Pederick’s Web Developer Toolbar. On the PC, I use Firefox, and occasionally Chrome, when I feel the need for speed.

Posted in Commentary | 3 Comments

Google Chrome

Google have today released their new web browser, Google Chrome, in beta.

Man, is it fast! I’ve been using it this evening and it really flies. While it supposedly uses more memory than other browsers, the extra memory is down to the fact that it spawns a new process for each tab, so if, for example, some nasty JavaScript on a particular page decides to strangle your browser, just that one tab will be affected. This also allegedly prevents the memory leaks that plague other browsers.

It’s only available to Windows XP/Vista users at this stage; I’m eagerly awaiting the Mac version.

Anyway, back to some awesome browsing…

Posted in Commentary, Gadgets, Tools | 1 Comment

Goodbye .clearfix, old friend.

You all know the old ‘floated elements inside a container cause the container to collapse’ problem?

Well, up until recently, I’d always just called on an old friend, .clearfix, and he’d sorted it out for me. I met .clearfix three or four years ago, and hardly a project has gone by since where I haven’t required his services. You know, I didn’t really ask him how he did it; he’d just quietly go about his job as I directed him, much like Michael Clayton, but without the gambling problem.

But yesterday, when I asked him to help out a colleague for me, little did I know I’d given him his last assignment.

Something happened early this morning – completely coincidentally. A little bird (or rather a tweet) came by and told me that, despite the fact Blueprint CSS uses .clearfix, .clearfix was no longer the way to go. And to be honest, it was almost a relief; the reason I’d never really asked .clearfix how he did his work was that deep down I knew that he was really a hack, and that if I just turned a blind (or ignorant) eye, then I could just pretend like everything was okay and we could all just carry on getting work done.

.clearfix is indeed inelegant, and really is a hack. And would you believe that the alternative solution is not so tricky…
Setting overflow: auto; (or overflow: hidden;) on the container, along with a width, will sort it all out for you.
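For the record, here’s a minimal sketch of the idea (the class names, widths and colours are mine, purely for illustration):

    <style type="text/css">
      /* giving the container a width and overflow: hidden (or auto)
         makes it enclose its floated children instead of collapsing */
      .container { width: 300px; overflow: hidden; background: #eee; }
      .col       { float: left; width: 140px; }
    </style>

    <div class="container">
      <div class="col">Left column</div>
      <div class="col">Right column</div>
    </div>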
Rather than explain it all here, I’ll just link to a few articles that have already put further effort into describing this:

So, farewell, .clearfix, you’ve served us well.
…and thank you, MB, for the enlightenment.

Posted in Coding, Link, Tools, Web Standards | 4 Comments

World’s first ten iPhone 3G owners

Queued outside Vodafone’s Queen St, Auckland, store.

Hi John and Ben!

Posted in Commentary | Leave a comment

Vodafone New Zealand release iPhone pricing and details

See the Vodafone New Zealand website.

And wow, the reaction has been nothing less than scathing across the board! I think I saw perhaps one positive tweet, but otherwise there was lots of wailing and gnashing of teeth.

Cost to purchase an iPhone outright:
8GB: NZ$979 (about US$740)
16GB: NZ$1129 (about US$853)

Ouch. More detailed pricing and plans are here.

While Vodafone spokesman Kursten Shalfoon says the data prices are sharp by New Zealand standards (and in all honesty, he’s probably right), that’s only because NZ is still behind the rest of the world when it comes to mobile (and fixed line) pricing and performance.
To be fair, a smaller market and fewer competitors (or, as some would say, a monopoly) have dictated this, but people were really expecting something more in line with US pricing.

Ars Technica have a table of global pricing. While the NZ prices haven’t been added yet, it’s clear that NZ is the most expensive market by a long shot.

Posted in Commentary, Gadgets | 1 Comment

The ASP.NET MVC framework

Jake Scott commented on my previous post and drew my attention to the ASP.NET MVC framework.

Now that Jake mentions it, I do recall hearing something about an ASP.NET MVC framework a while ago.

While this looks like a great step in the right direction, there are already MVC application development frameworks around that can easily produce great front-end code: Ruby on Rails, Django and CakePHP, to name just a few.

I guess the question isn’t so much ‘why does ASP .NET produce such poor web front-ends?’, but probably something more along the lines of ‘why is the ASP .NET platform chosen to build these sites over other seemingly more suitable platforms or frameworks?’ or even, ‘why are these sites built with a complete disregard for good front-end coding practices?’.

It’s one thing to have the right tools at your disposal; it’s another to know which tools are most appropriate for the task at hand, and another still to know how best to use the tools you’ve chosen.

Posted in Commentary | 1 Comment

Does developing in .aspx produce bloated code?

This started out as a comment in response to Robbie’s question on my previous post, but I thought I’d turn it into another post:

“So is it that they are written in aspx which makes them bloated?”

Well, I guess that’s what I’m kinda implying.

Building on the .NET framework with the Visual Studio IDE tends to mean that lots of components and snippets are provided for you. So by default you get things like all of your page content wrapped in a single <form> element, the lovely __VIEWSTATE hidden input field, and so on.
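
For anyone who hasn’t seen it, this is roughly the sort of markup a default WebForms page spits out (a hypothetical fragment; the IDs, action and truncated values are just for illustration):

    <form name="aspnetForm" method="post" action="Default.aspx" id="aspnetForm">
      <input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE"
             value="/wEPDwUKMTY1NDU2MTA1Mm..." />
      <!-- the encoded value above can easily run to several kilobytes -->
      <input type="hidden" name="__EVENTVALIDATION" id="__EVENTVALIDATION" value="..." />
      <!-- all of the actual page content sits inside this one page-wide form -->
      <div>
        <span id="ctl00_ContentPlaceHolder1_Label1">Hello, world</span>
      </div>
    </form>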

I think there can be a tendency amongst back-end developers (.NET and Java developers, say) to produce something that works well technically, then take the PSDs the designer gave them and bolt them on the front. You end up with something that looks like it’s supposed to, and works more or less like it’s supposed to, but the interface between the front- and back-ends is ugly, slow and inefficient. Of course this often has a considerable negative effect on the end user.

While this may be a generalisation, I’ve experienced it first-hand, with developers who struggle to understand, or at least to show some care about, good, clean, lightweight, semantic front-end code that performs well in the browser.
When you’re developing with .NET in an IDE like Visual Studio, you have to put in a bit more effort to get that good code.

Developing in .NET doesn’t necessarily produce bloated code, but if you took a look at the average .NET (.aspx) site, I think you’d find greater code-bloat than in your average hand-crafted-in-TextMate code.

Posted in Coding, Commentary, Design, Tools | 2 Comments