Jakob Nielsen strikes a combative note about most users’ search skills.
Most users are unable to solve even halfway complicated problems with search. Better to redirect their efforts into more supportive user interfaces when possible.
“If you are doing business on the web and have Google Analytics set up for your website, it’s very likely that you know the bounce rate for your website. But do you know anything about how it’s calculated, what your industry’s average bounce rate is, or even what factors affect your bounce rate? Inspired by common questions that we’ve heard, this infographic is meant to give you answers and some tips to help you improve your bounce rate.”
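For reference, analytics tools generally define bounce rate as the share of sessions in which the visitor viewed only one page before leaving. A minimal sketch of the calculation (the figures are illustrative):

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate: the fraction of sessions that viewed only one page."""
    if total_sessions == 0:
        return 0.0
    return single_page_sessions / total_sessions

# e.g. 420 single-page sessions out of 1,000 total sessions
rate = bounce_rate(420, 1000)
print(f"{rate:.0%}")  # 42%
```

Tools differ in the details (some exclude sessions with timed events, for instance), so treat this as the headline formula rather than a spec.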
Designing search: nice summary of autocomplete -> autosuggest -> instant results by Tony Russell-Rose. uxmag.com/articles/desig…
Although most of us are familiar with Google’s iconic “I’m Feeling Lucky” button, few of us use it; we know that search problems of any complexity require an iterative approach, comprising the creation and reformulation of queries. As-you-type suggestions have become invaluable in this experience. Auto-complete is better suited for known-item search and simple information retrieval tasks. Auto-suggest works well for exploratory search and complex information seeking tasks. And instant results provide a direct channel from queries to answers.
Use auto-complete to:
- Facilitate accurate and efficient data entry
- Select from a finite list of names or symbols
Use auto-suggest to:
- Facilitate novel query reformulations
- Select from an open-ended list of terms or phrases
- Encourage exploratory search (with a degree of complexity and mental effort that is appropriate to the task). Where appropriate, complement search suggestions with recent searches
Use instant results to:
- Promote specific items or products
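The distinction between the first two patterns can be sketched in code: auto-complete matches a prefix against a closed list of known items, while auto-suggest ranks open-ended past queries to encourage reformulation. The function names and data below are illustrative, not from the article:

```python
def autocomplete(prefix, vocabulary, limit=5):
    """Auto-complete: prefix-match against a finite list of known names."""
    p = prefix.lower()
    return [term for term in vocabulary if term.lower().startswith(p)][:limit]

def autosuggest(prefix, query_log, limit=5):
    """Auto-suggest: rank open-ended past queries containing the input,
    most popular first, to support exploratory reformulation."""
    p = prefix.lower()
    matches = {q: n for q, n in query_log.items() if p in q.lower()}
    return sorted(matches, key=matches.get, reverse=True)[:limit]

countries = ["France", "Finland", "Fiji", "Germany"]
print(autocomplete("fi", countries))  # ['Finland', 'Fiji']

log = {"cheap flights paris": 40, "paris hotels": 25, "paris weather": 10}
print(autosuggest("paris", log))  # most popular query first
```

Real implementations add ranking signals, typo tolerance and recent-search blending, but the core known-item vs exploratory split is the same.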
How do you know when your website works? When it comes to improving public services the answer may not always be obvious – but it is vital to set targets and measure against them. Dan Jellinek reports from ITU Live.
It is important for public bodies to know whether or not their websites are working, but managers sometimes focus on softer measures that are hard to interpret, such as ‘how many visits did we get?’, Peter Jordan, product analytics lead in the Government Digital Service (GDS) delivery team, told the ITU Live panel.
For an e-commerce website the hard measure is obvious – sales – but for public sector sites the equivalent is to look for tasks completed, Jordan said. “Like how many people started out on a transaction to renew car tax and completed it: that’s analogous to buying a pair of shoes.”
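The metric Jordan describes is just a completion rate over a funnel. A sketch, with made-up figures for the car-tax example:

```python
def completion_rate(started, completed):
    """Share of users who finished a transaction they started --
    the public-sector analogue of an e-commerce conversion rate."""
    return completed / started if started else 0.0

# hypothetical figures: 12,000 car-tax renewals started, 9,600 completed
print(f"{completion_rate(12_000, 9_600):.0%}")  # 80%
```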
Nick Breeze, senior customer insight manager at the GDS, said the importance of online task completion is that otherwise “they may need to pick up the phone, which is very costly”. One method the GDS has used to measure task completion, alongside more established methods such as live user testing, is to issue tracking software to large numbers of people, which can monitor their web use at home and report on what keystrokes they make, where they go and how long it takes to complete a task.
“One of the reasons we started to use it is the sheer cost efficiencies. We’re tracking 1,600 users, which is costing us a fraction of what it would face-to-face,” Breeze said. However, the most valuable insights come from combining different kinds of data, such as tying together what people say in online surveys with what they actually did, said Alex Henry, online segmentation and personalisation expert at Adobe UK. So if someone reports a poor experience with a form, and you can then see they spent 10 minutes on page 4 of that form, you know where the improvements are needed. Even more sophisticated data matching is needed to track the correlations between website use and call centre use, Henry said.
“You could just look at the raw data and say ‘Here are my call centre numbers, here’s the activity I’ve done online, now let’s compare in
a few weeks’ time to see if my call centres have gone down’. Or you can go down to the level where you’re giving web users a tailored phone number for example, once they get to a certain point in their journey, so that you can prep the call centre.
“So we have the same identifier in different data sources to say, ‘Here’s someone who is online, here’s how they interacted with the call centre, and here’s the summation of what happened’. We’re seeing a lot of effort at the moment where clients are taking online behaviour, tying it in with online survey responses and then comparing that with call centre data to see what’s going wrong.”
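The join Henry describes can be sketched as a lookup on a shared anonymised identifier. Everything here – visitor ids, field names, the 300-second threshold – is hypothetical, purely to show the shape of the matching:

```python
# Hypothetical records keyed by the same anonymised visitor id.
web_sessions = {"v1": {"form_page4_secs": 600}, "v2": {"form_page4_secs": 45}}
surveys = {"v1": {"rating": "poor"}, "v2": {"rating": "good"}}
calls = {"v1": {"called_helpline": True}}

def problem_visitors(sessions, surveys, calls, threshold_secs=300):
    """Join the three sources on visitor id and flag people who reported
    a poor experience AND dwelt on page 4, noting whether they phoned."""
    flagged = []
    for vid, session in sessions.items():
        if (surveys.get(vid, {}).get("rating") == "poor"
                and session["form_page4_secs"] > threshold_secs):
            flagged.append((vid, calls.get(vid, {}).get("called_helpline", False)))
    return flagged

print(problem_visitors(web_sessions, surveys, calls))  # [('v1', True)]
```

In practice this is done in an analytics platform rather than hand-rolled Python, but the design point is the same: one stable identifier across every data source.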
It is also important to combine and consider many sources of data when designing a website, said Jordan, including traffic data, the public body’s own business needs, and another hugely important source of information on what questions citizens want answered: search data. “In terms of web traffic, about 60% comes from web search: Google, Bing, Yahoo and so on – by far the biggest portion. So making sure you’re optimised for search is really important.”
There is plenty of good free material on search engine optimisation out there provided by Google and others, Jordan said – “just search for it”. Basically it breaks down into a research phase and production phase. “So say you’re talking about mountains, you should research the terminology people are actually using – they might be using peaks rather than mountains – or you may see there is a big gap about the Pyrenees where everybody is writing about the Alps, and then build that terminology into what you’re writing about.” Other important factors include increasing your links with social media, he said.
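The research phase Jordan describes boils down to counting the terms people actually type. A toy sketch over a hypothetical search log (the queries are invented for illustration):

```python
from collections import Counter

# Count the words people actually search with, so content can adopt
# their vocabulary (e.g. "peaks" vs "mountains", Alps vs Pyrenees gaps).
queries = ["alps peaks map", "pyrenees walking", "alps weather",
           "highest peaks alps", "mountains in spain"]
terms = Counter(word for q in queries for word in q.split())
print(terms.most_common(3))
```

Real keyword research uses tools like Google’s own keyword planners over far larger logs, but the principle – let the frequency counts choose your terminology – is exactly this.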
Finally, the panel turned to the controversial subject of cookies – small ‘strings’ or pieces of information delivered by a website and saved locally onto a user’s machine to help determine when the same user is returning, and tailor the response accordingly.

“There is a lot of misconception about what a cookie actually contains,” Henry said. “Take my mum, for example, who might browse a leading retailer’s website: she’ll then go back to her web-based email account and, on the right-hand side, she’ll see a very similar object to what she’s just browsed. And her interpretation of that is, ‘Oh, they were watching me, I don’t like that’.

“But actually, the advert’s always going to be there – are you happy for it to be more tailored to you, or would you just want something generic, and do you really mind? It’s still your choice whether you interact with that advert or not. Because it’s more relevant, people think the cookie’s reading everything they’re doing, which it absolutely isn’t.”
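Henry’s point that a cookie itself contains very little is easy to show. A minimal sketch of the kind of cookie a server might set – just a random, anonymised identifier, no personal data (the cookie name and lifetime are assumptions for illustration):

```python
from http.cookies import SimpleCookie
import uuid

# A first-party cookie holding only an opaque random id -- enough to
# recognise a returning visitor, with nothing personal stored inside.
cookie = SimpleCookie()
cookie["visitor_id"] = uuid.uuid4().hex
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 30  # expires after 30 days
cookie["visitor_id"]["path"] = "/"

print(cookie.output())  # the Set-Cookie header a server would send
```

The personalisation Henry describes happens server-side, keyed off that id; the cookie is just the key, not the data.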
Cookies are useful for government sites because they add detail to information on customer journeys, Jordan said. “There’s stuff that analytics can tell you without cookies, but you do lose sight of that anonymised personal information. To me, the way forward is around education and transparency about what you’re doing with cookies.” During the live session, an online poll of viewers delivered some interesting results on whether or not public sector website owners have already taken action to implement the EU privacy directive, which stipulates that all
UK organisations must obtain consent from their website users to place cookies on their computers by May 2012. Only around 8% of respondents said they already had new cookie policies, while 46% said they had not yet, but intended to in 2012; and the same number said they were waiting for further advice from the UK government before taking action.
So while all respondents accepted that some kind of action is bound to be needed, many are still waiting for clear guidance on cookies. But with just a few months to go to the compliance deadline there is not a lot of time to play with: cookies are now a hot topic.
Just read two thoughtful articles – Peter Yared’s Google already knows its search sucks (and is working to fix it) among them. These articles look at deeper issues than Google integrating Google+ into search results; Google+ is perhaps a wake-up call to what’s happening.
Battelle says, “…search is supposed to be about showing the best results to consumers based on objective (or at least defensible and understandable) parameters, parameters unrelated to the search engine itself.”
But as Yared points out, Google has always been a bit social: “if it helps, you can think of PageRank as a kind of paleo-social search – just one that moves way too slowly for the modern Web”. Search results are also increasingly gamed and commercialised – via paid search, content and link farms, and organisations’ investment in search engine optimisation.
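Yared’s “paleo-social” framing is apt: PageRank treats a link as an endorsement and lets endorsements flow through the graph. A toy power-iteration sketch (the three-page graph is invented, and this skips refinements like dangling-node handling):

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy PageRank: each page's score is fed by the pages linking to it,
    with a damping factor modelling random jumps."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs) if outs else 0.0
            for q in outs:
                new[q] += damping * share
        rank = new
    return rank

# page "a" is linked to by both "b" and "c", so it ends up ranked highest
links = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(links)
print(ranks)
```

The slowness Yared complains about falls out of the model: a page’s score only moves when other pages change their links, which happens on publishing timescales, not social-media ones.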
So Google has been increasingly refocusing its results, reducing the value of signals from PageRank and links and pushing ‘regular’ results down and, often, below the fold.
(Credit: Peter Yared/CNET)
With predictive search – suggesting or pushing users towards certain queries – and greater prominence given to answers, Google has the opportunity to reduce the long tail and monetise the search results page further. For organisations seeking to maintain their visibility in Google, there will need to be a move away from ‘traditional’ search engine optimisation. They will need to publish structured data that can seed ‘answers’, publish content that encourages engagement in social media by potential audiences and, of course, engage themselves.
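Publishing structured data that can seed ‘answers’ typically means embedding schema.org-style JSON-LD in pages. A hedged sketch – the service name, URL and fields below are purely illustrative:

```python
import json

# Illustrative schema.org-style JSON-LD describing a service, the kind
# of machine-readable markup a search engine can lift into an "answer".
snippet = {
    "@context": "https://schema.org",
    "@type": "GovernmentService",
    "name": "Renew vehicle tax",
    "url": "https://example.gov/renew-vehicle-tax",  # hypothetical URL
}

# Embed it in a page as a script tag:
print(f'<script type="application/ld+json">{json.dumps(snippet)}</script>')
```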