Bounce Rate Demystified

From Kissmetrics

“If you are doing business on the web and have Google Analytics setup for your website, it’s very likely that you know the bounce rate for your website. But, do you know anything about how it’s calculated, what your industry’s average bounce rate is or even what factors affect your bounce rate? Inspired by common questions that we’ve heard, this infographic is meant to give you answers and some tips to help you improve your bounce rate.”
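The standard calculation the infographic covers — bounce rate as the share of sessions that view only a single page — can be sketched as follows (hypothetical session data, not a real analytics API):

```python
# Bounce rate = single-page sessions / total sessions (illustrative data).
def bounce_rate(session_pageviews):
    """session_pageviews: list of page-view counts, one entry per session."""
    if not session_pageviews:
        return 0.0
    bounces = sum(1 for pages in session_pageviews if pages == 1)
    return bounces / len(session_pageviews)

# Five sessions, three of which viewed only one page:
print(bounce_rate([1, 3, 1, 2, 1]))  # 0.6, i.e. a 60% bounce rate
```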



20 December 2012 at 22:33 Leave a comment

Periodic Table of Website Conversion


18 November 2012 at 23:02 Leave a comment

Designing Search: As-You-Type Suggestions

Designing search: a nice summary of the progression from autocomplete to autosuggest to instant results by Tony Russell-Rose. uxmag.com/articles/desig…

Summary here:

Although most of us are familiar with Google’s iconic “I’m Feeling Lucky” button, few of us use it; we know that search problems of any complexity require an iterative approach, comprising the creation and reformulation of queries. As-you-type suggestions have become invaluable in this experience. Auto-complete is better suited for known-item search and simple information retrieval tasks. Auto-suggest works well for exploratory search and complex information seeking tasks. And instant results provide a direct channel from queries to answers.

Use auto-complete to:

  • Facilitate accurate and efficient data entry
  • Select from a finite list of names or symbols

Use auto-suggest to:

  • Facilitate novel query reformulations
  • Select from an open-ended list of terms or phrases
  • Encourage exploratory search (with a degree of complexity and mental effort that is appropriate to the task). Where appropriate, complement search suggestions with recent searches

Use instant results to:

  • Promote specific items or products
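The distinction above can be sketched in code: auto-complete matches a prefix against a closed list of known items, while auto-suggest returns related queries from an open-ended set (the tiny vocabularies here are made up for illustration, not any production API):

```python
# Auto-complete: prefix match against a finite list of names or symbols.
COUNTRIES = ["France", "Finland", "Germany", "Greece"]

def autocomplete(prefix):
    return [c for c in COUNTRIES if c.lower().startswith(prefix.lower())]

# Auto-suggest: open-ended related phrases, e.g. mined from query logs,
# that encourage reformulation and exploratory search.
RELATED_QUERIES = {
    "hotel": ["hotel deals", "hotel paris", "hotels near me"],
}

def autosuggest(term):
    return RELATED_QUERIES.get(term.lower(), [])

print(autocomplete("f"))       # ['France', 'Finland']
print(autosuggest("hotel"))    # ['hotel deals', 'hotel paris', 'hotels near me']
```

The key design difference is the source of candidates: a finite authority list for completion, versus an open-ended, behaviour-derived set for suggestion.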

17 May 2012 at 21:58 Leave a comment

Search Analytics for Content Strategists


Continue Reading 12 May 2012 at 18:28 Leave a comment

Analyse This

How do you know when your website works? When it comes to improving public services the answer may not always be obvious – but it is vital to set targets and measure against them. Dan Jellinek reports from ITU Live.

It is important for public bodies to know whether or not their websites are working, but managers sometimes focus on softer measures which are hard to interpret, such as ‘how many visits did we get?’, Peter Jordan, product analytics lead in the Government Digital Service (GDS) delivery team, told the ITU Live panel.

For an e-commerce website the hard measure is obvious – sales – but for public sector sites the equivalent is to look for tasks completed, Jordan said. “Like how many people started out on a transaction to renew car tax and completed it: that’s analogous to buying a pair of shoes.”
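Jordan's "hard measure" for the public sector reduces to a simple ratio — completed transactions over started transactions. A minimal sketch (the car-tax figures here are invented for illustration):

```python
# Task completion rate: the public-sector analogue of e-commerce conversion.
def completion_rate(started, completed):
    """Fraction of started transactions that were completed."""
    return completed / started if started else 0.0

# e.g. 8,000 car-tax renewals started online, 6,800 completed (made-up figures)
print(f"{completion_rate(8000, 6800):.0%}")  # 85%
```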

Nick Breeze, senior customer insight manager at the GDS, said the importance of online task completion is that otherwise “they may need to pick up the phone, which is very costly”. One method the GDS has used to measure task completion, alongside more established methods such as live user testing, is to issue tracking software to larger numbers of people, which can monitor their web use at home and report on what keystrokes they make, where they go and how long it takes to complete a task.

“One of the reasons we started to use it are the sheer cost efficiencies. We’re tracking 1,600 users, which is costing us a fraction of what it would face-to-face,” Breeze said. However, the most valuable insights come from combining different kinds of data, such as tying together what people say in online surveys with what they actually did, said Alex Henry, online segmentation and personalisation expert at Adobe UK. So if someone reports a poor experience with a form, and then you can see they spend 10 minutes on page 4 of that form, you know where the improvements are needed. Even more sophisticated data matching is needed to track the correlations between website use and call centre use, Henry said.

“You could just look at the raw data and say ‘Here are my call centre numbers, here’s the activity I’ve done online, now let’s compare in a few weeks’ time to see if my call centres have gone down’. Or you can go down to the level where you’re giving web users a tailored phone number, for example, once they get to a certain point in their journey, so that you can prep the call centre.

“So we have the same identifier in different data sources to say, ‘Here’s someone who is online, here’s how they interacted with the call centre, and here’s the summation of what happened’. We’re seeing a lot of effort at the moment now where clients are taking online behaviour, tying it in with online survey responses and then comparing that with call centre data to see what’s going wrong.”
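The approach Henry describes — one identifier shared across data sources — amounts to a join. A minimal sketch with hypothetical records (the field names and user ID are invented for illustration):

```python
# Joining web analytics, survey responses and call-centre records on a
# shared user identifier (all rows here are hypothetical).
web = {"u42": {"form_page4_minutes": 10}}
survey = {"u42": {"reported_experience": "poor"}}
calls = {"u42": {"called_helpline": True}}

def combined_view(user_id):
    """Merge whatever each source knows about this user into one record."""
    return {
        **web.get(user_id, {}),
        **survey.get(user_id, {}),
        **calls.get(user_id, {}),
    }

print(combined_view("u42"))
# A poor survey rating plus 10 minutes stuck on page 4 plus a helpline call
# pinpoints where the form needs improving.
```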

It is also important to combine and consider many sources of data when designing a website, said Jordan, including traffic data, the public body’s own business needs, and another hugely important source of information on what questions citizens want answered: search data. “In terms of web traffic about 60% comes from web search: Google, Bing, Yahoo and so on – by far the biggest portion. So making sure you’re optimised for search is really important.”

There is plenty of good free material on search engine optimisation out there provided by Google and others, Jordan said – “just search for it”. Basically it breaks down into a research phase and production phase. “So say you’re talking about mountains, you should research the terminology people are actually using – they might be using peaks rather than mountains – or you may see there is a big gap about the Pyrenees where everybody is writing about the Alps, and then build that terminology into what you’re writing about.” Other important factors include increasing your links with social media, he said.

Finally, the panel turned to the controversial subject of cookies – small ‘strings’ or pieces of information delivered by a website and saved locally onto a user’s machine to help determine when the same user is returning, and tailor the response accordingly. “There is a lot of misconception about what a cookie actually contains,” Henry said. “Take my mum, for example, who might browse a leading retailer’s website: she’ll then go back to her web-based email account and, on the right-hand side, she’ll see a very similar object to what she’s just browsed. And her interpretation of that is, ‘Oh they were watching me, I don’t like that’. “But actually, the advert’s always going to be there, are you happy for it to be more tailored to you, or would you just want something generic, and do you really mind? It’s still your choice whether you interact with that advert or not. Because it’s more relevant, people think the cookie’s reading everything they’re doing, which it absolutely isn’t.”
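Henry's point — that a cookie holds an opaque token rather than a record of everything you do — is easy to demonstrate with Python's standard library (the cookie name and token value here are made up):

```python
# A tracking cookie typically contains an opaque identifier, not browsing
# history; the behavioural data lives server-side, keyed by that identifier.
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["visitor_id"] = "a1b2c3d4"  # random opaque token, invented here

# This is the entire string delivered to the browser:
print(cookie.output())  # Set-Cookie: visitor_id=a1b2c3d4
```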

Cookies are useful for government sites because they add detail to information on customer journeys, Jordan said. “There’s stuff that analytics can tell you without cookies, but you do lose sight of that anonymised personal information. To me, the way forward is around education and transparency about what you’re doing with cookies.” During the live session, an online poll of viewers delivered some interesting results on whether or not public sector website owners have already taken action to implement the EU privacy directive, which stipulates that all UK organisations must obtain consent from their website users to place cookies on their computers by May 2012. Only around 8% of respondents said they already had new cookie policies, while 46% said they had not yet, but intended to in 2012; and the same number said they were waiting for further advice from the UK government before taking action.

So while all respondents accepted that some kind of action is bound to be needed, many are still waiting for clear guidance on cookies. But with just a few months to go to the compliance deadline there is not a lot of time to play with: cookies are now a hot topic.

http://www.ukauthority.com/Portals/0/ITU/JanFeb2012/ITU_JanFeb_2012_ITULive_Cookies_Citizens.pdf

12 February 2012 at 20:36 Leave a comment

Google: old algorithms not fit for purpose – now it’s social and structure

Just read two thoughtful articles by Peter Yared: Google already knows its search sucks (and is working to fix it) and

Why Google is ditching search. And also John Battelle on It’s not about search anymore, it’s about deals.

These articles look at deeper issues than Google integrating Google+ into search results, but Google+ is perhaps a wake-up call to what’s happening.

Battelle says, “…search is supposed to be about showing the best results to consumers based on objective (or at least defensible and understandable) parameters, parameters unrelated to the search engine itself.”

But as Yared points out, Google has always been a bit social: “if it helps, you can think of PageRank as a kind of paleo-social search – just one that moves way too slowly for the modern Web”. Search results are also increasingly gamed and commercialised – via paid search, content and link farms, and organisations’ investment in search engine optimisation.

So Google has been increasingly refocusing its results, reducing the value of signals from PageRank and links and pushing ‘regular’ results down and, often, below the fold.

(Credit: Peter Yared/CNET)

With predictive search – suggesting or pushing users towards certain queries – and greater prominence for answers, Google has the opportunity to reduce the long tail and monetise the search results page further. For organisations seeking to maintain their visibility in Google, there will need to be a move away from ‘traditional’ search engine optimisation. They will need to publish structured data that can seed ‘answers’, publish content that encourages engagement in social media by potential audiences and, of course, engage themselves.
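The "structured data that can seed answers" above usually means schema.org markup embedded in the page. A minimal sketch generating a JSON-LD snippet (the service name and URL are invented for illustration):

```python
# Emitting schema.org JSON-LD so search engines can lift structured facts
# into answer-style results; all values here are illustrative.
import json

snippet = {
    "@context": "https://schema.org",
    "@type": "GovernmentService",
    "name": "Renew vehicle tax",
    "url": "https://example.gov/renew-vehicle-tax",
}

# Embedded in the page head as a script tag:
print(f'<script type="application/ld+json">{json.dumps(snippet)}</script>')
```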

14 January 2012 at 20:46 Leave a comment

Cloud analytics from Amazon?

Interesting piece in the New York Times
Will Amazon Offer Analytics as a Service?

Amazon.com, through its Web services business, stores vast amounts of data for companies worldwide. Does it want to start analyzing it too?

Specialists in data science say the company has become increasingly interested in the business models of firms that make and sell pattern-finding algorithms for extremely large data sets. They theorize that Amazon wants to move beyond its cloud services business — which rents data storage and raw computing power — and add to these offerings analysis software that can be rented, and possibly modified, to suit a company’s needs.

“Amazon has the expertise and the computing power to do something like this,” says Kyle McNabb, a vice president at research firm Forrester. “They could rent an analytics engine to people on a quarterly basis, possibly offer to match your data to other large data sets and find something useful.”

A spokesman for Amazon had no comment on its plans, which he termed “rumor and speculation.”

It would not be difficult for Amazon to offer such a service, since many of the company’s major products are already on Amazon Web Services, and other legacy applications are being moved there. That means that data management tools like Map Reduce (currently a feature in Amazon Web Services), payment security and fraud detection software, and Amazon’s product recommendation engine could all be in the system.

While prices are dropping for the predictive and analytic software offered by the likes of SAS and EMC, the products are generally considered somewhat expensive. Amazon could remove the higher-value proprietary features from its software and sell a cheap simplified version, in the way Google created its Google Analytics Web site service in order to increase the attractiveness of its advertising-based business.

“They are particularly interested in fraud detection and product recommendation, which are proven valuable things,” says the chief executive of an analytics software company who requested anonymity because of his ongoing business discussions with Amazon. “They’re very interested in how they can grow this.”

Amazon Web Services has so far concentrated mostly on offering the raw materials of storage and computing to engineering teams, the so-called “back end” of computing. A move to typically higher-value “front end” products for people in areas like finance or marketing is likely, both for Amazon and others in the cloud computing business. Amazon has already gone from selling books to offering services whereby authors can publish their works directly on e-readers, operating Web sites for other companies, and buying outfits like the shoe retailer Zappos to sell a broader array of goods.

Given the large amounts of data, and the few people qualified to make sense of it, “someone will offer this as a service, maybe a lot of companies” said Mr. McNabb of Forrester. “Oracle has the assets to do it, but I’m not sure they are interested. I.B.M. has the assets, and Apple could if they wanted to, with their understanding of customer behavior. Amazon is a very good candidate to make it work.”

via bits.blogs.nytimes.com

5 January 2012 at 12:47 Leave a comment
