
Apples and pears.

Comparative graphs are useful for highlighting areas of improvement, but they also need to be meaningful and logical to justify your argument. Analytics guru Avinash Kaushik gives a clear example in which an analyst compares four segments: Search Traffic, Referral, Mobile and Social Media. Three of these segments describe the traffic source while Mobile describes the type of device used; the segments are wrongly calibrated. Visitors arriving from those traffic sources could well have been using their smartphones, and bang: they are counted twice! Mobile visits need to be compared against other device-specific visits, such as desktop and tablet.
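The double counting is easy to demonstrate with a few lines of code. The sketch below uses entirely made-up visit records (the field names and figures are ours, not from any analytics tool) to show how mixing a device segment in with source segments inflates the totals:

```python
# Hypothetical visit records: each visit has one traffic source and one device.
visits = [
    {"id": 1, "source": "search",   "device": "mobile"},
    {"id": 2, "source": "referral", "device": "desktop"},
    {"id": 3, "source": "social",   "device": "mobile"},
    {"id": 4, "source": "search",   "device": "desktop"},
]

# Wrongly calibrated segmentation: three source segments plus a device segment.
mixed = {
    "Search":   [v for v in visits if v["source"] == "search"],
    "Referral": [v for v in visits if v["source"] == "referral"],
    "Social":   [v for v in visits if v["source"] == "social"],
    "Mobile":   [v for v in visits if v["device"] == "mobile"],
}
counted = sum(len(seg) for seg in mixed.values())
# Visits 1 and 3 appear in both a source segment and the Mobile segment,
# so four visits produce six segment entries.
print(counted, "segment entries for", len(visits), "visits")

# Consistent segmentation: compare devices against devices only.
by_device = {}
for v in visits:
    by_device.setdefault(v["device"], []).append(v)
print({device: len(vs) for device, vs in by_device.items()})
```

With the consistent device split, every visit is counted exactly once.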

Unnecessary overemphasis.

There are always some fluctuations in data, but there is no need to panic as long as you know the difference between a fluctuation and a drop. One of our favourite examples is the weekly SEO ranking report run by a local marketing agency. Every week they would ask us, in great anxiety, to review the fluctuations in their data. First, let's set the scene: a couple of new items were added to the website each month, after a long approval process with the client. As a result the website was slow to address new market trends, which caused genuine drops in ranking whenever a competitor addressed those trends first. The weekly emphasis on every menial change damaged the client's trust and led to even fewer items being published on the website, which had a negative impact on the overall ranking.
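One simple way to separate weekly noise from a genuine drop is to look at a moving average rather than week-on-week changes. A minimal sketch, using made-up weekly ranking positions (lower is better):

```python
# Hypothetical weekly ranking positions (lower is better). The first weeks
# bounce between 3 and 5 (fluctuation); the last weeks slide steadily
# downwards (a genuine drop).
ranks = [3, 4, 3, 5, 3, 4, 3, 7, 8, 9, 10, 11]

def moving_average(series, window=4):
    """Average each value with the preceding window-1 values."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

smoothed = moving_average(ranks)
print(smoothed)
# The smoothed line stays flat at 3.75 through the noisy weeks,
# then climbs towards 9.5 once the decline is real.
```

The early weekly swings vanish in the smoothed series, while the later decline remains clearly visible, which is exactly the distinction the anxious weekly review failed to make.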


Stating the obvious.

Reports are cool. They show you exactly where you are and where you've been. But don't stop at collecting data: reports need to drive business decisions.
An online fashion retailer based in the West Midlands runs weekly reports on its European markets. Every week the teams select some of their data to share with the board. The figures are read out rather than analysed – 'This is higher than last week, that dropped, that one stayed the same' – and then the meeting is concluded. This common practice is a waste of time. Reports need to highlight possible reasons for the situation and suggestions for the future. Your audience can read the figures just as well as you can; you need to deliver the intelligence behind the numbers.


Lack of experimentation.

Let's say you've collected your data and run your report. You've identified some reasons for your drop in traffic, your slow acquisition results, your low ranking score, and so on. Excellent! Now test, test, test! Without experimentation there is no way to know for sure what works and what doesn't for your target audience, or to put yourself in a position to improve. Luckily for you, Google Analytics has just the right tool to run insightful A/B experiments.
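When reading the results of an A/B experiment, a quick sanity check is a two-proportion z-test on the conversion rates. The sketch below is our own illustration with hypothetical figures, not part of any analytics tool:

```python
import math

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing variant B's conversion rate to A's."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                              # z-score of the uplift

# Hypothetical results: variant B converted 60 of 1,000 visitors,
# variant A converted 45 of 1,000.
z = ab_z_test(45, 1000, 60, 1000)
print(f"z = {z:.2f}; significant at 95%: {abs(z) > 1.96}")
```

With these numbers the uplift looks impressive (6.0% vs 4.5%) yet the z-score falls short of the 1.96 threshold for 95% confidence, which is precisely why you keep testing instead of declaring a winner early.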

You want it now.

Success takes time. It can be frustrating to prepare a new campaign and see few positive results in the first couple of days, or even weeks. But let's be completely honest here: it is a broad market, your competitors are also active, and your target audience needs time and convincing arguments before they choose you. You won't notice an impact overnight.


Insignificant data.

Remember when we said reports are cool? They are, but only if you use relevant and significant data. You don't need to know everything.
While we were consulting at a marketing agency in 2014, a chain of solicitors' firms wanted to boost its marketing activities. It ordered a full set of data reports: web metrics, market profiling, social sentiment, market share, keywords, and more. The reports were left untouched for months, until the company and the agency asked us to connect the dots. In short, choose your data wisely to support the decisions you need to take.

You lose focus.

The devil is in the detail, and that's exactly why you can collect all sorts of data. But success lies in the overall experience, as Print Print, a small printing company we met while consulting at Google's Digital Garage, discovered recently. Digitally savvy, they read the latest trends in SEO and web analytics to optimise the visibility of their blog. They understood the importance of original content. They knew by heart how many views each blog post received and which social media network delivered readership. Yet they didn't manage to reach the conversions they wanted: the overactive content creation didn't deliver leads. They had simply forgotten to step back and look at their overall market: the competitors, their offers, brand perceptions, market trends and user experiences. Sometimes you simply can't see the wood for the trees.