Ask the right questions: the tool categories
Now, powered by all that self-knowledge, you are ready to ask the right questions of the analytics company you’re interested in partnering with (hopefully that’s smartocto, but we will understand if you see another fit). We will give you a more detailed checklist later on, but the questions mostly fall into these main categories:
- Artificial Intelligence (AI)
The biggest development of late has to do with artificial intelligence - and we know this is something on everyone’s mind. All good tools should have something to say about the role AI will play, and should probably be rolling out these offerings already.
Which features are already AI-driven? These might include things like advanced headline testing, automatic content classification, smart story rewriting, timing optimisation or even intelligent virtual assistants.
What further AI-driven innovations are in the pipeline? What would the implications be? Maybe you’d like to be part of early-stage R&D for new products or features? Worth an ask, right?
If these questions draw a blank, think long and hard. AI will change the way everything works - and that doesn’t have to be for the worse.
- Company vision and culture
First of all, ask yourself: is the analytics company an organisational fit? Are you both funky, innovative, creative companies, or do you share a more serious outlook? Do you have the feeling that the people behind the tool understand your use cases and your mission? Do they come from your industry? Do they have the values and the do-good philosophy that speak to your heart? During implementation, troubleshooting or advice sessions you will be in close contact, and it is important that it just feels right.
Also pay attention to the mission of the product company. It is quite telling whether they are just optimising and selling the tool, or whether they have an ambitious roadmap. What really differentiates the various analytics companies in their approach, and do you like what you see?
- Metrics and data delivery
Of course, this is pretty basic stuff: what exactly do they measure? (Reach is just one of many metrics, as our CEO explains in this book review.) Which channel or story metrics will be displayed, and do you really need them? It’s important to avoid ‘black dog syndrome’ here: while you might think you’re both talking about the same animal, one of you could be thinking of a pitbull and the other of a poodle. You don’t want to discover this when it is too late. If it is crucial data, be as precise as possible in your definitions, and ask how the data is measured and collected.
If you know your metrics, it’s worthwhile spending some time on the actual delivery of data. This, in our experience, is where silly mistakes can be made.
How realtime is realtime? Does the data refresh every 10 seconds or every 5 minutes? Speed is good in essence, but it can lead to its own inaccuracies and increased data costs. Does the system have an API to get data into your data lake, and when are the daily reports actually ready to land in your mailbox? Can the data be automatically pushed into your internal communication channels or CMS? These things will all play an important part in your experience with the analytics tool.
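If API access is on your list, it can help to picture what such a pipeline looks like in practice. Below is a minimal sketch in Python of pulling a day of story metrics into a data lake; the endpoint, token, parameters and response shape are all assumptions for illustration, so check your vendor’s API documentation for the real ones.

```python
import json
from datetime import date, timedelta
from pathlib import Path

import requests

# Hypothetical sketch: pull yesterday's story metrics from an analytics API
# and land the raw JSON in a date-partitioned folder of your data lake.
# The base URL, token and response shape are placeholders, not a real vendor API.
API_BASE = "https://api.example-analytics.com/v1"
API_TOKEN = "YOUR_API_TOKEN"


def fetch_daily_story_metrics(day: date) -> list:
    """Fetch one day of story-level metrics from the (hypothetical) API."""
    response = requests.get(
        f"{API_BASE}/stories/metrics",
        params={"date": day.isoformat()},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["stories"]


def write_to_data_lake(day: date, records: list) -> None:
    """Write the raw records as newline-delimited JSON, partitioned by date."""
    path = Path(f"lake/raw/story_metrics/dt={day.isoformat()}/metrics.jsonl")
    path.parent.mkdir(parents=True, exist_ok=True)
    with path.open("w", encoding="utf-8") as fh:
        for record in records:
            fh.write(json.dumps(record) + "\n")


if __name__ == "__main__":
    yesterday = date.today() - timedelta(days=1)
    write_to_data_lake(yesterday, fetch_daily_story_metrics(yesterday))
```

In practice a job like this would be scheduled daily, and the ‘when are the reports really ready?’ question above tells you when it is safe to run it.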
- User interface, features and use cases
The way the data is served to users is the most underestimated aspect of analytics. While all the data is there, a boring or simply clumsy user interface can turn any newsroom off. Interfaces - especially if the whole newsroom uses the tool - should be tailored towards use cases and towards concrete decision making. Actionability is the key word here: graphs, numbers and lists should all point to action. In the right place, to the right person, at the right time.
Do you need data to be realtime, or rich and historical? Do you believe that notifications can make a big difference in activating storytellers, or in making predictions? It can be all kinds of things. But remember: the system should not only give you data or even insights, it should energise and empower your team to make things better.
- Customisation and growth model
Maybe you have seen a really nice analytics tool, but it turns out to be really static. Chances are it’s one-size-fits-all, with no way to align itself with your workflow, technology or culture. Think hard about which customisations are necessary, and whether the system allows you to be in the driver's seat. On that note, it can be a great asset if the tool is able to grow with your organisation: can it start with simple features and gradually build up to a more advanced model? Can you take your team by the hand and save budget at the same time that way?
- Security, privacy and safety measures
A SaaS solution can be funky, but it always needs to be safe. Ask about security measures, hashed data and compliance with the GDPR. Perhaps your data is not allowed to leave Europe? Data will be the lifeblood of your organisation, so don’t cut costs on this aspect.
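To make ‘hashed data’ a little more concrete: one common pattern is to pseudonymise user identifiers before they ever reach a third-party tool. The sketch below is a minimal, assumed example (the key handling and field names are ours, not any vendor’s requirement) and is no substitute for a proper GDPR assessment.

```python
import hashlib
import hmac

# Assumed illustration: pseudonymise user identifiers with a keyed hash
# (HMAC-SHA-256) so the analytics vendor never sees raw emails or IDs.
# In a real setup the key lives in a secrets vault, and you still need a
# data processing agreement, retention rules and a legal basis.
PSEUDONYMISATION_KEY = b"replace-with-a-secret-from-your-vault"


def pseudonymise(identifier: str) -> str:
    """Return a keyed, hex-encoded hash of a user identifier."""
    normalised = identifier.strip().lower().encode("utf-8")
    return hmac.new(PSEUDONYMISATION_KEY, normalised, hashlib.sha256).hexdigest()


# The event sent to the analytics tool carries only the pseudonym.
event = {
    "user": pseudonymise("reader@example.com"),
    "story_id": "article-12345",
    "action": "read",
}
print(event)
```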
And also: if there’s ever an issue, an outage, or a crash, will the analytics company make the effort to recover the ‘lost’ data? The answer can be quite telling.
- Implementation and onboarding
Media companies and marketing companies usually ask a lot about the implementation effort. They are often short on time and understaffed, and they don’t want to depend too much on the goodwill of their tech departments. The technical implementation shouldn’t be too complicated. It’s totally understandable that the code speaks first, but don’t forget to ask questions about people too. For example: do you get plenty of implementation assistance, and are user training sessions included? You shouldn’t feel like you’re being left to get on with it alone.
To some of you it might sound counterintuitive, but in the end, story data is all about newsrooms, strategy and daily tactics. It is the business of people. And it can make all the difference if you know that you can count on each other. Is there extra advice or data science available to make a more complex diagnosis of your performance? Can you call them in the middle of the night if something is really off? Do they have e-learning? We tend to think you can judge all this by the number of questions the analytics company asks you in order to really deliver that value.
Beyond these basic factors, some other benefits can be at play. Maybe the provider speaks your specific language, or lets you cooperate with another strategic client of theirs. Sometimes it might be possible to be part of a lab setting, where you can try things out and give more feedback in return for a lower price.
And if something you want isn’t there yet, it might already be on the roadmap, so ask if it’s part of the ideas they are working on - and if it isn’t, see if they’ll consider adding it! Make a list of your need-to-haves and nice-to-haves and compare that to their roadmap. Feel free to ask what's in store for you. It might be an estimate, but in any case it will give you an interesting insight into the future of analytics.
- Budget and price model
This isn’t just about what your budget will allow; it’s also about the price model in general. Do you pay a flat-fee licence price, or do the costs go up as you grow? What happens if your organisation wants to measure more brands or channels? Are additional users automatically welcome? That could be a vital point if you are trying to include the whole newsroom or marketing department in the data strategy.
After you’ve decided which tool is the right one for you, you may, like a lot of digital publishers, want to insist on a (paid) pilot, or at least a trial period in the contract. Analytics companies that are confident about their product won’t have any objections to that, and it can be a good way to minimise some of the adoption risks, whether technical or cultural. But be mindful of the effort and money that goes into this. A couple of tips to be as efficient and effective as possible:
- Be precise about what you are testing. A pilot is usually there to take away any last doubts, but what exactly are those doubts? Are you able to define success KPIs for any trial that you request?
- If you are still unsure whether what you are testing will lead to a licence, it can be wise to implement the easiest setup that will give you the necessary results, and adapt or refine it later into a more ambitious version. That saves time and money for both sides.
- Don’t hook up every possible user for a pilot, because they can be grumpy if you decide not to go ahead with it. Involve the key users for the most important use cases and make sure they really make time for it. The most disappointing pilot is always the one that simply doesn’t get enough attention. Done well, a pilot always pays off, because you learn something in any case.
- Oh, and be sure to arrange good, solid contracts for the pilot as well. It is still confidential data, and you need ownership of it. If you are fairly sure about taking on a continuing licence, consider signing a contract with a trial period built into it. That saves everybody from doing the implementation and legal work twice.
At the end of the trial, consider the licence period as well. You don’t want to be locked in forever to something you don’t like, but a longer period will be cheaper and will save you from reconsidering your analytics every year. Trust us, you don’t want that: it will be a hassle, and the historical data will make less sense. On top of that, your editorial team needs to embrace the data tooling you choose and include it in their daily workflow.