
12.10.2020 • Berlin

Improve the customer experience with personalisation – how it works!

Group of people looking at a screen

User experience, brand loyalty, conversions: personalisation brings many benefits for companies. What do they need to be aware of when introducing technologies? And which marketing automation methods should they consider?

By Rainer Blumenthal

 

Personalisation as an efficiency booster for marketing 

Personalisation has become indispensable in online marketing. Personalised advertising along the entire customer journey allows companies to address users automatically at the right time and the right touchpoint with relevant, customised content. Among other things, this targeted communication ensures a better user experience, stronger brand loyalty and higher conversion rates.

When putting together a personalised campaign, the important thing is to make its output measurable. To prove to stakeholders that the planned investments actually pay off, the business value has to be calculated. This means measuring, evaluating and continually optimising the performance of personalised, data-driven marketing campaigns against predefined objectives and the key performance indicators (KPIs) linked to them. Modern technologies are required for this purpose.

Many companies, however, find it hard to establish the appropriate technologies and to define the right objectives and KPIs; not infrequently, suitable customer groups have not been defined and no content with real added value has been created. Holistic technological solutions for personalisation come with their own challenges, of course – precisely because they are relatively complex. But they are indispensable for anyone who wants to remain competitive in the future.

Companies are faced with questions like: Do we only deliver personalised advertising to users we know? Or do we also target the large number of users for whom we have no login information? Do we do both? How much effort would each of these scenarios involve? Does it make economic sense to purchase third-party browsing data? Should we try to build an in-house engine or AI that assigns users to specific segments based on click and tracking data?

The following sections show how companies can approach the subject step by step.

 

Customer analysis: implicit vs explicit personalisation 

Users and their preferences form the basis of any personalised content. User preferences can be determined in two ways: implicit personalisation and explicit personalisation. Explicit personalisation means that the user is authenticated via a login, i.e. they have registered beforehand. Information such as product preferences, interests, age and place of residence is therefore available. As long as the majority of users are logged in while they use the site or app, this is the ideal basis for delivering customised content: the core data are reliable, the results are measurable and the data can – provided the legal requirements around data protection are met – be used immediately for segmentation and delivery.

With implicit personalisation, the segment to which a user belongs is predicted – or, more bluntly, guessed. Cookies are the only way to recognise the user, and established tracking providers offer technically sophisticated options here. The problem: if a user deletes their cookies, they are classified as ‘new’ on their next visit. In addition, security policies complicate cross-domain cookie evaluation. Data protection law should also be examined early so that the approach does not head in the wrong direction. Many tracking and analytics providers now classify users into segments such as ‘hi-fi lovers’, ‘sales-oriented’ or similar – here it is essential to check whether the use of these predefined groups is legally permissible in the respective country and is permitted by the provider at all.
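
To make the distinction concrete, a minimal TypeScript sketch of the two data situations follows; all field names are illustrative assumptions rather than a fixed schema.

```typescript
// Explicit personalisation: the user is authenticated, core data comes from registration.
interface ExplicitProfile {
  userId: string;
  age?: number;
  residence?: string;
  interests: string[];                                   // stated preferences: reliable and measurable
  consent: { segmentation: boolean; tracking: boolean }; // data-protection flags checked before any use
}

// Implicit personalisation: only a cookie exists, the segment is predicted or guessed.
interface ImplicitProfile {
  cookieId: string;           // lost when the user deletes cookies -> treated as 'new' on the next visit
  predictedSegment?: string;  // e.g. 'hi-fi lovers', supplied by a tracking provider
  confidence?: number;        // a prediction, not a fact
}
```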

 

Alternative routes to an in-house user database 

As soon as a user is registered and interacts with the website or app (referred to below as the ‘ecosystem’), companies can use these data as a valid database. If the majority of users move through the ecosystem without being logged in, companies can, for example, purchase data from third-party suppliers: there are providers who place cookies on major news portals in each country and classify users according to their news preferences. These data can either be integrated seamlessly into the analytics system, or the supplier conducts the analyses and provides the results. It is also worth considering a data management platform (DMP) or a customer data platform (CDP); these solutions aggregate customer data and put them into a usable form.
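
A CDP-style aggregation can be pictured roughly as follows; the sources, mappings and field names are illustrative assumptions, not the API of any specific platform.

```typescript
// Rough sketch: merge first-party events with purchased third-party segment labels
// into one usable profile per user (all names are illustrative).
interface FirstPartyEvent { userId: string; type: 'click' | 'order' | 'pageview'; topic: string; }
interface ThirdPartySegment { cookieId: string; segment: string; }

interface UnifiedProfile {
  userId: string;
  topics: Record<string, number>;   // interaction counts per topic
  externalSegments: string[];       // labels purchased from third-party suppliers
}

function aggregate(
  events: FirstPartyEvent[],
  purchased: ThirdPartySegment[],
  cookieToUser: Record<string, string> // mapping established e.g. after a login
): Record<string, UnifiedProfile> {
  const profiles: Record<string, UnifiedProfile> = {};
  const get = (userId: string) =>
    (profiles[userId] ??= { userId, topics: {}, externalSegments: [] });

  for (const e of events) {
    const p = get(e.userId);
    p.topics[e.topic] = (p.topics[e.topic] ?? 0) + 1;
  }
  for (const s of purchased) {
    const userId = cookieToUser[s.cookieId];
    if (userId) get(userId).externalSegments.push(s.segment);
  }
  return profiles;
}
```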

 

Defining target groups for delivering personalised advertising

Once the data have been collected, how should the target audience be sensibly subdivided? One possibility is a demographic breakdown – by age or gender, for example. Segmentation by profession can also be useful. Alternatively, the audience can be subdivided by purchasing behaviour and affinities, such as ‘discount lovers’ or ‘people who often enter competitions’. What fits depends on the respective product or offer. The basic principle here should be: everything is a prototype. Whether the right segments have been chosen will become clear, at the latest, when the measurable objectives are reviewed.
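
One possible way to capture such prototype segments in code is sketched below – the rules and thresholds are purely illustrative assumptions and would be adjusted (or discarded) based on the measured results.

```typescript
// Prototype segment definitions: start simple, measure, then refine or discard.
interface UserAttributes {
  age?: number;
  gender?: string;
  profession?: string;
  discountOrders?: number;
  competitionEntries?: number;
}

interface SegmentDefinition {
  id: string;
  description: string;
  matches: (u: UserAttributes) => boolean;
}

const segments: SegmentDefinition[] = [
  { id: 'young-adults', description: 'Demographic: aged 18-29',
    matches: (u) => (u.age ?? 0) >= 18 && (u.age ?? 0) <= 29 },
  { id: 'discount-lover', description: 'Frequently buys discounted items',
    matches: (u) => (u.discountOrders ?? 0) >= 3 },
  { id: 'competition-fan', description: 'Often enters competitions',
    matches: (u) => (u.competitionEntries ?? 0) >= 2 },
];

// A user can match several segments; which ones matter is decided by the measurable objectives.
const matchedSegments = (u: UserAttributes) => segments.filter((s) => s.matches(u)).map((s) => s.id);
```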

 

Personalisation requires quantifiable objectives and KPIs 

Before delivering personalised advertising, companies should conduct a comprehensive analysis of user behaviour on the platform. Useful values include the bounce rate, dwell time, lead generation to other platforms and the click rate. Only when this baseline has been recorded can a direct comparison be drawn – and can it be checked whether a user has really seen relevant content, followed the desired click paths as a result and thus increased the conversion rate. Ongoing, data-based improvement of the marketing automation process is only possible if a strong emphasis is placed on measurability from the start.
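
A minimal sketch of this baseline comparison might look as follows; the metric names are common examples, not a prescribed set.

```typescript
// Record the baseline before launch, then compare it with the personalised variant.
interface KpiSnapshot {
  bounceRate: number;       // percent
  avgDwellTimeSec: number;  // seconds
  clickRate: number;        // percent
  conversionRate: number;   // percent
}

function uplift(baseline: KpiSnapshot, personalised: KpiSnapshot) {
  return {
    bounceRate: personalised.bounceRate - baseline.bounceRate,               // lower is better
    avgDwellTimeSec: personalised.avgDwellTimeSec - baseline.avgDwellTimeSec,
    clickRate: personalised.clickRate - baseline.clickRate,
    conversionRate: personalised.conversionRate - baseline.conversionRate,
  };
}
```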

 

Scoring model for tailored targeting 

Depending on the size of the company and the business impact, it can make sense to use in-house tracking data or to merge these with in-house CRM data. Previous orders, for example, can be correlated with a user’s behaviour on the website – this can generate new insights that feed directly into a scoring model. For example: a user who has ordered nappies three times receives 0.1 points per order in the ‘family’ segment, i.e. 0.3 points in total. If they have also subscribed to the family newsletter (+0.2) and visit articles about ‘family holidays’ (+0.2), they receive points for this as well. The sum of all scores then results in an affinity of 0.7 for the ‘family’ segment. When content is displayed for the user, it can be checked whether ‘family’ has the highest points total among the available segments and whether content on the subject of ‘family’ should therefore be delivered – or whether, for example, the ‘sport’ segment shows an affinity of 0.9. In that case, the sports content would be shown.
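
The worked example translates directly into a small scoring routine. The weights and signal names below are taken from the example and are, of course, assumptions that would be tuned per business case.

```typescript
// Minimal sketch of the scoring model described above.
type Segment = 'family' | 'sport' | 'discount';

interface UserSignals {
  ordersByCategory: Partial<Record<Segment, number>>; // e.g. nappy orders counted under 'family'
  newsletterSubscriptions: Segment[];
  visitedTopics: Segment[];
}

const ORDER_WEIGHT = 0.1;       // per order in a segment
const NEWSLETTER_WEIGHT = 0.2;  // per subscribed newsletter
const TOPIC_WEIGHT = 0.2;       // per visited topic

function scoreSegments(signals: UserSignals): Record<Segment, number> {
  const scores: Record<Segment, number> = { family: 0, sport: 0, discount: 0 };

  for (const [segment, count] of Object.entries(signals.ordersByCategory)) {
    scores[segment as Segment] += (count ?? 0) * ORDER_WEIGHT;
  }
  for (const segment of signals.newsletterSubscriptions) {
    scores[segment] += NEWSLETTER_WEIGHT;
  }
  for (const segment of signals.visitedTopics) {
    scores[segment] += TOPIC_WEIGHT;
  }
  return scores;
}

// Pick the segment with the highest affinity to decide which content to deliver.
function topSegment(scores: Record<Segment, number>): Segment {
  return (Object.keys(scores) as Segment[]).reduce((best, s) =>
    scores[s] > scores[best] ? s : best
  );
}

// Worked example from the text: three nappy orders, family newsletter, family-holiday articles.
const scores = scoreSegments({
  ordersByCategory: { family: 3 },
  newsletterSubscriptions: ['family'],
  visitedTopics: ['family'],
});
console.log(scores, topSegment(scores)); // family affinity ≈ 0.7 (0.1 × 3 + 0.2 + 0.2) -> 'family'
```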

 

A dynamic approach to elegant personalisation

As soon as the technical database is ready to use, the company should ensure that access to it is as efficient as possible – after all, content ultimately has to be exchanged dynamically during marketing automation. This can happen on the server, in the client’s browser or in the app: either the personalised content for the user is already incorporated into the server’s response, or it is loaded subsequently via JavaScript, React or similar. As a content management system, Magnolia CMS, among others, is superbly suited here because standard and headless rendering can be combined with it as desired. A common practice is to deliver standard content first and then exchange it for customised content. With clever placement and transitions, the whole thing should feel as fluid as possible for the user.
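
On the client side, the exchange can look roughly like this: a minimal React sketch that renders the standard teaser first and swaps in personalised content once it arrives, keeping the transition fluid. The /api/personalised-teaser endpoint is hypothetical.

```typescript
// Minimal client-side sketch (hypothetical endpoint /api/personalised-teaser).
import { useEffect, useState } from 'react';

interface Teaser { headline: string; imageUrl: string; link: string; }

export function PersonalisedTeaser({ fallback }: { fallback: Teaser }) {
  const [teaser, setTeaser] = useState<Teaser>(fallback);

  useEffect(() => {
    fetch('/api/personalised-teaser', { credentials: 'include' })
      .then((res) => (res.ok ? res.json() : fallback))
      .then((data: Teaser) => setTeaser(data))
      .catch(() => { /* keep the standard teaser if personalisation fails */ });
  }, [fallback]);

  return (
    <a href={teaser.link}>
      <img src={teaser.imageUrl} alt={teaser.headline} />
      <h3>{teaser.headline}</h3>
    </a>
  );
}
```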

 

Ad server approach vs. customised content from the content hub 

Product suites are becoming ever more extensive and therefore more challenging to set up; customisation now requires experts who know the tools in use very well. In most cases, the variant that is cheaper for companies in the long run is to build an in-house content hub, store the desired content there and index it with the corresponding metadata – in other words, store the segments and deliver this content throughout the ecosystem. A component is also required that implements the business logic and determines which content best suits the user currently making the request. This logic should be neatly encapsulated behind an API, not least because this makes it easier to involve third-party systems. The personalised content from the content hub can then be delivered in a number of ways: headless via a REST API, for example, in which case visual responsibility lies with the consuming party (e.g. an Android app that retrieves personalised content for the current user). If the content is to be shown on several different websites, a lightweight React app is a suitable approach for integration into predefined teaser placements.
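
The decision logic behind such an API could be encapsulated roughly as follows; the interfaces and names are illustrative assumptions, not those of a specific product.

```typescript
// Sketch: the content hub indexes items with segment metadata; the business logic
// picks the content matching the requesting user's strongest segment.
interface ContentItem { id: string; segments: string[]; payload: unknown; }

interface ContentHub {
  findBySegment(segment: string): Promise<ContentItem[]>;
}

interface AffinityStore {
  // e.g. backed by the scoring model sketched above
  segmentScores(userId: string): Promise<Record<string, number>>;
}

export async function personalisedContent(
  userId: string,
  hub: ContentHub,
  affinities: AffinityStore
): Promise<ContentItem[]> {
  const scores = await affinities.segmentScores(userId);
  const best = Object.keys(scores).sort((a, b) => scores[b] - scores[a])[0];
  // Fall back to standard content if no affinity data exists yet.
  return best ? hub.findBySegment(best) : hub.findBySegment('default');
}
```

Exposed via a REST route, the same function can serve an Android app, a React teaser app or any other consumer headlessly.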

One benefit of a customised solution is that the company saves annual (request-based) licence costs; the initial outlay, however, is considerably higher than for a finished product. A healthy mix of purchased tools and in-house development is also advisable: it is best not to start by implementing tracking yourself or cobbling together a makeshift CRM out of necessity. Those who can afford it can of course fall back on fully integrated enterprise suite solutions.

 

Automation – for editorial work too

Understandably, content must be produced individually for each customer segment. Text, images, videos and so on are stored centrally in a suitable system and furnished with metadata. From there, they should be delivered in as automated and cross-platform a manner as possible. For capturing, managing and distributing this content, we at the IBM iX Studio in Berlin like to use Magnolia CMS – and do so frequently.

When it comes to usability and the management of structured data, including delivery, Magnolia is hard to beat in terms of set-up effort. Manually maintaining individual teasers in the CMS, on the other hand, has proven less advisable. To minimise long-term effort, there has to be trust in the personalisation engine – manual interventions in the logic should therefore remain the rare exception. It is advisable, however, to check from the outset that the results are useful.
All channels – app, web, etc. – should converge in a shared tracking tool. Companies should configure corresponding reports and have them sent automatically every week. This gives them a simple means of measuring success.

 

Conclusion and tips for getting started with personalisation

Start lean, as we say these days. Define a simple minimum viable product (MVP) that cuts a thin end-to-end path through all the systems involved, and check whether the solution you have come up with works well.

Consider your existing system infrastructure and look for opportunities to reduce complexity and minimise dependencies. Decide from the start which channels you want to deliver personalised advertising on – and never lose sight of the measurability of the results. It will quickly become apparent which central elements are missing and which fit well.

Do not create monoliths; crucial services should be made accessible and encapsulated in sensible units. Only then will the system remain future-proof and open to further expansion. Any company that invests well in customer needs, offers an optimal user experience and succeeds in delivering relevant, personalised content has the best prospects of success!

 

 

About Rainer Blumenthal

Rainer Blumenthal is Associate Director in the area of technical project management and software development. In this role, he manages large accounts and leads technical teams in the technology department. In Java software development, his core competencies lie in the digital experience platform (ECMS/DXP) area, with a specialisation in Magnolia CMS.
