What Customers Want from Business Transformation Solutions

No matter which industry you work in, customer satisfaction comes first, because without customers there can be no business. Customer satisfaction is therefore a top priority for many companies, and countless studies, articles, and reports revolve around the question of what customers expect and how companies can meet those expectations.




From a process management perspective, there is already an established technique for understanding how customers interact with your company: Customer Journey Mapping (CJM). With Customer Journey Mapping, you can trace exactly how customers interact with your company and what their experiences are along the way. It helps answer questions such as:

  • Do customers have a positive or negative feeling when they come into contact with certain processes in your company?
  • Are there points where customers get stuck, move on, or want more information?
  • How do customers actually respond to your customer service options?

Beyond answering these questions internally, however, there is an even more important and at the same time very simple way to improve customer satisfaction and loyalty: just ask your customers!

 

What business transformation customers want

Thanks to technology, companies can survey their customers directly about products and services more easily than ever before. However, there is a risk of contacting customers too often and achieving exactly the opposite of what was intended. In addition, restrictions on collecting and using customer data can make actually reaching out to customers a challenge.

One way to overcome these hurdles is to use online technology review services. These sites offer a wealth of information about what customers value across a wide range of industries. Signavio, for example, uses IT Central Station to track customer reviews of business transformation software. Looking at these reviews as a whole, two themes come up again and again: collaboration and ease of use.

This is also reflected in user comments:

  • "From my point of view, the Collaboration Hub definitely offers the most valuable features. More and more users are adopting it and becoming familiar with it."
  • "In my experience, one of the best features of Signavio is the Collaboration Hub, which gives users from different departments constant access to their TO-BE process design."
  • "When we were looking for solutions, ease of use was one of the most important criteria. Ease of use had a major influence on adoption in our organization. If employees had struggled with the solution, they would not have used it. I would say ease of use is a pretty important factor when choosing a solution."
  • "One of the most important characteristics of the solution is its ease of use. A really good investment. Employees want tools they can use easily and immediately."
  • "The interface is very intuitive. I model a lot of processes, and with this tool it is very easy for me."

One final tip

To meet your customers' needs and build a lasting customer relationship, you have to understand your customers. And, as so often, feelings play a big role here too.

The same applies to business transformation, as the lead business analyst of a media company with more than 10,000 employees pointed out: "You have a feeling for what you want to do, and then you look at the available tools and can make your decision all the more easily."

Ready to choose the right business transformation solution? Then register for a free 30-day trial with Signavio today.


Multi-touch attribution: A data-driven approach

This is the first article of the series "Getting started with the top eCommerce use cases."

What is Multi-touch attribution?

Customers' shopping behavior has changed drastically when it comes to online shopping: nowadays, customers like to do thorough market research on a product before making a purchase. This makes it really hard for marketers to correctly determine the contribution of each marketing channel a customer was exposed to. The path a customer takes from the first search to the purchase is known as a customer journey, and this path consists of multiple marketing channels or touchpoints. It is therefore highly important to distribute the budget between these channels in a way that maximizes return. This is known as the multi-touch attribution problem, and the right attribution model helps to steer the marketing budget efficiently. The multi-touch attribution problem is well known among marketers. You might be thinking that if this is a well-known problem, then there must be an algorithm out there to deal with it. Well, there are some traditional models, but every model has its own limitations, which will be discussed in the next section.

Traditional attribution models

Most eCommerce companies have a performance marketing department to make sure that the marketing budget is spent in an agile way. There are several heuristic attribution models built into Google Analytics; however, each of them has its issues. These models are:

First touch attribution model

100% of the credit is given to the first channel, as it is considered that the first marketing channel was responsible for the purchase.

Figure 1: First touch attribution model

Last touch attribution model

100% of the credit is given to the last channel, as it is considered that the last marketing channel was responsible for the purchase.

Figure 2: Last touch attribution model

Linear-touch attribution model

In this attribution model, equal credit is given to all the marketing channels present in the customer journey, as it is considered that each channel is equally responsible for the purchase.

Figure 3: Linear attribution model

U-shaped or Bath tub attribution model

This is the most common model in eCommerce companies: it assigns 40% of the credit each to the first and last touch, and the remaining 20% is distributed equally among the rest.

Figure 4: Bathtub or U-shape attribution model
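
The four heuristics above are simple enough to sketch in a few lines of code. The following is a minimal, illustrative Python sketch (the journey and channel names are invented) of how first-touch, last-touch, linear, and U-shaped credit could be assigned to a single customer journey:

    # Minimal sketch of the four heuristic attribution models for one journey.
    # The journey and channel names below are illustrative, not real data.

    def first_touch(journey):
        return {journey[0]: 1.0}

    def last_touch(journey):
        return {journey[-1]: 1.0}

    def linear(journey):
        share = 1.0 / len(journey)
        credit = {}
        for channel in journey:
            credit[channel] = credit.get(channel, 0.0) + share
        return credit

    def u_shaped(journey):
        # 40% each to the first and last touch, remaining 20% spread over the middle.
        if len(journey) <= 2:
            return linear(journey)
        credit = {journey[0]: 0.4}
        credit[journey[-1]] = credit.get(journey[-1], 0.0) + 0.4
        middle_share = 0.2 / (len(journey) - 2)
        for channel in journey[1:-1]:
            credit[channel] = credit.get(channel, 0.0) + middle_share
        return credit

    journey = ["Facebook", "Email", "SEA", "Direct"]
    print(first_touch(journey))  # {'Facebook': 1.0}
    print(u_shaped(journey))     # {'Facebook': 0.4, 'Direct': 0.4, 'Email': 0.1, 'SEA': 0.1}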

Data driven attribution models

Traditional attribution models follow a somewhat naive approach to assigning credit to one or all of the marketing channels involved, and it is not easy for every company to simply take one of these models and implement it. There are a lot of challenges that come with the multi-touch attribution problem, such as customer journey duration, overestimation of branded channels, vouchers, cross-platform issues, etc.

Switching from traditional models to data-driven models gives us more flexibility and more insights, as the major part here is defining rules to prepare the data so that it fits your business. These rules can be defined by performing an ad hoc analysis of customer journeys. In the next section, I will discuss the Markov chain concept as an attribution model.

Markov chains

The Markov chain concept revolves around probability. For the attribution problem, every customer journey can be seen as a chain (a sequence of marketing channels), from which a Markov graph can be computed, as illustrated in Figure 5. Every channel is represented as a vertex, and the edges represent the probability of hopping from one channel to another. A separate, more detailed article will explain the concepts behind different data-driven attribution models and how to apply them.

Figure 5: Markov chain example
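
As a rough illustration of how such a Markov graph could be derived, the sketch below counts channel-to-channel transitions over a handful of invented journeys and normalizes them into probabilities; the special "start", "conv", and "null" states mark the beginning of a journey, a conversion, and a drop-out:

    from collections import defaultdict

    # Toy journeys (invented); "conv" marks a conversion, "null" a drop-out.
    journeys = [
        ["start", "Facebook", "Email", "conv"],
        ["start", "Facebook", "null"],
        ["start", "SEA", "Email", "conv"],
        ["start", "SEA", "Email", "null"],
    ]

    # Count transitions between channels and normalize them into probabilities.
    counts = defaultdict(lambda: defaultdict(int))
    for path in journeys:
        for current, nxt in zip(path, path[1:]):
            counts[current][nxt] += 1

    transition_probabilities = {
        channel: {nxt: n / sum(followers.values()) for nxt, n in followers.items()}
        for channel, followers in counts.items()
    }

    print(transition_probabilities["Email"])  # {'conv': 0.67, 'null': 0.33} (approximately)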

Challenges during the Implementation

Transitioning from a traditional attribution model to a data-driven one may sound exciting, but the implementation is rather challenging, as there are several issues which cannot be resolved just by changing the type of model. Before implementation, marketers should perform a customer journey analysis to gain some insights about their customers and try to find out or perform the following:

  1. The length of the customer journey.
  2. On average, how many branded and non-branded channels (distinct and non-distinct) appear in a typical customer journey?
  3. The most common upper-funnel and lower-funnel channels.
  4. Voucher analysis: within branded and non-branded channels.

When you are done with the analysis and able to answer all of the above questions, the next step is to define rules for handling the user data according to your business needs. Some of the issues that come up during implementation are discussed below, along with their solutions.

Customer journey duration

Assuming that you are a retailer, let's try to understand this issue with an example. In May 2016, your company started a Facebook advertising campaign for a particular product category which "attracted" a lot of customers, including Chris. He saw your Facebook ad while working in the office and clicked on it, which took him to your website. As soon as he registered on your website, his boss called him (probably because he was on Facebook while working), so he closed everything and went to the meeting. After coming back, he started working and completely forgot about your ad and products. A few days later, he received an email with some offers for your products, which he also ignored, until he saw an ad again on TV in January 2019 (almost three years later). At this point, he started researching your products and finally bought one of them through an Instagram campaign. It took Chris almost three years to make his first purchase.

Figure 6: Chris's journey

Now, take a minute and think: if you analyze the entire journey of customers like Chris, you will realize that you are still assigning some of the credit to touchpoints that happened three years ago. This can be solved by using an attribution window. Figure 7 illustrates that 83% of customers make a purchase within 30 days, which means the attribution window here could be 30 days. In simple words, it is safe to remove touchpoints that occurred more than 30 days before the purchase. This parameter can also be changed to 45 or 60 days, depending on the use case.

Figure 7: Length of customer journey
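
A minimal sketch of such an attribution window; the touchpoints and day counts below are invented, and in practice they would come from your tracking data:

    # Keep only touchpoints that fall inside a 30-day attribution window.
    # Each touchpoint is (channel, days_before_purchase).
    journey = [
        ("Facebook", 1050),   # the ad Chris clicked almost three years ago
        ("Email", 1045),
        ("TV", 25),
        ("Instagram", 3),
    ]

    ATTRIBUTION_WINDOW_DAYS = 30  # could also be 45 or 60, depending on the use case

    windowed_journey = [
        (channel, days) for channel, days in journey
        if days <= ATTRIBUTION_WINDOW_DAYS
    ]
    print(windowed_journey)  # [('TV', 25), ('Instagram', 3)]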

Removal of direct marketing channel

A well-known issue that every marketing analyst is aware of: customers who already know the brand usually come to the website directly. This leads to overestimation of the direct channel, and branded channels start getting more credit than they deserve. In this case, you can set a threshold (say 7 days) and remove these branded channels from the customer journey.

Figure 8: Removal of branded channels
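
A sketch of this rule on invented data; which channels count as "branded" and the 7-day threshold are assumptions you would adapt to your own setup:

    # Drop direct/branded touchpoints that occur shortly before the purchase,
    # so they do not soak up credit that belongs to earlier channels.
    BRANDED_CHANNELS = {"Direct", "Branded Search"}
    THRESHOLD_DAYS = 7

    journey = [
        ("SEA", 12),
        ("Email", 9),
        ("Branded Search", 2),
        ("Direct", 0),
    ]

    cleaned_journey = [
        (channel, days) for channel, days in journey
        if not (channel in BRANDED_CHANNELS and days <= THRESHOLD_DAYS)
    ]
    print(cleaned_journey)  # [('SEA', 12), ('Email', 9)]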

Cross platform problem

If some of your customers use different devices to explore your products and you are not able to track them across those devices, retargeting becomes really difficult. In a perfect world these sessions would belong to the same journey, and if they cannot be combined, then all but one of the paths will be considered "non-converting paths". For the attribution problem, the device could be treated as a touchpoint to include in the path, but tracking customers across all devices is still challenging. A brief introduction to deterministic and probabilistic ways of cross-device tracking can be found here.

Figure 9: Cross platform clash

How to account for Vouchers?

To better account for vouchers, a voucher can be added as a 'dummy' touchpoint of the corresponding type (CRM, social media, affiliate, pricing, etc.). In our case, we tried adding these vouchers both as a first touchpoint and as a last touchpoint, but no significant difference was found. Also, if the marketing channel through which the voucher was used was already in the path, the dummy touchpoint was not added.

Figure 10: Addition of Voucher as a touchpoint
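
A sketch of the voucher rule described above; the function name and channel labels are hypothetical:

    # Add the voucher type as a 'dummy' last touchpoint, but only if that
    # marketing channel is not already part of the path.
    def add_voucher_touchpoint(journey, voucher_channel):
        if voucher_channel and voucher_channel not in journey:
            return journey + [voucher_channel]
        return journey

    print(add_voucher_touchpoint(["SEA", "Email"], "CRM"))    # ['SEA', 'Email', 'CRM']
    print(add_voucher_touchpoint(["SEA", "Email"], "Email"))  # ['SEA', 'Email'] (unchanged)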

Let me know in the comments if you would like to add something or if you have a different perspective on this use case.

Top influential AI skills to target in 2020

Artificial intelligence in 2020: a trustable year!

The AI market is expected to reach USD 70 billion this year, with drastic effects on consumer, enterprise, and government markets globally.

With such market upheaval, some obstacles are bound to arise. However, artificial intelligence has huge potential to democratize expensive services, improve poor customer service, assist in medical breakthroughs, and even lighten the load of an overburdened workforce.

If you’re a tech optimist who believes in creating a world where man and machine can come closer and work together with humans, then you better need to possess mandatory AI skills. 

The report "2020 Workplace Learning Trends Report: The Skills of the Future" describes how AI is reshaping the world and which skills are a must-know for upcoming AI professionals.

The report also states that investment funds managed by AI account for 35% of the American stock market.

Finance machines are on the rise, says a recent article in 'The Economist.'

Based on industry insights and job trends, these are the skills AI professionals need to master in 2020:

  • Machine learning 

People often use the terms artificial intelligence and machine learning interchangeably, even though they are not the same thing. There is a lot of confusion between these two terms, so let us briefly look at what exactly machine learning is.

Machine learning uses algorithms that acquire knowledge and skill through experience, without human intervention. It relies on large data sets and mines the data to find common patterns.

For instance, if you provide machine learning programs with a lot of data on skin conditions and tell them what these conditions signify, the algorithms can mine the images (the data) and help analyze skin conditions in the future as well.

The algorithm then compares new images with the previous image data and tries to identify images that share similar patterns.

However, if the algorithm is given a new image of a condition it has never seen before, it will still compare that image with its past data, and the prediction will remain unreliable until new data for that condition is fed to the algorithm so it can learn to recognize it. In short, the system only learns by obtaining knowledge and then applying it. Artificial intelligence helps train computers toward results that can be better than what a human could achieve.
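
As a loose illustration of this "learn from labelled examples, then compare new input against known patterns" idea, here is a tiny scikit-learn sketch; the feature vectors and condition labels are invented, and a real system would of course extract features from actual images:

    # Tiny sketch: a nearest-neighbour classifier trained on made-up "image features".
    from sklearn.neighbors import KNeighborsClassifier

    known_images = [          # each row is an invented feature vector for one image
        [0.9, 0.1, 0.3],
        [0.8, 0.2, 0.4],
        [0.2, 0.9, 0.7],
        [0.1, 0.8, 0.6],
    ]
    known_conditions = ["eczema", "eczema", "psoriasis", "psoriasis"]

    model = KNeighborsClassifier(n_neighbors=3)
    model.fit(known_images, known_conditions)

    # A new image is classified by comparing it with the patterns seen so far;
    # a condition the model has never seen cannot be predicted correctly.
    new_image = [0.85, 0.15, 0.35]
    print(model.predict([new_image]))  # ['eczema']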

 

  • Python 

As the saying goes, 'orange is the new black'; in the same way, AI and ML are becoming the new black of the IT industry. With the extensive growth in data volumes, machine learning and artificial intelligence are increasingly used for processing and analysis. To be honest, a human brain can analyze huge amounts of data, but it is restricted by its limits and can only handle so much processing and analysis.

AI, in that respect, has huge capacity and far fewer limits.

Python plays a major role in AI and machine learning as we see an upsurge in demand for AI engineers.

According to the latest trend searches on Indeed, Python is the most popular language for AI and ML, as stated by Jean Francois Puget from IBM's machine learning department.

 

The how’s and why’s?

  • Python offers a low entry barrier, so data scientists can use it effectively without spending much effort learning the language.
  • It has a great library ecosystem, which is one reason everybody loves Python: for instance, Pandas for high-level data structures and data analysis, Matplotlib for 2D plots and histograms, or Keras, which is widely used for deep learning (see the short sketch after this list).
  • Python as a language is flexible and is a great choice for machine learning. AI professionals can easily use Python alongside other programming languages to achieve their goals.
  • Python offers good readability for every developer, making it easy to understand colleagues' code and to change it or develop new code whenever needed.
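
As a tiny taste of the ecosystem mentioned above (the numbers are invented), Pandas handles the tabular data and Matplotlib renders a quick plot:

    import matplotlib.pyplot as plt
    import pandas as pd

    # Invented example data: conversions per marketing channel.
    df = pd.DataFrame({
        "channel": ["SEA", "Email", "Social", "Direct"],
        "conversions": [120, 80, 45, 200],
    })

    print(df.describe())  # quick statistical summary of the numeric columns

    # A simple bar chart of the same data.
    df.plot.bar(x="channel", y="conversions", legend=False)
    plt.ylabel("conversions")
    plt.show()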

  • React (web)

If you’re a web developer thriving to enter the AI world, here’s a great chance for you. You can now build sweet AI apps using React.js. These offers web developers a new platform to bridge the gap between web developers and professionals who are getting trained in AI skills. 

A web developer can now easily build apps that leverage artificial intelligence, learning from experience or reacting to user inputs such as facial expressions.

  • Angular

If one isn’t aware, Google AI is build using Angular. Building a chatbot from scratch using a Dialog flow conversation platform, previously known as API.ai can be challenging. NLP (natural language processing) could be tough to deal with in machine learning. 

  • Docker

Docker is now used in every field of the software industry. Confused by the term? Well, Docker is essentially a DevOps tool.

Docker uses operating-system-level virtualization to help develop and deliver software in packages called containers.

Though this may sound complicated, you simply need to know that Docker is a complete environment that offers a platform to build as well as deploy software.

In a nutshell, Docker can be employed in various phases of the machine learning development cycle, such as data gathering and aggregation, model training, predictive analysis, and application deployment.

Tech professionals such as DevOps engineers and AI engineers will need to supercharge their skill sets in 2020. As long as this knowledge gap persists, we will continue to see a talent shortage in the job market.