REAN model to achieve higher conversions through hyper-personalization and recommendations

BangDB implements the REAN model to drive conversion through personalization. It ingests and processes various aspects of users or customers and their activities to optimize the conversion rate. With the REAN model, e-commerce companies can ingest and analyze clickstream and pixel data in real time to understand customers and visitors in much more advanced ways. This enables organizations to personalize content and recommend products and solutions in a 1-on-1 manner, which leads to much higher conversions and revenue.

Introduction

The goal of this document is to lay out the basic building-block indices and KPIs with which we can target customers much better. But this is not the end; in fact, it just begins the journey toward “1-on-1 personalization and recommendation” for the organization, where the underlying goal is to provide a much-improved customer experience and offer higher value. Once we have covered what’s in this document, we need to use stream processing and event ingestion for ETL, data enrichment, and CEP (complex event processing). Further, we will need to put a graph structure in place to operate in a highly “context-aware environment” for personalization and recommendation. Let’s first look into the basic part of the bigger recommendation system here; in the next blog we will go into stream processing and graphs.

The REAN model is defined as follows.

  • Reach
  • Engage
  • Activate
  • Nurture

REAN Model Does Two Things Very Well

  • Firstly, it gives you a very clear indication of the measurement challenges you might have when breaking your strategy down into its component parts.
  • Secondly, it can be used to help you define a measurement strategy. You could develop KPIs around each activity (Reach, Engage, Activate and Nurture) and then combine the metrics as matrices of each other.

At a high level, the different components of the REAN model are defined as follows.

REACH

  • Traffic sources
    • Search Engines
    • Ads, Campaign
    • Email, Newsletter
    • Internal links
    • Partner sites
    • YouTube, Video, Video banners
    • Blog
    • PR
  • SEM
  • SEO
  • Brand awareness – traffic coming from logos, company names, product names, etc.
  • Seeding – from opinion leaders, reviews, articles

ENGAGE

  • Shopping carts
  • Self-service processes – e.g., SaaS product sign-up
  • Any creatives
  • User segmentation based on behavior
    • Session length and depth      
    • Separate users based on their likes and dislikes [ based on session length and depth]
  • Click depth – average click depth and corresponding user’s segment
  • Duration – length of time spent on a website
  • Offline engagements – relevant for offline stores, etc.

ACTIVATION

  • Can also be interpreted as conversion
  • Purchases
  • Downloads of software or documents
  • Activation is typically what reach, engagement, and nurture KPIs are measured against.
    • B2B, B2C, Media, Customer service, Branding

NURTURE

  • CRM
  • Follow-ups – emails, community
  • Most importantly
    • Personalization & customization
    • How we interact with users and customers when they return
    • Cookie, user management, etc
    • Recommendations
  • Recency – Measure of time elapsed between your visitor’s last visit and his/her current one

INDEXES

  • Click depth index [# of sessions with more than n page views / # of total sessions]
  • Recency index [# of sessions with more than n page views in the last m weeks / # of total sessions]
  • Duration index [# of sessions longer than n minutes / # of total sessions]
  • Brand index [# of sessions originated directly (no referral URL) / # of total sessions]
  • Feedback index
  • Interaction index [# of sessions where the visitor completed any tracked activity / # of total sessions]
  • Loyalty index = ∑(Ci + Ri + Di + Bi + Fi + Ii) per visitor; select the top k
  • Subscription index [# of visitors who are content subscribers / # of total visitors]
  • Content page view index [# of content page views / # of total page views]
  • Internal banner index [# of banner clicks / # of total clicks]
  • Content consumption index [# of page views per content item / # of total page views]
  • System performance index [# of views per system / # of total page views]
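The indices above are simple ratios over session data. As an illustration, here is a minimal sketch of the per-visitor loyalty index; the session field names (visitor_id, page_views, duration_sec, referrer, tracked_activity) are assumptions for the example, not an actual BangDB schema.

```python
# Sketch: computing a per-visitor loyalty index from raw session records.
# Field names are illustrative assumptions, not BangDB's actual schema.
from collections import defaultdict

def loyalty_indices(sessions, n_views=5, n_min=3, top_k=10):
    per_visitor = defaultdict(lambda: {"total": 0, "click": 0, "dur": 0, "brand": 0, "inter": 0})
    for s in sessions:
        v = per_visitor[s["visitor_id"]]
        v["total"] += 1
        v["click"] += s["page_views"] > n_views          # click depth component
        v["dur"]   += s["duration_sec"] > n_min * 60     # duration component
        v["brand"] += s["referrer"] is None              # direct (brand) traffic
        v["inter"] += bool(s.get("tracked_activity"))    # interaction component
    scores = {}
    for vid, v in per_visitor.items():
        # sum of the component indices for this visitor
        scores[vid] = (v["click"] + v["dur"] + v["brand"] + v["inter"]) / v["total"]
    # select the top k visitors by loyalty score
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]
```

The other indices (subscription, content page view, internal banner, etc.) follow the same pattern: a count of qualifying sessions or views divided by the corresponding total.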

Metric | Visitor Acquisition | Conversion to Opportunity | Conversion to Sale | Customer Retention & Growth
------ | ------------------- | ------------------------- | ------------------ | ---------------------------
Tracking metrics | Unique visitors, new visitors | Opportunity volume | Sales volume | E-mail list quality, transaction churn rate
Performance drivers (diagnostic) | Bounce rate, conversion rate, new visits | Macro-conversion rate to opportunity, micro-conversion rate | Conversion rate to sale, email conversion rate | Active customers % (site & email active), repeat conversion rate
Customer-centric KPIs | Cost per click, per sale; brand awareness | Cost per opportunity or lead | Cost per sale | Lifetime value, customer loyalty index
Business value KPIs | Audience share | Total orders | Total sales | Retained sales growth and volume
Strategy | Online and offline targeting and reach strategy | Lead generation strategy | Online sales generation strategy | Retention and customer growth
Tactics | Continuous campaigns, ads, communications | Personalization & customization | Targeting | Targeting, churn rate, etc.

KPI

There are the following types of web analytics metrics:

  • Count
  • Ratios
  • KPIs – either count or ratio
  • Dimension – segments
  • Aggregates
  • Etc.

Business questions that we need to answer through KPIs

  1. What is the best source of traffic in terms of volumes and sales?
  2. Where are the visitors coming from? [ top k places]
  3. Which channel is most productive? [ top k channels]
  4. Which channels are overlapping?
  5. Which landing page converts best? [ top k landing pages]
  6. Do registered users buy more?
  7. Most searched pages [ top pages]
  8. How many downloads?
  9. What’s the value of download for different items [ top downloads by value]?
  10. Avg response time for lead response?
  11. Internal search stats
  12. How engaged are our visitors?
  13. What are the Top paths to our website?
  14. How are visitors finding the site?
  15. What is the cost per conversion (per campaign?)
  16. Users by location
  17. How many people don’t get through a shopping cart?
  18. What are the search keywords?

Page bounce rate – share of visitors who left from a landing page

  • Computed hourly
  • Alert on a 15% deviation

Page time index – time spent on the page / total time spent on the site

  • Computed hourly
  • Alert on a 15% deviation
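The hourly alert rule above can be sketched as follows; the helper names and the way the 15% threshold is wired up are illustrative assumptions, not a BangDB API.

```python
# Sketch: hourly deviation alert for page bounce rate.
# Helper names and thresholds are illustrative assumptions.
def check_deviation(current, baseline, threshold=0.15):
    """Return True if the metric deviates more than `threshold` from baseline."""
    if baseline == 0:
        return current != 0
    return abs(current - baseline) / baseline > threshold

def bounce_rate(landing_exits, landing_visits):
    """Fraction of landing-page visits that ended on that page."""
    return landing_exits / landing_visits if landing_visits else 0.0

# Illustrative hourly job
prev_hour = bounce_rate(120, 400)   # 0.30
this_hour = bounce_rate(180, 400)   # 0.45
if check_deviation(this_hour, prev_hour):
    print("ALERT: bounce rate deviated more than 15%")
```

The page time index alert is identical in shape, with the ratio computed from time-on-page over total time-on-site instead.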

Segmentations

  • By paid traffic [ Reach] – campaign, ads, banners, etc.
  • Unpaid traffic [ Engage]
  • By location [ Engage]
  • By search phrase or keyword [ Engage]
  • By site pages or behaviors [ Engage]
  • By system vars [ device, browser, etc.] [ Engage ]
  • By conversion
  • By loyalty – repeat visitors, registered, recency, etc.

Attributes for basic segmentation

  • Visits
  • % Add to cart
  • Conversion rate [ #confirmed conversion / # total visits]
  • Engagement conversion rate [#confirmed conversion / # total engaged visitors]
  • Marketing cost
  • Cost per visit (CPV)    
  • Visitor volume ratio [ num of visitors from a source / total visitors]
  • Video engagement rate [ count of num of times video played / num of visitors to the page]
  • Cost per add to cart
  • Customers
  • Average cart value
  • Shopping cart abandonment rate
  • Page time index
  • Visitor engagement index
  • Content page view index [ count of content page views/total page views]
  • Internal banner index [ banner clicks / total clicks]
  • Content consumption index [ #page views per content/ #page views]
  • System perf idx [ #views from per system / #page views]
  • Cost per acquisition [ cost of referring source / num of conversions]
  • Sales
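A few of the ratio attributes above can be sketched as plain functions; the parameter names are illustrative assumptions for the example.

```python
# Sketch: basic segmentation attributes computed per traffic segment.
# Parameter names are illustrative assumptions.
def segment_metrics(visits, engaged, conversions, marketing_cost):
    return {
        # confirmed conversions / total visits
        "conversion_rate": conversions / visits if visits else 0.0,
        # confirmed conversions / engaged visitors
        "engagement_conversion_rate": conversions / engaged if engaged else 0.0,
        # marketing cost / total visits (CPV)
        "cost_per_visit": marketing_cost / visits if visits else 0.0,
        # cost of the referring source / number of conversions (CPA)
        "cost_per_acquisition": marketing_cost / conversions if conversions else 0.0,
    }

m = segment_metrics(visits=10_000, engaged=2_500, conversions=250, marketing_cost=5_000.0)
# conversion_rate 0.025, engagement_conversion_rate 0.1,
# cost_per_visit 0.5, cost_per_acquisition 20.0
```

Computing these per segment (paid vs. unpaid, location, device, etc.) makes the segments directly comparable.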

Attributes for Behavior segmentation KPIs

  • Page views per session
  • Avg session time

Nurture rate

  • Repeat visitor index [ # of repeat visitors / # of visits]
  • Email perf index

BangDB is designed to ingest and process high-speed data, extract intelligence from it, and apply that intelligence to ongoing operations for better operational value. BangDB comes with stream processing, AI, and graph processing, with unstructured data analysis in real time. Get BangDB free of cost and start building applications.

Relevant Readings

How to mitigate security risk using BangDB

Security risk is everywhere, and it has been growing rapidly even as we try to mitigate it. The fraudsters are always a step ahead of the curve and come up with new ideas for attacks while we are busy handling the older ones. Mitigating security risk requires all of us to innovate faster and prepare in a much more advanced and modern manner. Most of the time we keep solving the older problems, given the enormity of the challenges, and forget to prepare for potential upcoming attacks. Often a clear definition of the problem is not even available, the tools in the market are sparse and siloed, and the concepts exist but implementations are limited. The core of the solution lies in the ability to scan every single event in context and to use modern methods not only for forensics but to be predictive and avoid the repercussions.

The cybersecurity threats have changed in three crucial ways in the recent past:

  • MOTIVE: In the past, viruses were introduced by curious programmers. Today, cyberattacks are the result of well-executed plans by trained militaries in support of cyber warfare.
  • SPEED: The potential rate at which an attack spreads has also increased; it can affect computers all over the globe in seconds.
  • IMPACT: The potential impact has increased manifold due to the wide penetration of the internet.

Challenges

Continuous and relentless: Threats may come from any place, any system, and the most unlikely of places. Therefore, we must capture and analyze all data (not just samples). Hence continuous stream processing is critical, where all events/data are analyzed with as low latency as possible. Most of the tools in the market are batch-processing tools; they miss patterns at the boundaries of the batches and are therefore not suitable for such use cases.

Non-atomic in nature: Threats are not necessarily atomic; an attack may arrive in small packets over a period of time from many different sources. Therefore, by just looking at a single packet or event we can’t perceive the threat. We must analyze these arriving data packets in a state-managed system with a continuously moving window that can see the pattern over a period. We also need to link data points to capture the essence and the context.

Unpredictability: A few threats may have known or constant signatures, which we could identify with deterministic, absolute computations. However, several threats are extremely hard to capture this way, as they are designed to evade the regular, known, or anticipated structure. Therefore, we must use AI to predict some of these scenarios continuously on streaming data.

High-speed processing: The speed of data is so high in some cases that existing tools in the market would sample and then process. We know that this is too open and risky. We must capture and process all data. This means we must have a system that has very high performance in reading and writing. The high throughput data store is desired in such cases

Linear scale: Data volume would be high as we need to process and store all data. A large scalable system would be needed to achieve this. We need a linear scale to ensure the data ingestion and processing work uninterrupted while system scales
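The continuously moving window described above can be sketched with a simple z-score detector; this is a generic stand-in under assumed thresholds and field semantics, not BangDB's stream-processing engine.

```python
# Sketch: continuous moving-window anomaly detection over an event stream.
# A simple z-score detector stands in for a real streaming engine;
# thresholds and warm-up size are assumptions.
from collections import deque
import math

class SlidingWindowDetector:
    def __init__(self, window=100, z_threshold=3.0):
        self.events = deque(maxlen=window)   # the moving window
        self.z = z_threshold

    def observe(self, value):
        """Return True if `value` is anomalous w.r.t. the current window."""
        anomalous = False
        if len(self.events) >= 10:           # warm-up before scoring
            mean = sum(self.events) / len(self.events)
            var = sum((x - mean) ** 2 for x in self.events) / len(self.events)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.z:
                anomalous = True
        self.events.append(value)            # window slides continuously
        return anomalous

det = SlidingWindowDetector()
stream = [9.0, 11.0] * 50 + [500.0]          # e.g., bytes per event, then a spike
flags = [det.observe(v) for v in stream]
# the spike is flagged; the steady traffic is not
```

Because the window slides one event at a time, a pattern that straddles two batch boundaries (which a batch tool would miss) still falls inside some window.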

What does BangDB do? – It enables a predictive instead of a forensic approach, with high speed and in a scalable manner, to mitigate security risk

BangDB ingests and processes the security telemetry information at extremely high speed to make it easily accessible for advanced computation and analytics. It further leverages the following to achieve a predictive vs forensic approach

  • Advanced statistical and data science models for high-speed anomaly detection
  • Real-time ingestion and stream processing to enable continuous threat analysis
  • Machine learning models integrated with stream for predictive threat detection
  • Graph with stream for interlinking of data points for richer pattern detection
  • Handling of all kinds of data (text, images, videos) in a single place, in a joined manner

What are the typical steps that BangDB takes to tackle this?

STEP1: Advanced Threat Detection

Leverage BangDB to combine and contextualize incidents from multiple disparate big-data sources for continuous, near real-time streaming detection, capturing incidents that are often missed by batch-based technologies.

STEP2: Link data

Enrich data with a graph model to capture the “context”, rather than just isolated events, which do not provide enough information. Further, integrate graph with stream processing so that the linking of data and the capturing of context are automated and continuous.

STEP3: Complex event processing

Find anomalies and patterns using complex event processing (CEP). This allows users to define patterns so complex that they can’t be run on a typical RDBMS or other databases. The patterns identified here are absolute in nature, with 100% confidence.
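To make the idea concrete, here is a sketch of the kind of stateful pattern CEP evaluates: "k failed logins from the same source followed by a success within t seconds". This is a generic stand-in written in plain Python, not BangDB's actual CEP syntax or API; the event shape is an assumption.

```python
# Sketch: a CEP-style pattern, "k failed logins followed by a success
# from the same source within t seconds". Generic stand-in, not BangDB's CEP.
from collections import defaultdict, deque

def detect_bruteforce(events, k=3, t=60):
    """events: iterable of (timestamp, source_ip, outcome) tuples, time-ordered."""
    fails = defaultdict(deque)                 # source_ip -> recent failure times
    matches = []
    for ts, ip, outcome in events:
        window = fails[ip]
        while window and ts - window[0] > t:   # drop failures outside the window
            window.popleft()
        if outcome == "fail":
            window.append(ts)
        elif outcome == "success" and len(window) >= k:
            matches.append((ip, ts))           # pattern matched with certainty
            window.clear()
    return matches

events = [(1, "10.0.0.7", "fail"), (2, "10.0.0.7", "fail"),
          (3, "10.0.0.7", "fail"), (5, "10.0.0.7", "success"),
          (6, "10.0.0.9", "success")]
print(detect_bruteforce(events))   # [('10.0.0.7', 5)]
```

Unlike the statistical detector in the previous step, a match here is deterministic: either the sequence occurred within the window or it did not.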

STEP4: Predict and don’t depend on forensics as much as possible

Artificial intelligence enables the identification of never seen threats, malware, and infiltration techniques. Using AI, build a comprehensive security score leveraging behavioral modeling and stochastic anomaly detection. Kill chain incidents are prioritized based on potential impact, key users, and critical assets.

STEP5: Take automated action

When an anomaly or pattern is detected, take action in an automated manner. This means timely action, which could potentially save time and resources and, in many cases, avoid the situation itself.

STEP6: Track threat propagation

Leverage BangDB’s ability to ingest and analyze immense amounts of data to track threats and their propagation across time and space through a near real-time relational-graph view of the entire network

STEP7: Visual threat hunting

The Security Intelligence platform provides sophisticated threat-hunting tools that allow SOC staff to effectively hunt, validate, and remediate potential threat incidents surfaced by the product. Analysts can self-assemble new threat-hunting workflows using building-block modules for ingestion, enrichment, and analytics on a security playbook interface.

Conclusion

In the end, no amount of effort or tooling can completely insulate us from these security threats; there is no complete immunity we can develop against such things. However, we can at best be prepared for such attacks, try to avoid them, and mitigate security risk as much as possible. And in the case of some attacks, we can try to minimize the damage. An additional set of tools never hurts and can probably add more value, so it is recommended to try BangDB to make the castle a bit more impregnable.

Download BangDB for free and get started. BangDB is one of the “fastest databases” in the market, performing 2X+ compared to some of the most popular databases like MongoDB, Couchbase, or Redis.

Predictive real-time data platform to boost e-commerce sales

An e-commerce business needs to collect data from various sources, analyze it in real time, and gain insights to understand visitors’ behavior and patterns, which allows the company to serve customers in contextual and better ways to improve the conversion rate. A real-time data platform is the need of the hour: one that can combine stream analytics with graph and AI to enable predictive analytics for better personalization, which can significantly improve e-commerce sales by 2X or even more. Therefore, a predictive real-time data platform is needed to boost e-commerce sales.

A real-time and predictive data platform for boosting e-commerce sales by visitor analysis is a need of the hour

Read a good article on this here to get more info about it

Some of the general (statistical) facts that relate to e-commerce sales are as follows:

  • Based on survey reports, 45% of online shoppers are more likely to shop on a site that offers personalized content/recommendations
  • According to a report by Gartner, personalization helps in increasing the profits by 15%
  • The majority of the data (more than 60%) is not even captured and analyzed for visitor or customer analytics
  • Less than 20% of data is captured in true real time, which diminishes the potential for relevant, contextual, personalized engagement with visitors, and hence for lead scoring as well

To boost sales, e-commerce businesses generally look to answer some of the following questions:

  • How to develop personalized real-time engagement, messages, and content
  • How to engage with the visitors and customers on a 1-on-1 basis for higher CR
  • How to identify and leverage purchasing patterns
  • What the entire consumer cycle looks like
  • Ways to improve promotional initiatives
  • How to make the customer and the customer experience the focus of marketing strategies – better lead score and reasons for the score
  • How to identify your customers’ switches between each channel and connect their movements and interactions

Businesses typically seek to predict the following with predictive analysis:

  • Personalized content in a 1-on-1 manner for better next steps or conversion
  • Exactly which customers are likely to return, their LTVs
  • After what length of time they are likely to make a purchase
  • What other products these customers may also buy at that time
  • How often they will make repeat purchases of those refills

What are some common challenges e-commerce businesses face?

  • Understanding who is “Visitor” and “Potential Buyer”
  • Relationships between different entities and the context
  • Nurturing the existing prospects
  • Personalization
  • Calculating the Lifetime Value
  • Understanding the buyers’ behavior
  • Cart Abandonment
  • Customer Churn

So, how can e-commerce businesses tackle the above challenges?

Predictive analytics encompasses a variety of techniques from data mining, predictive modeling, and machine learning to analyze current and historical data, make predictions about future events, and boost e-commerce sales.

With Predictive analytics, e-commerce businesses can do the following

  • Improve Lead scoring
  • Increase e-commerce sales
  • Increase Customer retention rates
  • Provide personalized campaigns for each customer
  • Accurately predict and increase CLV
  • Utilize Behavioral Analytics to analyze buyers’ behavior
  • Reduce cart abandonment rates
  • Use Pattern recognition to take actions that prevent Customer Churn

Following is a brief list of use cases that can be enabled on BangDB

A. Real-time visitor scoring for personalization and lead generation for higher conversion

  1. Predictive real-time visitor behavior analysis and scoring for personalized offering/targeting for a much-improved conversion rate. The potential increase in CR or expected biz impact could be 2X or more if implemented and scaled well
  2. Faster, contextual, and more relevant lead generation for higher conversion
  3. Personalized content, offerings, and pricing for visitors on a 1-on-1 basis lead to much deeper engagement and higher conversion
  4. Projecting a more relevant and usable LTV for every single user/visitor could lead to better decision-making for personalization, targeting, or offers
  5. Inventory prediction for different products/versions/offerings for better operational optimization

B. Improve engagement

  1. Personalized interaction and engagement with the customers
  2. Shopper’s Next Best Action
  3. Recommendations about relevant products based on shopping and viewing behavior
  4. Tailored website experience

C. Better target promotions

Collate data from other sources (demographics, market size, response rates, geography, etc.) and past campaigns to assess the potential success of the next campaign. Serve the right campaign to the right users.

D. Optimized pricing

Predictive pricing analytics looks at historical product pricing, customer interest, competitor pricing, inventory, and margin targets to deliver optimal prices in real time that maximize profits. In Amazon’s marketplace, for example, sellers who use algorithmic pricing benefit from better visibility, e-commerce sales, and customer feedback.

E. Predictive inventory management

Being overstocked and out of stock has forever been a problem for retailers but predictive analytics allows for smarter inventory management. Sophisticated solutions can take into account existing promotions, markdowns, and allocation between multiple stores to deliver accurate forecasts about demand and allow retailers to allocate the right products to the right place and allocate funds to the most desirable products with the greatest profit potential.

F. Prompt interactive shopping

Interactive shopping aims for customer loyalty. Integration of an online logistics platform helps maintain end-to-end visibility of purchases and orders, and business intelligence software helps process customer transaction data. It also enables retailers to offer multiple delivery options and can prompt customers for additional purchases based on their buying patterns. Consistent customer service, coupled with technology, can greatly increase customer reliability.

Data mining software enables businesses to be more proactive and make knowledge-driven decisions by harnessing automated and prospective data. It helps retailers understand the needs and interests of their customers in real-time. Further, it identifies customer keywords, which can be analyzed to identify potential areas of investment and cost-cutting.

Challenges and Gaps in the market

Challenges

  1. Need to capture all kinds of data, across multiple channels, not just a limited set of data
  2. Need to capture all data truly in a seamless and real-time manner
  3. Store different entities and their relationships in a graph structure and allow rich queries
  4. Need to auto-refresh and retrain the scoring model for relevant and higher efficacy
  5. Need to scale for high speed, high volume of data across multiple levels/ channels
  6. Need to have full control over the deployment and data
  7. Need to have the ability to add and extend the solution easily and speedily in different contexts or domains as required

Gaps with the existing systems in the market

  1. The majority of systems (GA, Omniture, etc.) can ingest only a limited set of data. It’s virtually impossible to ingest other related data into these systems for a better scoring model. Also, with these systems it’s difficult to extend the ingestion mechanism to a custom set of data coming from data sources other than the clickstream. Therefore, there is a need for a system that can ingest heterogeneous custom data along with typical clickstream data for better results and higher efficiency
  2. Most systems ingest data with latency that is not acceptable from the solution’s perspective. For example, GA allows only a limited set of data to be ingested in real time; the majority of the data comes with high latency. Omniture also has latency that is not acceptable in certain scenarios. Therefore, there is a need for a true real-time data ingestion and processing system/platform
  3. All the systems come with the pre-loaded model(s) which are trained outside the actual processing system. This is hugely limiting from the AutoML perspective where the models could be trained and improved as it ingests more and more data. Also, finding the efficacy of the model is limiting which may result in poor and non-relevant prediction. Therefore, there is a need to have an AI system natively integrated with the analytic scoring platform
  4. As we deploy the system for various locales, verticals, websites, or companies, the system must scale well. The speed and volume of data, coupled with model preparation, training, and deployment, make it very difficult for such a system to scale. It can take weeks or months just to prepare and integrate the system with a new set of data sources. Software deployments, library configurations, infrastructure provisioning, training and testing of models, and versioning of models and other large files all create a huge block in terms of scaling the system. Therefore, there is a need for a platform that hides all these complexities and provides a simple mechanism to scale linearly for a higher volume of data, more websites, more locales, or simply a larger number of use cases as things move forward.
  5. Most systems act as a black box, allowing less control over deployment and access to data in a larger sense. This results in brittle solutions and slower development of use cases. Better access to data and control over deployment are needed.
  6. Most of the systems in the market don’t have “stream”, “graph”, “ML”, and “NoSQL” in the same place. Integration takes lots of time and resources, and is sometimes not feasible at all
  7. Also, they impose huge restrictions in terms of dealing with ML, since the models and their metadata are often abstracted away. More often than not, we might need to upload pre-baked models or model-creation logic or files to leverage existing code. Therefore, we need a system that gives us greater control over the various processes, along with the ability to reuse and extend already existing knowledge and artifacts

BangDB’s offering

BangDB platform is designed and developed to address the above gaps and challenges.

Captures all kinds of data for visitors

  • Clickstream, pixel data, tags, etc.
  • Website-specific data
  • Any other data that may be useful/required
  • Existing data
  • Retailers’ data, external data
  • Any other infrastructure or system data as required

Captures all data in real-time

Captures all data in real time, as opposed to GA, which captures only a small fraction of data in real time. This limits scoring efficacy, as real-time data is the basis for proper analysis. Omniture captures most of the data, but it becomes available for analysis after a few minutes rather than within a few milliseconds. Proper personalization, or any action, is best taken as soon as possible, not after a few minutes or hours.

Accelerated time to market

BangDB comes with a platform along with a ready solution that implements the use cases as needed and has the majority of the plumbing in place. Further, it has built-in KPIs, models, actions, visualizations, etc., which are ready from day 1. We just need to configure the system, add more data points, fix the API hooks, and set up the model training/retraining processes. This is in contrast with many other systems, where it may take several weeks or even months just to get started.

Scales well across multiple dimensions, in a simple manner

Several IPs enable high-performance and cost-effective handling of a high volume of fast-moving data. The platform has a built-in IO layer for improved, flexible, and faster data processing. Convergence allows the system to scale linearly, as required, in an uninterrupted manner.

  • Integrated streaming system to ingest all kinds of data as required for analysis in real time. Build your apps/solutions, or extend existing ones as needed, using just the UI and without writing code
  • Integrated machine learning system for faster, simpler, and automated model training, testing, versioning, and deployment

The platform comes with AI natively integrated, which allows us to get the models trained and retrained frequently as more and more data arrives. It starts producing output from the model within a week and as it moves forward it keeps improving the model and its efficacy. It also measures its efficacy and tunes/retunes as needed for higher performance.

Install BangDB today and get started

To check out BangDB, go ahead and download it for free

Quickly get started here

Check out the BangDB benchmark