Emotion AI

Emotion AI is an AI-driven tool that analyzes the behavior of website or app visitors to determine their emotional needs.

Flagship client SDKs include the Emotion AI feature. This functionality captures visitor data such as page views, clicks, scrolls, and mouse or touch movements over a 30-second period. The collected data is then transmitted to AB Tasty's Emotion AI API, where it is analyzed to generate a score representing the visitor's emotional needs.

📘

Feature activation

This feature must be enabled on your account to be used. Please contact your TSM to activate it.

Collecting Visitor Events

To initialize visitor data collection, call the collectEmotionsAIEvents method from the Visitor instance. Once initialized, the SDK collects data on touches, clicks, scrolls, mouse or touch movements, and page changes over a 30-second period. This data is then sent to AB Tasty's Emotion AI server for analysis, which generates the visitor score. At the end of this process, the visitor score is stored in the local cache.
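For illustration, with the JavaScript SDK the collection could be started right after the visitor is created. This is a minimal sketch, assuming the @flagship.io/js-sdk package and a collectEAIEventsAsync method on the Visitor instance (the name used in the React examples below); check your SDK's reference for the exact method name.

import { Flagship } from "@flagship.io/js-sdk";

// Start the SDK and create a consenting visitor.
Flagship.start("<ENV_ID>", "<API_KEY>");
const visitor = Flagship.newVisitor({
  visitorId: "<VISITOR_ID>",
  hasConsented: true, // no data is collected without consent
});

// Start the 30-second Emotion AI interaction capture for this visitor
// (method name assumed from the React examples below).
visitor.collectEAIEventsAsync();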

Getting visitor score

When the Emotion AI feature is activated on a client's account, each time the FetchFlags method is called, the SDK retrieves the visitor score from the local cache. If the score is not available in the cache, the SDK makes an HTTP request to retrieve the score from AB Tasty. The retrieved visitor score is then included in the visitor context before making the HTTP campaigns call.
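As an illustration with the JavaScript SDK, no extra call is needed to use the score; the regular fetchFlags call resolves it transparently. A minimal sketch, reusing the visitor created above:

// fetchFlags first resolves the Emotion AI score (from the local cache, or
// from AB Tasty over HTTP if it is not cached yet), adds it to the visitor
// context, and then requests the campaigns.
await visitor.fetchFlags();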

General Considerations

  • Data collection occurs over a 30-second period and requires one additional interaction event afterward to conclude the collection and trigger scoring. If no interactions occur within this timeframe, if the final interaction never happens, or if the process is interrupted (for example by a page refresh, closing the browser tab, or quitting the app on mobile), visitor score generation may fail.
  • If the visitor scoring process fails for any reason, it restarts in the next session. A new session starts when the page is refreshed in the browser or when the app is relaunched on mobile devices.
  • For mobile SDKs (iOS, Android, or React Native), it is essential to invoke the collectEmotionsAIEvents method promptly after the SDK has been successfully initialized and as soon as the visitor enters the application. Avoid calling this method during the splash screen, as user interactions are typically minimal at that stage. Insufficient interaction data may lead to irrelevant scoring or a failure to generate the visitor score.
  • Once a visitor's score has been calculated for a given visitorId, it is cached. The score will not be recalculated for the same visitorId. Instead, the cached score will be used in subsequent sessions.
  • No data will be collected when the visitor has not given consent. See visitor consent and the consent sketch after this list.
  • No data will be collected when the SDK is in panic mode.
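To make the consent rule above concrete, the sketch below shows a typical consent flow with the JavaScript SDK, using the standard setConsent method; the Emotion AI method name is assumed from the React examples below.

// No Emotion AI data is collected while the visitor has not consented.
const visitor = Flagship.newVisitor({
  visitorId: "<VISITOR_ID>",
  hasConsented: false,
});

// Once the visitor gives consent, collection can be started.
visitor.setConsent(true);
visitor.collectEAIEventsAsync();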

Collecting visitor events schema

This schema describes how and when Flagship SDKs collect visitor events and send them to the Emotion AI API.


Implementation flow

From the schema above:

  1. Once the app is launched, the Flagship SDK is fully initialized, and visitor consent has been obtained, call the collectEmotionsAIEvents method at the most appropriate location of your app to start collecting events.
    We recommend starting the collection as soon as the first relevant interface is displayed to the user, ignoring any splash screen. During the Emotion AI collection, the SDK intercepts screen interactions such as clicks, moves, scrolls, and navigation in the background, then sends hits to the Emotion AI server for scoring. No data is cached on the device at this time.
  2. Data collection ends automatically after a 30-second delay plus one additional interaction that closes it. When the collection has completed successfully, its result is stored in the visitor cache to prevent re-collecting in the next session.
    The collection fails when there is no interaction during the 30-second delay, when no additional interaction is made afterward, or when the app is killed during this time. In that case, a new collection will be possible in the next session.
  3. When the full plan is enabled, the SDK polls the Emotion AI server to obtain the visitor score and stores it locally in the visitor cache and in the visitor context for further targeting.

Example

Below are examples in different programming environments:

// React SDK
import React, { useEffect } from "react";
import { FlagshipProvider, useFlagship } from "@flagship.io/react-sdk";

const App = () => (
  <>
    <FlagshipProvider
      envId="<ENV_ID>"
      apiKey="<API_KEY>"
      visitorData={{
        id: "<VISITOR_ID>",
        hasConsented: true,
        context: {
          isVIP: true,
        },
      }}
    >
      <Page/>
    </FlagshipProvider>
  </>
);

const Page = () => {
  
  const { collectEAIEventsAsync } = useFlagship();
  
  useEffect(() => {
    collectEAIEventsAsync(); // Start collecting interaction data
  }, [collectEAIEventsAsync]);

  return (
    <div>
      <h1>My Page</h1>
    </div>
  );
};

// React Native SDK
import React, { useEffect } from "react";
import { View, Text } from "react-native";
import { FlagshipProvider, useFlagship } from "@flagship.io/react-native-sdk";

const App = () => (
  <FlagshipProvider
    envId="<ENV_ID>"
    apiKey="<API_KEY>"
    visitorData={{
      id: "<VISITOR_ID>",
      hasConsented: true,
      context: {
        isVIP: true,
      },
    }}
  >
    <MyView />
  </FlagshipProvider>
);

const MyView = () => {
  const { collectEAIEventsAsync } = useFlagship();

  useEffect(() => {
    // Start collecting interaction data for the "My-view" screen
    collectEAIEventsAsync("My-view");
  }, [collectEAIEventsAsync]);

  return (
    <View>
      <Text>My View</Text>
    </View>
  );
};

// Android (Kotlin) SDK
class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Start the Flagship SDK asynchronously
        CoroutineScope(Dispatchers.Default).async {
            
            Flagship.start(
                application,
                "your_env_id",
                "your_api_key",
                FlagshipConfig.DecisionApi()
            )

        }
        enableEdgeToEdge()
        setContent {
            FlagshipandroidTheme {
                Scaffold(modifier = Modifier.fillMaxSize(), containerColor = Color.Cyan) { innerPadding ->
                    Greeting(
                        name = "Android",
                        modifier = Modifier.padding(innerPadding)
                    )
                }
            }
        }
    }

    override fun onResume() {
        super.onResume()

        Flagship.runOnFlagshipIsInitialized {
            // Create the visitor and start collecting Emotion AI events once the SDK is ready
            val visitor = Flagship.newVisitor("visitor_1234", true).build()
            visitor.collectEmotionsAIEvents(this@MainActivity)
        }
    }
}
 
// iOS (Swift) SDK
// In AppDelegate
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    // Start Flagship and create the visitor
    Task {
        try await Flagship.sharedInstance.start(envId: "envId", apiKey: "apiKey")
        let visitor = Flagship.sharedInstance.newVisitor(visitorId: "visitorId", hasConsented: true).withContext(context: ["isVip": true]).build()
    }
    return true
}

// In your HomeScreen, or from the view where you want to collect Emotion AI events,
// call the collectEmotionsAIEvents function, passing your app's main window.

Flagship.sharedInstance.sharedVisitor?.collectEmotionsAIEvents(window: window, screenName: "LoginScreen")

Please consult the reference documentation for your specific SDK for more detailed information: