
Unreal Engine Plugin for Logs & Analytics

Overview

An open-source plugin is available to ingest game engine analytics and/or logs data from Unreal Engine games into SparkLogs or any HTTPS endpoint of your choice.

Key use cases:

  • Logs: Ingest and aggregate log data from backend game servers as well as editor environments or dev builds. (Collecting log data from client builds is possible but not recommended; use analytics to understand the behavior of your players in production builds.)
  • Analytics: Collect analytic events of various kinds, including session start and end, design, progression, real money purchases, ads, resource grants and sinks, and error or other textual events. A textual reason and custom JSON fields can optionally be associated with each analytics event.

The plugin is open source under a permissive open-source license, and can be used either with the SparkLogs Cloud or any HTTPS endpoint of your choice (e.g., a vector.dev HTTP Server source). Contributions are welcome!

Join the Discord to ask questions, provide feedback, or collaborate on enhancements.

Free SparkLogs accounts are available to ingest up to 25 GB/month. That's enough free quota for over 1 million analytics sessions of data every month (assuming each session records 16 events and each event is about 1.5 KiB).

Using the plugin with SparkLogs provides a turn-key solution that scales effortlessly with your game to 100s of millions of active users. The advanced analytics functionality makes it easy to analyze and optimize daily active users (DAU), average session and ad statistics, progression funnels, and other key business outcomes and game metrics.

SparkLogs also provides daily analytics snapshots, user geolocation (country), currency conversion for analytics purchase events, auto extraction of structured fields from your unstructured logs, and more. SparkLogs has no cardinality limits and has no limits on the size and number of custom JSON fields for your logs and analytics events.

Plugin Features

  • Ease of Use: Once configured, the plugin automatically captures logs and analytics events, including built-in analytics events generated by the Unreal Engine. You can record analytics events either from Blueprints or in C++ code using thread-safe static methods. The plugin does not require pre-configuration of any field values or schemas.
  • Performance: Keeping overhead low and not interfering with game performance is a top priority. The plugin is stress tested to require no more than 0.1% of CPU even at data volumes 10x higher than average. Additionally, in an extreme event where tens of megabytes of data are generated every second (which should never happen in production under normal circumstances), the plugin will process only a fixed amount of work per second, which is guaranteed to keep CPU overhead low while queuing data. Transmitted data is efficiently compressed to reduce network bandwidth by ~8x.
  • Scale: Automatically scales from 0 to 100s of millions of daily active users. Schemaless with no limits on custom fields, and no cardinality limits.
  • Reliability: Logs and analytics data are persistently queued until receipt is acknowledged. If a player's local clock is inaccurate, the SparkLogs Cloud will automatically adjust for clock skew.
  • Resiliency: If the game engine crashes, the plugin will attempt to flush any pending logs and events to disk, and then when the game engine is restarted, it will resend any queued data. It will also detect any previously open game engine analytics session that was not closed properly (e.g., due to a crash) and will automatically close that session with a session end time of the last known analytics event for that session.
  • Security: Data is transmitted over HTTPS and requests are authenticated.
  • Flexibility: The plugin supports desktops and mobile platforms. The plugin can be configured to send data to the SparkLogs Cloud or any HTTPS endpoint of your choice.
  • Gradual Roll Out: For games with large user bases, you may want to gradually roll out use of the plugin for testing or QA purposes. To do this, you can configure the plugin to only activate a certain percentage of the time, or you can control when data is shipped based on custom code. You can also use this plugin in parallel with your existing analytics plugin to ease the transition.

Compatibility

Tested to be compatible with Unreal Engine 4.27 through 5.6 for Windows, Mac, Linux, iOS, and Android. The plugin is likely compatible with slightly older and newer engine versions but such support is untested.

Pull requests to add support for other platforms are welcome. The plugin currently requires multithreading support, so adding support for single-threaded platforms would take a little more work but is possible; contact us if this is important to you.

Setup and Configuration

Download or clone the plugin code. To install the plugin:

  • For a single project:
    • First, make sure your project has C++ active and is building correctly. (Add a blank C++ class if your project is currently Blueprint-only and compile it in Visual Studio.)
    • Then, copy the downloaded plugin files into a sparklogs subdirectory of your project's Plugins directory. When correctly placed the sparklogs.uplugin file should be in the Plugins/sparklogs/ directory.
    • Then, with the UE editor closed, right-click the .uproject file for your project in Windows Explorer and select Generate Visual Studio project files.
    • Then open the generated Visual Studio solution and build the project. This will build the plugin.
    • You can now re-open your project in the UE editor, and the plugin should be available.
  • For incorporating into engine builds: copy the downloaded plugin files into a sparklogs subdirectory of the Engine/Plugins/Runtime/Analytics subfolder of the engine source and rebuild the engine.

With the plugin now compiled in and available, make sure it shows up in your plugins list in the Unreal Editor for your project under Edit -> Plugins and that it's checked as enabled. You can then edit the plugin settings by going to Edit -> Project Settings -> Plugins -> SparkLogs.

For data ingestion to the SparkLogs Cloud, you will need authentication information for the agent(s) you want to use to ingest the data. You can set up the organizations and agents you want to use for your ingested data in the SparkLogs app. You can optionally configure different authentication credentials for the different build configurations (client, editor, server) or use the same auth for all. You may want to use a different SparkLogs organization for each game you develop, and also have suborganizations for development (editor) and production environments.

When ingesting data to your own HTTPS endpoint, set up your HTTPS server to expect and check HTTP Bearer auth as appropriate.

To configure the plugin, either use the Unreal Engine plugin editor menu (Edit menu, Project Settings, Plugins, SparkLogs) or edit DefaultEngine.ini in the [/Script/sparklogs.SparkLogsRuntimeSettings] section (refer to the USparkLogsRuntimeSettings class code in the plugin for value names).

General Configuration

For any build configuration (client, server, editor) where you want to send data, configure the data destination and authentication:

  • If you're using the SparkLogs cloud, then configure the cloud region, agent ID, and agent auth token.
  • If you're sending data to your own HTTP endpoint, then configure the Custom HTTP Endpoint URI and Authorization Header Value as appropriate.

Note that if you don't want to put authentication information in the DefaultEngine.ini file, you can also configure the plugin to disable automatic starting of the shipping engine and then call StartShippingEngine with explicit credentials.

Analytics Configuration

Specify a value for the Analytics Game ID setting that uniquely identifies this game from other games you've shipped. Customize other analytics settings as desired.

You should also activate the plugin to receive Unreal Engine analytics events. To do this, edit DefaultEngine.ini and set:

[Analytics]
ProviderModuleName=SparkLogs

If you are already using another analytics plugin, we recommend running both plugins in parallel for a while. Keep your existing analytics plugin installed and configured, and then use the AnalyticsMulticast plugin to forward all analytics events to all plugins, including SparkLogs:

[Analytics]
ProviderModuleName=AnalyticsMulticast
ProviderModuleNames=SparkLogs,...

If you have properly configured analytics, at game startup you should see a log message similar to:

LogPluginSparkLogs: Analytics collection is active. GameID='...' UserID='...' PlayerID='...'

With the plugin now configured, you should add logic to record analytics events and then verify that data is flowing properly.

Note that while raw analytics events appear immediately in SparkLogs once received, game engine clients will batch and only send analytics events at the end of a session, after 15 minutes, when the game engine exits, or after 128 KiB of data is queued (whichever comes first). Also, daily analytics snapshots are computed once per day shortly after midnight UTC, and include all data up to the previous day.

Logs Configuration

By default, the plugin is configured to capture logs for server and editor build configurations. This can be a convenient way to capture logs from your game servers or development environments. If you want more control over when logs are captured, you can disable automatic starting of the shipping engine and then manually call FsparklogsModule::GetModule().StartShippingEngine(...) when appropriate, and set the OverrideCollectLogs and/or OverrideCollectAnalytics parameters as appropriate.
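A minimal sketch of manual startup, assuming automatic starting of the shipping engine has been disabled in settings. The exact StartShippingEngine signature is in the plugin source; the parameter names and call shape below are illustrative assumptions:

```cpp
#include "sparklogs.h"

// Sketch only: opt-in gate and parameter names are illustrative; check
// the plugin source for the real StartShippingEngine signature.
void StartTelemetryIfOptedIn(bool bPlayerOptedIn)
{
    if (!bPlayerOptedIn)
    {
        return; // Ship nothing unless the player has opted in.
    }
    // Hypothetical call shape: explicitly enable both logs and analytics.
    FsparklogsModule::GetModule().StartShippingEngine(
        /*OverrideCollectLogs=*/true,
        /*OverrideCollectAnalytics=*/true);
}
```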

Once properly configured, logs will automatically be captured and shipped to your configured destination. Ingested log data is available immediately for querying in SparkLogs.

danger

Unless you know exactly what you're doing, do not enable collection of logs for client builds. If you have a large number of users (100k+), the plugin could collect and ingest a large amount of data, and you will either quickly reach your quota, causing all data ingestion to shut off, or, if your quota is large, incur higher costs than expected. For client build configurations, use the analytics feature to capture specific actionable events (session, design, progression, purchase, ad, resource, etc.).

Analytics Overview

Capturing analytics events allows you to understand how players are behaving so you can optimize for various outcomes (daily activity, playtime, progression, in-app purchases and ads) and for balance (resource grants and sinks), etc.

The plugin provides Blueprints functions and C++ static methods to easily manage sessions, set user tags, and record analytics events.

All analytics events happen within the context of an analytics session, which marks the beginning and end of when a player is actively playing the game. Sessions are globally identified by a unique random session ID. Each analytics event is associated with a session and is of a particular type.

Supported event types include session start and end, design, progression, real money purchases, ads, resource grants and sinks, and log/error events.

The events generated by the plugin follow the game engine analytics data model (see example JSON data).

If you're using the plugin with SparkLogs, you can explore and export individual events using LQL in the Explore UI. SparkLogs also provides daily aggregate snapshots, allowing you to analyze trends such as daily active users, playtime statistics, player revenue, and other key metrics.

Session Management

You can call the appropriate functions to start and end analytics sessions at the appropriate time (e.g., when gameplay actually begins after exiting the main menu):

  • UsparklogsAnalytics::StartSession() - starts a new analytics session if one is not already active.
  • UsparklogsAnalytics::EndSession() - ends the currently active analytics session, if one is active.

An analytics session will automatically be started if you attempt to record any analytics event and a session is not already active. Any active analytics session will end if the game engine exits.
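For example, tied to gameplay transitions (a sketch: the StartSession/EndSession static methods are named above, but the call sites and function names below are your own game code):

```cpp
#include "sparklogs.h"

// Sketch: hook these into your own gameplay flow.
void OnGameplayBegins()
{
    // Starts a new analytics session if one is not already active.
    UsparklogsAnalytics::StartSession();
}

void OnReturnToMainMenu()
{
    // Ends the currently active analytics session, if one is active.
    UsparklogsAnalytics::EndSession();
}
```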

Also, on mobile, game sessions will automatically end when the app goes to the background, and will automatically resume when the app comes back to the foreground. This behavior can be customized by editing the plugin configuration.

If the game crashes, then on next game startup, the plugin will detect that a previous analytics session was not properly ended and will mark it as ended, using the time of last known recorded analytics event as the time that the previous session ended.

On backend game servers, you can also use analytics sessions to track behavior of your backends, and you can also optionally have the server generate analytics events that are associated with specific client player analytics sessions.

User Tags

You can associate one or more user tags with analytics events. This can be useful for segmenting players based on their behavior, preferences, or other attributes. Tags are stored based on the user segmentation data model.

To manage user tags, you can use the following functions:

  • UsparklogsAnalytics::GetUserTags() - retrieves all user tags currently in use.
  • UsparklogsAnalytics::SetUserTags() - sets all user tags until the game engine exits.

User tags are considered dynamic: there will be no user tags when the game engine starts. You should set user tags as early as possible (before the first session begin). You can also change user tags at any time, and any future events will use the new tags.
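For example, tagging a player segment at startup (a sketch: the exact SetUserTags parameter type is in the plugin source; TArray&lt;FString&gt; is an assumption here):

```cpp
#include "sparklogs.h"

// Sketch only: parameter type (TArray<FString>) is an assumption;
// check the plugin source for the real SetUserTags signature.
void TagPlayerSegment(bool bIsPayer)
{
    TArray<FString> Tags;
    Tags.Add(bIsPayer ? TEXT("payer") : TEXT("non_payer"));
    // Set tags before the first session begins so all events carry them.
    UsparklogsAnalytics::SetUserTags(Tags);
}
```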

How to Record Analytics Events

To record analytics events in Unreal Engine, you can use either Blueprints or C++ code.

For Blueprints, search for SparkLogs functions and invoke any of the Record* functions to record an event of the appropriate type.

For C++ code, in your project's *.Build.cs file, make sure the sparklogs module is included in the PublicDependencyModuleNames list:

PublicDependencyModuleNames.AddRange(
    new string[] {
        ...
        "sparklogs",
        ...
    }
);

At the top of the file that needs to invoke analytics logic include the appropriate header:

#include "sparklogs.h"

You can use the static methods in the UsparklogsAnalytics class to record analytics events: AddPurchase, AddResource, AddProgression[1-5], AddProgressionArray, AddDesign, AddDesignArray, AddAd, and AddLog. For example:

UsparklogsAnalytics::AddProgression2(EsparklogsAnalyticsProgressionStatus::Start, TEXT("Levels"), TEXT("Tutorial"), TEXT("welcome_screen_popup"));

These static methods have overloaded definitions that optionally allow you to specify a "reason" for the event (in the example above, the reason is welcome_screen_popup) and/or specify a custom JSON object with fields to include with the event.
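For example, attaching both a reason and custom fields (a sketch: the overloads taking a reason and custom JSON exist per the description above, but the exact JSON parameter type, assumed here to be TSharedPtr&lt;FJsonObject&gt;, and the field names are illustrative):

```cpp
#include "sparklogs.h"
#include "Dom/JsonObject.h"

// Sketch only: JSON parameter type and field names are assumptions.
void RecordTutorialStart(int32 AttemptCount)
{
    TSharedPtr<FJsonObject> Custom = MakeShared<FJsonObject>();
    Custom->SetNumberField(TEXT("attempt_count"), AttemptCount);
    UsparklogsAnalytics::AddProgression2(
        EsparklogsAnalyticsProgressionStatus::Start,
        TEXT("Levels"), TEXT("Tutorial"),
        TEXT("welcome_screen_popup"), // optional "reason"
        Custom);                      // optional custom JSON fields
}
```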

You can also use the Unreal Engine common analytics functions to record events.

tip

We recommend using the SparkLogs Blueprints functions or the static methods in UsparklogsAnalytics since these provide more event types as well as the ability to record custom fields with each event.

Recording Events on Backend Servers

Even though backend servers are not associated with any one specific player, you can still record analytics events on backend servers. You can either have servers have their own analytics sessions, or you can associate backend server events with a specific player session, or both.

To record analytics events on backend servers with server-specific sessions, decide when analytics sessions should start and end (this could be as simple as when the backend server starts and stops), and then use the same methods as before to record events.

To record analytics events on backend servers that are associated with a specific client-side player analytics session, the client should transmit the information in the FSparkLogsAnalyticsSessionDescriptor struct (obtained from calling UsparklogsAnalytics::GetSessionDescriptor) to the server. Then the server can pass this struct to the various UsparklogsAnalytics::Add* methods. The server should not begin/end analytics sessions with the client session ID, but instead should pass an appropriate session descriptor that specifies how an individual analytics event will be associated with the appropriate client session to the UsparklogsAnalytics methods.
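A sketch of the client/server flow (getting the descriptor to the server is your own replication/RPC code, and the event name and the exact shape of the Add* overload taking a descriptor are illustrative assumptions):

```cpp
#include "sparklogs.h"

// Client side: obtain the descriptor for the active session and send it
// to the server through your own replication/RPC mechanism.
FSparkLogsAnalyticsSessionDescriptor GetDescriptorForServer()
{
    return UsparklogsAnalytics::GetSessionDescriptor();
}

// Server side: record an event attributed to that client's session.
// Sketch only: the exact overload signature is in the plugin source.
void RecordServerSideDesignEvent(const FSparkLogsAnalyticsSessionDescriptor& ClientSession)
{
    UsparklogsAnalytics::AddDesign(ClientSession, TEXT("matchmaking:queue_joined"));
}
```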

Common Fields for Logs and Analytics

The plugin adds standard fields to all logs and analytics events:

| Field Name | Type | Description |
| --- | --- | --- |
| timestamp | Timestamp | The timestamp when the event was generated. |
| app | String | The value of FApp::GetProjectName(). |
| hostname | String | The hostname of the machine. |
| pid | Numeric | The operating-system-specific process ID of the game. |
| game_instance_id | String | A random ID uniquely identifying this run of the game engine. (Changes every time the game engine starts.) Search on this field to easily find all logs generated by the same game engine process from start to finish. |
| client_ip_location | String | Added automatically by the SparkLogs Cloud. The two-letter ISO country code of the IP address of the machine running the game, if available. |

You can also include additional attribute values in all events by manually starting the shipping engine: call FsparklogsModule::GetModule().StartShippingEngine(...) and pass AdditionalAttributes.

Additionally, for logs events, a severity field is added based on the severity associated with a given Unreal Engine log message.