1,165 changes: 631 additions & 534 deletions composer.lock

Large diffs are not rendered by default.

10 changes: 5 additions & 5 deletions content/collections/analytics/en/microscope.md
Original file line number Diff line number Diff line change
@@ -37,9 +37,9 @@ From here, depending on the chart type, you can:
* **Watch Session Replays** of those user sessions.
* **Exclude** or remove distracting or irrelevant series from the analysis.
* **Create a cohort** of the users that make up the selected data point, which you can then further analyze by applying this [cohort](/docs/analytics/behavioral-cohorts) to other charts in Amplitude. When you apply a group in the Segmentation Module, you can also create a group cohort from here.
- * **View a list of all the users** in the selected data point. Click a user ID to open that user's profile in the *User Activity* tab. If you are a customer with [account-level reporting](/docs/analytics/account-level-reporting), you can also use Microscope to view the groups in a data point. Click any group to see a list of users in that group, in the *User Activity* tab.
+ * **View a list of all the users** in the selected data point. Click a user ID to open that user's profile in the *User Activity* tab. If you are a customer with [account-level reporting](/docs/analytics/account-level-reporting), you can also use Microscope to view the groups in a data point. Click any group to view a list of users in that group, in the *User Activity* tab.
* **Download all the users** (up to 1 million) that make up the selected data point, in the form of a .CSV file. This file also contains each user's most-recently sent user property values.
- * **Add users to...** a [Feature Flag](/docs/feature-experiment/workflow/feature-flag-rollouts), [Feature Experiment](/docs/feature-experiment), [Web Experiment](/docs/web-experiment), [Guide, or Survey](/docs/guides-and-surveys).
+ * **Add users to...** a [Feature Experiment](/docs/feature-experiment/overview), [Feature Flag](/docs/feature-experiment/workflow/feature-flag-rollouts), [Web Experiment](/docs/web-experiment/set-up-a-web-experiment), [Guide, or Survey](/docs/guides-and-surveys).

{{partial:admonition type="note" heading=""}}
If you are conducting [account-level reporting](/docs/analytics/account-level-reporting) analysis, you can opt to download the groups included in a certain data point or bucket. The .CSV file includes the following four columns:
@@ -79,15 +79,15 @@ Microscope actions such as `View user streams` and `Create cohort` are not suppo

## View user streams

- When using Microscope in an [Event Segmentation](/docs/analytics/charts/event-segmentation/event-segmentation-build) chart, you can see individual user streams in aggregate by selecting *View User Streams*. All a user's events within the date range of the data point are visible here, as well as:
+ When using Microscope in an [Event Segmentation](/docs/analytics/charts/event-segmentation/event-segmentation-build) chart, you can view individual user streams in aggregate by selecting *View User Streams*. All a user's events within the date range of the data point are visible here, as well as:

* Up to 25 events **prior to** the beginning of the time range.
* Up to 50 events **after** the start of the time range.

If you have a specific event selected, it's highlighted in the user's stream. You can also choose to show certain event properties. Click a user ID or any event in a user's stream to view their profile in the *[User Activity](/docs/analytics/user-data-lookup)* tab.

{{partial:admonition type="note" heading=""}}
- Event names with a *sparkle* icon indicate that Amplitude has generated a name to provide more context around the action a user is taking. These are Autocapture events ingested as `Page Viewed`, `Element Clicked`, and `Element Changed`, but Amplitude uses property information to make them more valuable in the event stream. Click any of them to see their ingested name and properties.
+ Event names with a *sparkle* icon indicate that Amplitude has generated a name to provide more context around the action a user is taking. These are Autocapture events ingested as `Page Viewed`, `Element Clicked`, and `Element Changed`, but Amplitude uses property information to make them more valuable in the event stream. Click any of them to understand their ingested name and properties.
{{/partial:admonition}}

### View Session Replay from a user's event stream
@@ -100,7 +100,7 @@ While using Microscope in a supported chart, click on *View User Streams*. Check

In a funnel chart, click into any step after the initial event to enable the **Explore Conversion Drivers** feature. This allows you to explore events triggered **between** funnel steps for converted and dropped-off users.

- For more information, see [Amplitude's conversion drivers feature](/docs/analytics/charts/funnel-analysis/funnel-analysis-identify-conversion-drivers).
+ For more information, go to [Amplitude's conversion drivers feature](/docs/analytics/charts/funnel-analysis/funnel-analysis-identify-conversion-drivers).

## Create a guide or survey from Microscope

@@ -240,6 +240,6 @@ The Purchase by item hub is a good starting point for ecommerce analysis. To exp

## Create a web experiment for specific URLs

- On the Page Engagement tab of Marketing Analytics, when you breakdown your data by Page URL, you can create a [web experiment](/docs/web-experiment) from the table.
+ On the Page Engagement tab of Marketing Analytics, when you break down your data by Page URL, you can create a [web experiment](/docs/web-experiment/set-up-a-web-experiment) from the table.

Click the flask icon in the Action column of the table, and the New Web Experiment dialog appears pre-populated with the targeted page URL.
6 changes: 3 additions & 3 deletions content/collections/analytics/en/user-data-lookup.md
@@ -40,7 +40,7 @@ The user history panel has 6 tabs, Activity, Insights, Session Replays, Cohorts,

The event stream displays a list of all the events the user performed in your application. Filter the list by event type or a specific device ID. Enable *Live event updates* for a real-time feed of the user's activity in your application.

- Amplitude groups the event stream by **session**, and orders it in **reverse chronological order**, placing the session property with the most recent activity at the top of the list. Blue events in a session are all connected by a line; green, out-of-session events stand alone. Customize the events you want to see in the event stream by choosing to show all events, highlight specific events, or only show specific events. You can also filter on a particular device ID.
+ Amplitude groups the event stream by **session**, and orders it in **reverse chronological order**, placing the session property with the most recent activity at the top of the list. Blue events in a session are all connected by a line; green, out-of-session events stand alone. Customize the events you want to view in the event stream by choosing to show all events, highlight specific events, or only show specific events. You can also filter on a particular device ID.

There are two available views: the **Info** view, which gives a digestible view of the event data, and the **Raw** view, which displays the raw JSON file Amplitude received, along with any user properties that persisted from previous events or Identify requests. This is useful for debugging the data your product sends to Amplitude.

@@ -80,7 +80,7 @@ If the difference between `server_received_time` and `client_upload_time` is les

This occurs automatically for projects with a [project ID](/docs/admin/account-management/manage-orgs-projects#view-and-edit-your-project-information) of 243704 or higher. To apply this 60-second cutoff time to an older project, contact Amplitude Support.

- [See this blog post for more detail](https://amplitude.com/blog/dont-trust-client-data).
+ [Go to this blog post for more detail](https://amplitude.com/blog/dont-trust-client-data).
{{/partial:admonition}}

Daily exported files use `server_upload_time` and all dashboards use `event_time`. Queries on raw data should use `event_time`.
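As a minimal sketch of why this matters (the rows below are illustrative sample data shaped like a daily export, not real output), bucketing raw events by `event_time` assigns a late-uploaded event to the day it actually occurred:

```javascript
// Illustrative rows shaped like Amplitude's daily export (values are made up).
const exportedRows = [
  { event_type: 'Sign Up',  event_time: '2024-06-01 23:59:58', server_upload_time: '2024-06-02 00:00:05' },
  { event_type: 'Purchase', event_time: '2024-06-02 00:00:10', server_upload_time: '2024-06-02 00:00:12' },
];

// Bucket by the calendar day of event_time, not server_upload_time.
const day = (timestamp) => timestamp.slice(0, 10);
const eventsOnJune1 = exportedRows.filter((row) => day(row.event_time) === '2024-06-01');

// The Sign Up event counts toward June 1 even though it was uploaded on June 2.
console.log(eventsOnJune1.map((row) => row.event_type)); // logs ['Sign Up']
```

Querying on `server_upload_time` instead would drop the first row from June 1 and inflate June 2.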
@@ -103,7 +103,7 @@ The *Cohorts* tab enables you to check if the user is in any of your project's c

### Experiments

- The Experiments tab shows any [feature experiments](/docs/feature-experiment) or [web experiments](/docs/web-experiment) of which the user is a part.
+ The Experiments tab shows any [feature or web experiments](/docs/experiment-home) of which the user is a part.

### Flags

29 changes: 29 additions & 0 deletions content/collections/browser_sdk/en/browser-sdk-2.md
@@ -219,6 +219,35 @@ With the default logger, extra function context information is output to the dev
- `time`: Start and end timestamp of the function invocation.
- `states`: Useful internal states snapshot before and after the function invocation.

## Performance

The Browser SDK 2 minimizes its impact on page performance through event batching, asynchronous processing, and an optimized bundle size.

### Bundle size

The Browser SDK 2 bundle size varies based on the installation method and features you use.

{{partial:bundle-size :package_name="package_name"}}

For the most up-to-date bundle size information, check the [npm package page](https://www.npmjs.com/package/@amplitude/analytics-browser) or [BundlePhobia](https://bundlephobia.com/package/@amplitude/analytics-browser).

### Runtime performance

The Browser SDK 2 runs asynchronously and doesn't block the main thread during event tracking. Performance characteristics include:

- **Event tracking**: Event tracking operations are non-blocking and typically complete in less than 1ms for each event.
- **Network requests**: Events are batched and sent asynchronously, minimizing network overhead. The default configuration batches up to 30 events or sends every 1 second, whichever comes first.
- **Memory usage**: The SDK maintains a small in-memory queue for event batching. Memory usage scales with the number of queued events (default: up to 30 events).
- **CPU impact**: Event processing and batching operations have minimal CPU impact, typically less than 1% of CPU time during normal operation.

### Optimization tips

To further optimize performance:

- Adjust `flushQueueSize` and `flushIntervalMillis` to balance between network efficiency and memory usage.
- Use the `offline` mode to defer event uploads when network conditions are poor.
- Enable `useBatch` mode for high-volume event tracking to reduce the number of HTTP requests.
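A configuration sketch showing how these options fit together, assuming the standard `@amplitude/analytics-browser` initialization signature (the values are illustrative; tune them to your own traffic):

```javascript
import * as amplitude from '@amplitude/analytics-browser';

amplitude.init('YOUR_API_KEY', {
  // Larger batches mean fewer HTTP requests but more queued events in memory.
  flushQueueSize: 50,        // default: 30 events
  flushIntervalMillis: 2000, // default: 1000ms; flush on whichever limit is hit first
  // Route events through the batch endpoint for high-volume tracking.
  useBatch: true,
});
```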

## Autocapture <a id="tracking-default-events"></a>

Starting in SDK version 2.10.0, the Browser SDK can autocapture events when you enable it, and adds a configuration to control the collection of autocaptured events. Browser SDK can autocapture the following event types:
2 changes: 1 addition & 1 deletion content/collections/data/en/amplitude-shopify-plugin.md
@@ -14,7 +14,7 @@ The [Amplitude Shopify Plugin](https://apps.shopify.com/amplitude) enables you t

## Overview

- The Shopify plugin installs a version of the [Amplitude Browser SDK](/docs/sdks/analytics/browser/browser-sdk-2) and adds the script before the `</head>` tag of your site's pages. The script includes [Session Replay](/docs/session-replay) and [Web Experiment](/docs/web-experiment).
+ The Shopify plugin installs a version of the [Amplitude Browser SDK](/docs/sdks/analytics/browser/browser-sdk-2) and adds the script before the `</head>` tag of your site's pages. The script includes [Session Replay](/docs/session-replay) and [Web Experiment](/docs/web-experiment/set-up-a-web-experiment).

{{partial:admonition type="warning" heading="Shopify and flickering"}}
The method Shopify uses to load Amplitude's Shopify app causes flickering. To avoid this, add the [asynchronous web script with the anti-flicker snippet](/docs/web-experiment/implementation#async-script-with-anti-flicker-snippet) to your `theme.liquid` file.
14 changes: 7 additions & 7 deletions content/collections/guides_and_surveys/en/experiments.md
@@ -10,7 +10,7 @@ landing: false
Knowing what your users respond to best is tricky. To help with this challenge, Guides and Surveys works with Amplitude Experiment. When you install the [Guides and Surveys SDK](/docs/guides-and-surveys/sdk), you get everything you need to run experiments on your Guides and Surveys.

{{partial:admonition type="note" heading="Manager or Administrator role required"}}
- Running an experiment on your guide or survey requires the Manager role at a minimum. For more information about how roles impact who can use Guides and Surveys, see [Getting Started | Roles and Permissions](/docs/guides-and-surveys/get-started#roles-and-permissions).
+ Running an experiment on your guide or survey requires the Manager role at a minimum. For more information about how roles impact who can use Guides and Surveys, go to [Getting Started | Roles and Permissions](/docs/guides-and-surveys/get-started#roles-and-permissions).
{{/partial:admonition}}

## Run an experiment
@@ -38,11 +38,11 @@ Choose a [Multi-armed Bandit](/docs/feature-experiment/workflow/multi-armed-band
After you select an experiment type, Guides and Surveys adds two variants with autogenerated keys. To rename the variant, select it and click *More options*. Update the name, duplicate, or delete the variant here.

{{partial:admonition type="warning" heading="Complete experiment setup"}}
- Adding variants is only the first part of experimentation in Guides and Surveys. To ensure users see variants as they should:
+ Adding variants is only the first part of experimentation in Guides and Surveys. To ensure users experience variants as they should:

- 1. Make sure the experiment is running. Define a goal, review targeting, and click *Start Experiment*. For more information, see [Manage the experiment](#manage-the-experiment).
- 2. If an specific user doesn't see a variant, ensure they're part of the experiments target audience.
- 3. If a user sees one variant, they should continue to see that variant. Navigate to *Users > User Profiles*. Search for the user and open their profile. Look at the *Guide* and *Survey* tabs to view which experiences they've seen.
+ 1. Make sure the experiment is running. Define a goal, review targeting, and click *Start Experiment*. For more information, go to [Manage the experiment](#manage-the-experiment).
+ 2. If a specific user doesn't experience a variant, ensure they're part of the experiment's target audience.
+ 3. If a user sees one variant, they should continue to receive that variant. Navigate to *Users > User Profiles*. Search for the user and open their profile. Look at the *Guide* and *Survey* tabs to view which experiences they've seen.
{{/partial:admonition}}

### Manage the experiment
@@ -53,7 +53,7 @@ Click *Manage Experiment* to open the experiment editor in a new tab. The experi
Variant names stay in sync between your guide or survey and the experiment when you save the guide or survey.
{{/partial:admonition}}

- For more information about working with experiments, see [Feature Experiment](/docs/feature-experiment)
+ For more information about working with experiments, go to [Feature Experiment](/docs/feature-experiment/overview).

{{partial:admonition type="tip" heading="Exposures and assignments"}}
Exposure events in Guides and Surveys experiments work similarly to a standard experiment, but there are cases that can cause an uneven split between control and variant exposures. The way in which you set targets and limits impacts the frequency with which treatment exposures occur.
@@ -91,7 +91,7 @@ Track guide and survey engagement trends over predefined time periods.
* Monthly
* Quarterly

- With these presets, see when users are most likely to engage with the guide or survey and if engagement changes after, for example, a new product release.
+ With these presets, find when users are most likely to engage with the guide or survey and if engagement changes after, for example, a new product release.

#### Date range selection

32 changes: 32 additions & 0 deletions content/collections/session-replay/en/session-replay-plugin.md
@@ -30,6 +30,38 @@ Amplitude built Session Replay to minimize impact on the performance of web page
- Optimizing DOM processing.
{{/partial:admonition}}

## Performance

Session Replay minimizes its impact on page performance through asynchronous processing, efficient compression, and an optimized bundle size.

### Bundle size

The Session Replay plugin adds to your application's bundle size.

{{partial:bundle-size :package_name="package_name"}}

For the most up-to-date bundle size information, check the [npm package page](https://www.npmjs.com/package/@amplitude/plugin-session-replay-browser) or [BundlePhobia](https://bundlephobia.com/package/@amplitude/plugin-session-replay-browser).

### Runtime performance

Session Replay runs asynchronously and processes replay data in the background to avoid blocking the main thread. Performance characteristics include:

- **DOM capture**: DOM snapshot capture typically adds less than 5ms of processing time for each page interaction. Initial page load snapshot capture may take 10-50ms depending on page complexity.
- **Memory usage**: Session Replay stores replay events in memory or IndexedDB (configurable using `storeType`). Memory usage scales with session length and page complexity, typically ranging from 1-10 MB for each active session.
- **CPU impact**: With default settings, Session Replay uses less than 2% of CPU time during normal operation. Compression operations are deferred to browser idle periods when `performanceConfig.enabled` is `true` (default).
- **Network bandwidth**: Replay data is compressed before upload, typically reducing payload size by 60-80%. Network requests are batched and sent asynchronously.

### Performance optimization

To optimize Session Replay performance:

- Enable `useWebWorker` to move compression off the main thread, reducing CPU impact on the main thread.
- Configure `performanceConfig.timeout` to control when deferred compression occurs.
- Use `sampleRate` to reduce the number of sessions captured, which directly reduces CPU and memory usage.
- Set `storeType` to `memory` if you don't need persistence across page reloads, reducing IndexedDB overhead.
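A configuration sketch combining these options, assuming the `@amplitude/plugin-session-replay-browser` plugin API (option names such as `storeType` and the `performanceConfig` fields should be checked against your SDK version):

```javascript
import * as amplitude from '@amplitude/analytics-browser';
import { sessionReplayPlugin } from '@amplitude/plugin-session-replay-browser';

amplitude.add(
  sessionReplayPlugin({
    sampleRate: 0.1,     // capture roughly 10% of sessions to cut CPU and memory cost
    storeType: 'memory', // skip IndexedDB if persistence across reloads isn't needed
    performanceConfig: {
      enabled: true,     // defer compression to browser idle periods (the default)
      timeout: 2000,     // compress after 2s anyway if the browser stays busy
    },
  })
);
amplitude.init('YOUR_API_KEY');
```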

For detailed performance testing results, see the [Session Replay performance testing blog post](https://amplitude.com/blog/session-replay-performance-testing).

Session Replay captures changes to a page's Document Object Model (DOM), including elements in the shadow DOM, then replays these changes to build a video-like replay. For example, at the start of a session, Session Replay captures a full snapshot of the page's DOM. As the user interacts with the page, Session Replay captures each change to the DOM as a diff. When you watch the replay of a session, Session Replay applies each diff back to the original DOM in sequential order, to construct the replay. Session replays have no maximum length.
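The snapshot-plus-diff approach can be illustrated with a toy model. This is a conceptual sketch only, not Amplitude's implementation: real replays diff DOM nodes, while this example diffs a flat object standing in for page state.

```javascript
// Full snapshot captured once at session start.
const snapshot = { title: 'Home', cartCount: 0 };

// Each subsequent change is recorded as a small diff, not a new snapshot.
const diffs = [
  { cartCount: 1 },      // user adds an item
  { title: 'Checkout' }, // user navigates
  { cartCount: 2 },      // user adds another item
];

// Replay: apply each diff to the snapshot in sequential order
// to reconstruct the page state at the end of the session.
const replayed = diffs.reduce((state, diff) => ({ ...state, ...diff }), snapshot);

console.log(replayed); // logs { title: 'Checkout', cartCount: 2 }
```

Storing diffs instead of repeated snapshots is what keeps the captured payload small enough to compress and upload in the background.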

## Before you begin