---
title: 'User impact: Analyze frustration signals and performance impact'
tags:
  - Browser
  - Browser monitoring
  - Additional standard features
  - User experience
  - Session replay
metaDescription: "Analyze user frustration signals, performance impact, and session replays to understand how technical issues affect your users' experience with New Relic browser monitoring."
freshnessValidatedDate: never
---

The <DNT>**User impact**</DNT> page in <InlinePopover type="browser"/> helps you understand how technical performance issues affect your users' actual experience. This page bridges quantitative performance metrics with qualitative user behavior analysis, providing insights into user frustration patterns and their correlation with technical problems.

The User impact page consists of two main analysis areas: <DNT>**Frustration metrics**</DNT> and <DNT>**Performance impact**</DNT>. Each provides a different lens for understanding user experience issues and their technical root causes.

## Access User impact analysis [#access]

To view the User impact page:

1. Go to <DNT>**[one.newrelic.com > All capabilities](https://one.newrelic.com/all-capabilities) > Browser > (select an app)**</DNT>.
2. In the left navigation under <DNT>**More views**</DNT>, click <DNT>**User impact**</DNT>.
3. Select either the <DNT>**Frustration metrics**</DNT> or <DNT>**Performance impact**</DNT> tab.

You can also navigate here directly from the [Page views](/docs/browser/new-relic-browser/browser-pro-features/page-views-examine-page-performance) page by clicking the <DNT>**Analyze frustration signals**</DNT> or <DNT>**Analyze performance impact**</DNT> link.

## Analyze frustration metrics [#frustration-metrics]

The Frustration metrics tab focuses on user behavioral indicators that signal confusion, annoyance, or difficulty completing tasks. This analysis helps you identify where users struggle most and prioritize fixes based on actual user pain points.

### Rage click analysis [#rage-clicks]

Rage clicks occur when users rapidly click the same element multiple times, typically indicating an unresponsive interface or broken functionality.
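
To make the signal concrete, here's a minimal sketch of a rage click heuristic: repeated clicks on the same element within a short time window. The specific thresholds (three clicks within one second) and the function itself are illustrative assumptions, not New Relic's actual detection logic.

```javascript
// Illustrative rage click heuristic: N or more clicks on the same element
// within a short time window. The thresholds (3 clicks within 1000 ms) are
// assumptions for illustration, not New Relic's detection parameters.
function isRageClick(clickTimestampsMs, minClicks = 3, windowMs = 1000) {
  const sorted = [...clickTimestampsMs].sort((a, b) => a - b);
  // Slide a window across the sorted timestamps and look for a dense burst.
  for (let start = 0; start + minClicks <= sorted.length; start++) {
    if (sorted[start + minClicks - 1] - sorted[start] <= windowMs) {
      return true;
    }
  }
  return false;
}
```

For example, three clicks at 0 ms, 200 ms, and 450 ms would qualify as a burst, while three clicks spread over several seconds would not.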

**Rage click trend chart:**

* Shows overall frustration patterns over time with deployment markers for correlation
* Helps identify if recent deployments increased user frustration
* Displays the overall frustration trend across your application

**Top affected pages table:**

* Lists browser interactions (SPA) or browser transactions (standard) with the highest rage click activity
* Shows total rage clicks, associated errors, and available session replay counts for each page
* Click session replay counts to view filtered sessions with frustration signals
* Includes a "View all" option to explore the complete list with search capabilities

**Rage click events chart:**

* Displays a stacked bar chart showing rage click trends for specific pages
* Use the dropdown to filter by the top affected pages
* Analyze patterns for individual browser interactions or transactions

### Top affected page elements [#affected-elements]

This section provides granular analysis of which specific UI elements cause the most user frustration:

**Metadata section:**
* Shows the percentage of rage clicks among total user interactions
* Provides context for how widespread frustration signals are relative to normal usage

**Elements table:**
* **Target tag and target class**: Identifies the specific HTML elements users rage-click most frequently
* **Rage click rate**: Shows the percentage of sessions that included rage clicks on each element
* **Most affected URLs**: Displays which pages contain the problematic elements
* **Session replays**: Count of available replays for analysis (click to view filtered sessions)
* **Recommended sessions**: Direct access to sessions where rage clicks correlate with JavaScript errors

<Callout variant="tip">
  Recommended sessions prioritize replays where rage clicks immediately follow JavaScript errors, providing the strongest correlation between technical issues and user frustration.
</Callout>

## Analyze performance impact [#performance-impact]

The Performance impact tab examines how technical performance problems affect user sessions, focusing on measurable performance metrics and their correlation with user experience issues.

### Web vitals monitoring [#web-vitals]

Core Web Vitals provide standardized metrics for user experience quality:

* **Largest Contentful Paint (LCP)**: Loading performance measurement
* **Interaction to Next Paint (INP)**: Responsiveness to user interactions
* **Cumulative Layout Shift (CLS)**: Visual stability during page load

Each metric displays current values with status indicators (good, needs improvement, poor) based on Google's recommended thresholds.
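
For reference, Google's published thresholds can be expressed as a small rating helper. The threshold values below are Google's documented boundaries; the helper function itself is an illustrative sketch, not part of the New Relic UI.

```javascript
// Google's published Core Web Vitals thresholds. Values at or below the
// "good" boundary rate as good; values above the "poor" boundary rate as
// poor; anything in between is "needs improvement".
const WEB_VITAL_THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  INP: { good: 200, poor: 500 },   // milliseconds
  CLS: { good: 0.1, poor: 0.25 },  // unitless layout shift score
};

// Illustrative helper (not a New Relic API): classify a measured value.
function rateWebVital(metric, value) {
  const { good, poor } = WEB_VITAL_THRESHOLDS[metric];
  if (value <= good) return "good";
  if (value <= poor) return "needs improvement";
  return "poor";
}
```

So an LCP of 2,000 ms rates as good, an INP of 350 ms needs improvement, and a CLS of 0.3 rates as poor.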

### Error analysis [#error-analysis]

**Error rate trend:**
* Shows JavaScript error patterns over time with deployment markers
* Helps correlate error spikes with specific deployments or changes
* Displays overall error trends across your application

**Error rate distribution:**
* Provides a breakdown of the most frequently occurring error types
* Shows error counts faceted by error messages
* Click specific error types to navigate to detailed error analysis in Error Inbox
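
Conceptually, this faceting amounts to grouping raw error events by message and counting each group. The sketch below is illustrative only; the event shape is an assumption for the example, not New Relic's actual data model.

```javascript
// Illustrative only: group error events by message and count occurrences,
// similar in spirit to how the error rate distribution facets errors.
// The { message } event shape is an assumption for this example.
function countErrorsByMessage(errorEvents) {
  const counts = new Map();
  for (const event of errorEvents) {
    counts.set(event.message, (counts.get(event.message) ?? 0) + 1);
  }
  // Sort descending so the most frequent error types come first.
  return [...counts.entries()].sort((a, b) => b[1] - a[1]);
}
```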

### Top affected pages analysis [#affected-pages-performance]

This table identifies which pages experience the most performance-related user impact:

* **Page URL**: Browser interactions (SPA) or browser transactions (standard) with performance issues
* **Error class and error message**: Specific types of errors occurring on each page
* **Error rate**: Percentage of sessions affected by errors on each page
* **Session replays**: Available replay counts for affected sessions
* **Recommended sessions**: Direct access to sessions with the highest correlation between errors and user impact

Click session replay counts to view sessions filtered by error class and error message, maintaining analysis context.

### Geography analysis [#geography]

Understanding performance impact by geographic location helps identify regional infrastructure or connectivity issues:

**Performance map:**
* Visual representation of error rates or error counts by geographic region
* Toggle between error rate (percentage) and error count (absolute numbers) views
* Zoom functionality to focus on specific regions

**Geography table:**
* **Location data**: Country, region, or city-level performance breakdown
* **Error metrics**: Error rates and counts for each geographic area
* **Session replays**: Available replay counts for geographic segments
* **Favoriting**: Mark frequently monitored regions for quick access
* **Filtering**: Click rows to apply geographic filters and zoom the performance map

**Key attributes breakdown:**
* **Country code and city**: Geographic segmentation options
* **Browser and browser version**: Technology stack correlation with performance issues
* **Device type**: Performance differences across device categories
* **Page URL**: Most problematic pages in each geographic region

<Callout variant="important">
  Geography analysis requires sufficient data volume to provide meaningful insights. Low-traffic applications may see limited geographic granularity.
</Callout>

## Navigation and filtering [#navigation-filtering]

### Cross-page workflows [#cross-page]

The User impact page integrates seamlessly with other browser monitoring features:

* **From Page views**: Navigation preserves time ranges and applied filters
* **To Session Replay**: Session replay counts maintain context with relevant filters applied
* **To Error Inbox**: Error distribution links carry error type filters
* **To Browser pages**: Key attributes link to detailed browser analysis

### Filter management [#filter-management]

* **Persistent filters**: Applied filters remain active during page reloads within the same analysis type
* **Context switching**: Filters reset when switching between Frustration metrics and Performance impact tabs
* **Dynamic filtering**: Supports device type, page URL, user agent, and geographic filters
* **Time range synchronization**: Time selections apply consistently across all charts and tables

### Session Replay integration [#session-replay-integration]

Session replay counts throughout the User impact page provide direct access to user sessions:

* **Contextual filtering**: Replays open with relevant filters (rage click events, error conditions, page context)
* **Recommended prioritization**: "Recommended sessions" highlight replays with the strongest correlation between technical issues and user frustration
* **Additional context**: Session replay tables include frustration signal counts for each session

<Callout variant="caution">
  Session replay availability depends on your [sampling configuration](/docs/browser/browser-monitoring/browser-pro-features/session-replay#configure-sampling). Consider increasing sampling rates if you consistently see low replay counts for high-impact issues.
</Callout>