Forum Replies Created

  • Sure, let me break this down for you!

    Now, blocking or not having cookies doesn’t stop tracking from occurring. It only strips the tracking of its context, such as the session and user scope. In your case the additional tracking isn’t even required, since you’re already logged into Stack Overflow.

    As for OneTrust, I don’t think it overrides the cookie-setting method. Rather, it regularly checks, or ‘polls’, cookies and deletes those it considers unnecessary or irrelevant. But a tracking library often sets a cookie and uses it instantly, which makes it extremely hard for any tool to intercept in between. That cookie serves more as a throwaway placeholder with a random value, there to carry the context into the next call.

    In a perfect world, stopping analytics from firing when consent is not granted would be done through blockers in Tag Management Systems (TMS). This would stop tracking logic from even being triggered if consent isn’t given. But it seems like Stack Overflow doesn’t pay much attention to these subtleties!
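
    For what it’s worth, when you do control the tags, Google’s Consent Mode (or an equivalent consent trigger in your TMS) is the cleaner route: default storage to ‘denied’ and only lift it once the CMP reports an opt-in. A minimal sketch, assuming gtag.js; the callback name is hypothetical and would be wired to OneTrust’s own consent events:

    // Standard gtag bootstrap (as on any page that loads gtag.js).
    window.dataLayer = window.dataLayer || [];
    function gtag(){dataLayer.push(arguments);}

    // Default all storage to denied before any gtag('config', ...) runs,
    // so tags never set analytics cookies without consent.
    gtag('consent', 'default', {
      'analytics_storage': 'denied',
      'ad_storage': 'denied'
    });

    // Hypothetical callback wired to the CMP: once the visitor opts in,
    // lift the restriction for subsequent hits.
    function onAnalyticsConsentGranted() {
      gtag('consent', 'update', {
        'analytics_storage': 'granted'
      });
    }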

  • It sounds like you’re using GA4, not Universal Analytics. Here, you need to register the event_label and event_category parameters as custom dimensions in your GA4 property. Please note that it generally takes about 48 hours for the change to take effect; until then, you might see the value reported as “(not set)”. Importantly, once you’ve created a custom dimension, its associated event parameter cannot be changed. After this setup, you should be able to view the relevant data in your reports. Remember, though, that this will not backfill historical data: a custom dimension starts collecting data from the moment it is created, not retrospectively.
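
    To make that concrete, here’s a hedged sketch: collection doesn’t change, you keep sending event_category and event_label as ordinary event parameters, and the registration happens in the GA4 admin UI (the event name below is just an example):

    // Nothing changes on the collection side; keep sending the parameters.
    gtag('event', 'video_play', {   // example event name
      'event_category': 'Videos',
      'event_label': 'Homepage Hero'
    });

    // Then, in GA4: Admin → Custom definitions → Create custom dimension,
    // scope "Event", one dimension per parameter ('event_category', 'event_label').
    // Values collected before registration stay "(not set)"; data is not backfilled.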

  • Sure! When moving from Universal Analytics to GA4, the events translate in a consistent way: the Universal Analytics ‘action’ (ea) maps to the ‘event_name’ in GA4, while ‘event_category’ (ec), ‘event_label’ (el) and ‘event_value’ (ev) come across as event parameters with the same names, carrying their respective values.

    For instance, for an event whose Universal Analytics action was ‘Listen’, you’d structure the GA4 call like:

    gtag('event', 'Listen', {
      'event_category': 'Stream',
      'event_label': 'The Show',
      'event_value': 10
    });

    Here’s a handy Google help center article to understand this better.

  • To track unique logins for your event, you will need to use dimensions rather than metrics in the Google Analytics API; specifically, a user-scoped custom dimension. Metrics quantify measurements (like total events), while dimensions provide context for them (like which users triggered those events). With a user-scoped custom dimension, you can assign a unique identifier (such as an internal user id) to users when they log in, and then include that custom dimension in your API request to distinguish users. Just make sure you aren’t violating the Google Analytics Terms of Service regarding Personally Identifiable Information (PII).
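
    A rough sketch of the collection side, assuming gtag.js (the property name and value are placeholders and must not contain PII):

    // At login, attach an internal identifier as a user property.
    gtag('set', 'user_properties', {
      'internal_user_id': 'u_48151623'   // placeholder value
    });

    // After registering 'internal_user_id' as a user-scoped custom dimension in
    // GA4 (Admin → Custom definitions), it becomes available to the Data API as
    // the dimension 'customUser:internal_user_id', which you can add to the
    // dimensions list of a runReport request alongside your event metrics.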

  • It seems like your issue is down to the syntax in your code. The Liquid variables that output strings should be enclosed in quotes, and it’s worth wrapping optional name-value pairs in conditionals so they’re dropped when they have no value. Also note the ‘ecommerce: null’ push before you push your ‘view_item’ event, which clears the previous ecommerce object. A sketch of how your code could be adjusted along these lines is below.
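
    Here is a minimal sketch of that pattern, assuming a Shopify-style Liquid template; the product fields are hypothetical stand-ins for your own variables:

    <script>
      window.dataLayer = window.dataLayer || [];
      // Clear the previous ecommerce object before pushing the new event.
      dataLayer.push({ ecommerce: null });
      dataLayer.push({
        event: 'view_item',
        ecommerce: {
          items: [{
            item_id: '{{ product.id }}',        // Liquid outputs a string, so quote it
            item_name: '{{ product.title }}',
            price: {{ product.price | divided_by: 100.0 }},  // numeric, so no quotes
            {% if product.vendor != blank %}
            item_brand: '{{ product.vendor }}', // only emitted when a value exists
            {% endif %}
            quantity: 1
          }]
        }
      });
    </script>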

  • It appears you’ve encountered a fairly unusual issue with the Google Analytics Reporting API. Given that you’ve thoroughly checked your request and dates, it’s quite strange for userEngagementDuration to produce inconsistent results, especially when the inconsistency is isolated to its pairing with the medium dimension. And as you mentioned, there is no sampling issue with BatchRunReportRequest to attribute this to.

    There are a few things you could explore:

    1. Dimensions and metrics can sometimes behave unexpectedly when combined. It may help to verify everything on the Google Developers dimensions & metrics explorer page and check for known quirks or compatibility issues when userEngagementDuration is paired with the medium dimension; a minimal isolation request is sketched after this list.

    2. Cross-check to confirm any filters involved are working as intended without inadvertent exclusions or inclusions which can affect results.

    3. It is also worth checking your API version as Google occasionally updates these and changes may not always be backward compatible.

    You should also consider reaching out to Google developer support, as they may be able to provide more specific guidance, or this may be a bug that they need to look into.
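
    If it helps to isolate the pair, here’s a sketch of a minimal Data API request using the Node client (the property ID and dates are placeholders). If this stripped-down request already disagrees with the GA4 UI for the same date range, the issue sits with the pairing itself rather than with filters or other dimensions in the larger batch request:

    const { BetaAnalyticsDataClient } = require('@google-analytics/data');
    const client = new BetaAnalyticsDataClient();

    async function isolatePair() {
      // Single dimension, single metric: nothing else can interfere.
      const [response] = await client.runReport({
        property: 'properties/123456789',
        dateRanges: [{ startDate: '2023-04-01', endDate: '2023-04-30' }],
        dimensions: [{ name: 'medium' }],
        metrics: [{ name: 'userEngagementDuration' }],
      });
      response.rows.forEach(row =>
        console.log(row.dimensionValues[0].value, row.metricValues[0].value));
    }

    isolatePair().catch(console.error);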

  • While the query seems to be working in general, the higher engagement rate suggests that engaged sessions are being overcounted. This could happen for a number of reasons: duplicate session rows in the raw table, incorrect filtering in the query, or a miscalculation of what counts as an engaged session. An incorrect schema assumption or data-manipulation step can also skew the results, and if the engagement definition or parameters differ between your BigQuery query and GA4, that alone could account for the discrepancy. Without more information it’s hard to say for certain why the results are off; checking those points is the place to start, and if that doesn’t resolve it, it may be worth having a BigQuery specialist or data analyst dig deeper into the data.
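
    For a sanity check, a common way to compute engagement rate from the GA4 BigQuery export is to dedupe sessions on user_pseudo_id plus ga_session_id and flag a session as engaged when the session_engaged parameter is ‘1’. A sketch along those lines, run through the Node BigQuery client (the project, dataset and date range are placeholders); if your query departs from this shape, that difference is a likely source of the gap:

    const { BigQuery } = require('@google-cloud/bigquery');
    const bigquery = new BigQuery();

    // Engagement rate = engaged sessions / total sessions, with sessions deduped
    // on user_pseudo_id + ga_session_id (duplicates here inflate the rate).
    const query = `
      WITH sessions AS (
        SELECT
          CONCAT(user_pseudo_id, '.', CAST(
            (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS STRING)) AS session_id,
          MAX((SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'session_engaged')) AS session_engaged
        FROM \`my-project.analytics_123456789.events_*\`
        WHERE _TABLE_SUFFIX BETWEEN '20230401' AND '20230430'
        GROUP BY session_id
      )
      SELECT
        COUNT(*) AS total_sessions,
        COUNTIF(session_engaged = '1') AS engaged_sessions,
        SAFE_DIVIDE(COUNTIF(session_engaged = '1'), COUNT(*)) AS engagement_rate
      FROM sessions`;

    async function compareEngagement() {
      const [rows] = await bigquery.query({ query });
      console.log(rows);
    }

    compareEngagement().catch(console.error);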

  • Hey Patricia!

    How about trying to specify updateMask as a query parameter? Here are a couple of ways to do so:

    – [Option 1: Using web_stream_data.default_uri](https://developers.google.com/analytics/devguides/config/admin/v1/rest/v1beta/properties.dataStreams/patch?updateMask=web_stream_data.default_uri)
    – [Option 2: Using webStreamData.defaultUri](https://developers.google.com/analytics/devguides/config/admin/v1/rest/v1beta/properties.dataStreams/patch?updateMask=webStreamData.defaultUri)

    Give either of these a go and let’s see how it turns out!
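
    If it’s easier to see as an actual request, here’s a hedged sketch of the PATCH call with updateMask in the query string (the property and stream IDs, the new URI, and the token handling are all placeholders):

    // PATCH the data stream, sending only the field named in updateMask.
    async function updateStreamUri(accessToken) {
      const url = 'https://analyticsadmin.googleapis.com/v1beta/' +
        'properties/123456789/dataStreams/987654321' +
        '?updateMask=webStreamData.defaultUri';

      const response = await fetch(url, {
        method: 'PATCH',
        headers: {
          'Authorization': `Bearer ${accessToken}`,
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          webStreamData: { defaultUri: 'https://www.example.com' },
        }),
      });
      return response.json();
    }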

  • Hey, it seems like the report you’re referring to in Google Analytics is made up of several dimensions, including ‘Date’, plus a few metrics, namely ‘users’, ‘sessions’ and ‘pageviews’.

    Now, I noticed that you’ve added ‘datehour’ to your request. That extra dimension will cause a discrepancy unless your request is otherwise identical to the report, because users are de-duplicated within each row, so splitting the data by hour can count the same user once per hour instead of once per day. To get an accurate user count, make sure your request’s dimensions and filters mirror the report’s. Cheers!

  • The user seems to be having trouble with ‘item_category’ parameters disappearing from their Explore reports. Although their data collection and configuration appear sound, the ‘item_category2’, ‘item_category3’, ‘item_category4’ and ‘item_category5’ parameters, which they typically used to slice specific ecommerce information, are no longer visible in the reports, which limits their ability to analyze ecommerce performance properly. They are asking how such ecommerce performance evaluations can be carried out without those ‘item_category’ parameters, and why the parameters no longer appear in the reports despite no change in their data collection and processing methods.