Forum Replies Created

  • Firstly, from your code, it appears the ‘value’ in your regex filter is incorrect. When using FULL_REGEXP, the ‘value’ should be just the regex pattern you are trying to match against, without the field name. For example, your ‘hostName’ filter should be 'value' => 'www\.site\.com' (note the escaped periods), not 'hostName==www.site.com'. Similarly, for ‘pageReferrer’, 'value' => '^https://.*' should work.

    Secondly, the use of the !~ operator in the ‘pageReferrer’ filter value suggests you are trying to get results where ‘pageReferrer’ does NOT start with https://. If that is the case, it could explain why you aren’t getting any results, because your combined condition is that the hostname is http://www.site.com and the pageReferrer does NOT start with https://. So, if you actually want results where pageReferrer starts with https://, drop the !~ from the ‘pageReferrer’ filter.

    Finally, double-check that your regex patterns are correct; reviewing the regex syntax rules may also help. Test your regular expressions thoroughly to make sure they match what you intend. A minimal sketch of how the two filters might look is below.
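    This is only a sketch, not your exact code: it assumes the google-analytics-data Python client (swap in your own client or language), a placeholder property ID, and example dimensions/metrics, but it shows each FULL_REGEXP ‘value’ carrying just the pattern.

    ```python
    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange, Dimension, Metric, RunReportRequest,
        Filter, FilterExpression, FilterExpressionList,
    )

    client = BetaAnalyticsDataClient()

    # Each filter's value is only the regex pattern; the field name goes in field_name.
    host_filter = FilterExpression(
        filter=Filter(
            field_name="hostName",
            string_filter=Filter.StringFilter(
                match_type=Filter.StringFilter.MatchType.FULL_REGEXP,
                value=r"www\.site\.com",  # escaped periods, no "hostName==" prefix
            ),
        )
    )
    referrer_filter = FilterExpression(
        filter=Filter(
            field_name="pageReferrer",
            string_filter=Filter.StringFilter(
                match_type=Filter.StringFilter.MatchType.FULL_REGEXP,
                value=r"^https://.*",  # matches referrers that DO start with https://
            ),
        )
    )

    request = RunReportRequest(
        property="properties/123456789",  # placeholder property ID
        date_ranges=[DateRange(start_date="7daysAgo", end_date="today")],
        dimensions=[Dimension(name="pagePath")],
        metrics=[Metric(name="screenPageViews")],
        dimension_filter=FilterExpression(
            and_group=FilterExpressionList(expressions=[host_filter, referrer_filter])
        ),
    )
    response = client.run_report(request)
    ```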

  • Sure, I can help with that. As an initial approach, you would need to identify where the ‘video_complete’ rows are in your nested list. Once you’ve done that, one way to get ‘video_complete’ to the bottom of your list is to use the ‘append()’ and ‘remove()’ list methods in Python (or their equivalents in other languages). The ‘remove()’ method removes the first occurrence of the value you specify, in your case a ‘video_complete’ row. After removing it, you can use ‘append()’ to add it back at the end of the list, effectively placing it at the bottom. For a nested list, this can be repeated in a loop until all ‘video_complete’ rows are at the bottom.
    As a tip, when dealing with sorting rows you might want to consider libraries that handle data manipulation and analysis more flexibly, such as pandas in Python; they provide much more powerful tools for sorting and reshaping data. A minimal sketch of the remove()/append() approach is below.
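    This is just an illustrative sketch with made-up rows, assuming the first element of each row is the event name:

    ```python
    rows = [
        ["video_complete", 42],
        ["page_view", 900],
        ["video_complete", 17],
        ["scroll", 310],
    ]

    # Collect the matching rows first, then remove each one and re-append it,
    # which pushes them to the bottom while keeping their relative order.
    to_move = [row for row in rows if row[0] == "video_complete"]
    for row in to_move:
        rows.remove(row)   # remove() drops the first occurrence of the row
        rows.append(row)   # append() adds it back at the end

    print(rows)
    # [['page_view', 900], ['scroll', 310], ['video_complete', 42], ['video_complete', 17]]
    ```

    With pandas, the same reordering can be done with a stable sort on a boolean column, which scales better for large tables.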

  • Xavier

    Member
    7 April 2023 at 8:20 am in reply to: Utilizing GA4 Data Stored in BigQuery with PowerBI Integration

    Yes, you are trying to transform your Google Analytics 4 (GA4) data from BigQuery, which is currently in JSON format, into a more readable table format for use in Power BI. The idea is that you need to “unnest” and “pivot” your data into separate columns. This is unfortunately not a straightforward process if you’re new to these tools and to the JSON format, but it’s a common need in data analysis. The operation you are trying to perform is called “normalizing” the data, i.e., transforming the nested JSON data into flat, table-format data. This process usually involves writing queries or scripts to extract and restructure the JSON properly. You may need to work through further tutorials or seek advice from more experienced data analysts.

    Whether you can accomplish this with Power BI alone or need the aid of BigQuery depends on the level of data manipulation Power BI supports. It may also be possible to solve this with other tools or languages that can work with JSON and reshape data, such as Python with pandas; a minimal sketch of that route is below. If the structure of your JSON data is always the same, you should be able to automate the process once you’ve done it successfully once.
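    This is a simplified, hypothetical example (the field names mirror the GA4 export layout, but your real export will be richer); it just shows the unnest-then-pivot idea with pandas:

    ```python
    import json
    import pandas as pd

    # Made-up GA4-style rows: nested event_params as a list of key/value records.
    raw = """
    [
      {"event_name": "page_view",
       "event_params": [{"key": "page_location", "value": "https://www.site.com/"},
                        {"key": "engagement_time_msec", "value": "1200"}]},
      {"event_name": "purchase",
       "event_params": [{"key": "page_location", "value": "https://www.site.com/checkout"},
                        {"key": "value", "value": "49.99"}]}
    ]
    """
    events = json.loads(raw)

    # "Unnest": one row per event parameter, keeping the parent event_name.
    flat = pd.json_normalize(events, record_path="event_params", meta=["event_name"])

    # "Pivot": each parameter key becomes its own column. pivot() needs each
    # (event_name, key) pair to be unique; real exports usually also need an
    # event-level identifier such as the event timestamp in the index.
    table = flat.pivot(index="event_name", columns="key", values="value")
    print(table)
    ```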

  • The reason you’re only seeing 2.7K sessions when you choose the source or source/medium dimension could be a segment or filter you’ve applied to the view. If you’ve added a segment that only includes certain traffic sources or mediums, the total session count will be drastically lower than what you’d see without any dimensions applied. It could also be that your traffic isn’t well tagged, so Google Analytics can’t identify the traffic source for most of your sessions, leaving fewer sessions attributed to recognized sources. Another possibility is data sampling: depending on the size of your dataset and the complexity of your queries, Google Analytics will sometimes sample your data to provide quicker responses, and the sampled numbers may not accurately represent the total number of sessions. If none of these apply, it might be worth reaching out to Google Analytics support directly to clarify.

  • Xavier

    Member
    28 March 2023 at 12:24 pm in reply to: Inherited Permissions not Honored in GA4 Insights Notifications

    From the issue you’ve presented, it seems you’ve understood the requirements for setting up Insights correctly, but Google Analytics 4 (GA4) appears to be functioning differently than expected. The challenge here appears to be with the concept of “inheritance” of permissions in Google accounts, where GA4 isn’t recognizing permissions granted at the account or organization level for the purpose of sending out insights. Currently, GA4 seems to require explicit property-level permissions for users to receive alerts or insights, which contradicts the straightforward concept of permission “inheritance”. This could be an overlooked detail on Google’s part, a bug, or simply a matter of unclear documentation. You’ve done well by figuring out a workaround, but it’s good to share these concerns with the Google support team or via their product forums, as they may be unaware of this issue or confusion. Your feedback can help improve the experience for all Google Analytics users.

  • Xavier

    Member
    5 March 2023 at 4:50 pm in reply to: How to monitor social media interactions with Google Analytics 4

    It seems like your issue is indeed tied to the schema compatibility changes announcement. Per the change, item-scoped dimensions like itemName and itemId are no longer compatible with event-scoped metrics like eventCount in the same query. Therefore, generating a report that directly correlates itemId and eventCount in the same table is no longer feasible in the way it was before this change. You may have to restructure your event tracking to account for this, or you could consider creating multiple reports (one for event counts and another for item details) and then cross-referencing them manually; a rough sketch of that is below. You might also look into whether there are alternative events or metrics, unaffected by the schema changes, that serve the same purpose. Google’s documentation or support might provide further guidance specific to your case.
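    Purely as an illustration (placeholder property ID; the dimensions and metrics are just examples, not necessarily the ones you track), the “two separate reports” idea could look like this with the Python Data API client:

    ```python
    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import DateRange, Dimension, Metric, RunReportRequest

    client = BetaAnalyticsDataClient()
    date_range = DateRange(start_date="28daysAgo", end_date="today")

    # Report 1: event-scoped dimension with an event-scoped metric.
    events_report = client.run_report(RunReportRequest(
        property="properties/123456789",
        date_ranges=[date_range],
        dimensions=[Dimension(name="eventName")],
        metrics=[Metric(name="eventCount")],
    ))

    # Report 2: item-scoped dimension with item-scoped metrics.
    items_report = client.run_report(RunReportRequest(
        property="properties/123456789",
        date_ranges=[date_range],
        dimensions=[Dimension(name="itemName")],
        metrics=[Metric(name="itemsViewed"), Metric(name="itemsPurchased")],
    ))

    # The two result sets can then be joined or cross-referenced in your own code or a spreadsheet.
    ```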

  • Xavier

    Member
    30 December 2022 at 6:08 pm in reply to: Considerations for Tracking Asynchronous Data in GA's page_view Event

    Firing off asynchronous data in page_view events is not necessarily a no-no, but it does introduce certain complexities that you’ve already identified. It’s crucial to recognize that the firing of the page_view event and fetching of data don’t always occur simultaneously, which can impact the accuracy and consistency of the event’s data payload. Leveraging the Redux store to trigger the event once data is available may work in some instances, but with multiple components fetching different data sets, inconsistencies may emerge.

    Your proposed solution of creating a settings file that checks, based on the route, which data fields to include with the GA event can create maintenance difficulties down the line, given how quickly web applications evolve.

    An alternative might be to use a state management solution to aggregate data from all components before firing the page_view event, or to delay firing the event until all async operations have completed; a conceptual sketch of the latter is below. In the end, it’s essential to ensure your solution doesn’t skew the accuracy of your analytics.
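    Just to illustrate the shape of the “wait for everything, then fire once” approach, here is a small Python/asyncio sketch with hypothetical fetchers and field names; in a React/Redux app the equivalent would be Promise.all (or an effect watching the store) before calling gtag/dataLayer.push:

    ```python
    import asyncio

    async def fetch_user_segment():
        await asyncio.sleep(0.1)          # stand-in for an API call
        return {"user_segment": "returning"}

    async def fetch_page_metadata():
        await asyncio.sleep(0.2)          # stand-in for another API call
        return {"content_group": "blog"}

    def send_page_view(payload):
        # stand-in for gtag("event", "page_view", payload) / dataLayer.push(...)
        print("page_view", payload)

    async def on_route_change(path):
        # Wait for every async dependency, then fire a single, complete page_view.
        results = await asyncio.gather(fetch_user_segment(), fetch_page_metadata())
        payload = {"page_path": path}
        for part in results:
            payload.update(part)
        send_page_view(payload)

    asyncio.run(on_route_change("/pricing"))
    ```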

  • Xavier

    Member
    4 October 2022 at 11:24 am in reply to: How to determine page worth in Google Analytics 4?

    The error message you’re seeing, “Re-aggregating metrics is not supported,” comes up in Google Data Studio when you’re trying to sum (or average, maximum, minimum, etc.) a metric that has already been aggregated. This is typically the case when the data comes from a field that Google Analytics marks as “already aggregated”.

    To solve this error, you need to configure your settings properly in Google Tag Manager and Google Analytics before moving on to Google Data Studio. Make sure you’re sending the ad click value as an Event Value in Google Analytics.

    This would involve adding event tracking in Google Analytics that records the value every time the particular event fires (an ad click in this case). Once that event value is recorded at the event level, it is no longer a pre-aggregated field. You can then import those custom events into Google Data Studio and use SUM aggregation on that Event Value.

    Finally, you will create a report using Page URLs as the dimension and the sum of your custom Event Values as the metric. You should be able to build that report without encountering the re-aggregation error. Keep in mind that these changes only apply to data collected after the modifications, so historical data will not be backfilled. A tiny illustration of the per-page summing is below.
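    With made-up numbers, and only to illustrate why event-level values can be summed per page (whereas an already-aggregated field cannot be re-aggregated), the calculation looks like this in pandas:

    ```python
    import pandas as pd

    # Each row is one ad-click event carrying its own value.
    events = pd.DataFrame({
        "page_url": ["/pricing", "/pricing", "/blog/post-1", "/blog/post-1", "/blog/post-1"],
        "event_name": ["ad_click"] * 5,
        "event_value": [0.40, 0.55, 0.30, 0.30, 0.45],
    })

    # One row per page, summing the raw event values: the same SUM that
    # Data Studio applies once the metric is no longer pre-aggregated.
    page_worth = events.groupby("page_url", as_index=False)["event_value"].sum()
    print(page_worth)
    ```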

  • Xavier

    Member
    20 May 2022 at 1:34 am in reply to: Struggling with Duplicate Parameters in GA4 Event Tracking

    Hey! This might just be how GA’s debugger shows batched events. Since it’s a page_view, I’m thinking it’s an SPA, as batching doesn’t happen with full page loads. You can read up on the Events Grouping section [here](https://support.google.com/analytics/answer/9322688?hl=en#zippy=%2Crealtime-report%2Cdebugview-report) for more info.

    But don’t read too much into GA’s debugger. It’s mostly useful for confirming that the data structure you’re sending matches the one it’s receiving. For everything else, you can debug locally: check the raw network requests in the Network tab or use Adswerve’s dataLayer explorer extension. You should be able to see the event batching there.