
  • Techniques for transforming GA4 event_params (JSON) into rows in SQL Server

    Posted by Mason on 30 April 2023 at 10:40 am

    Hey there,

    So, here’s the deal: we’re exporting some Google Analytics 4 (GA4) data from BigQuery using the built-in Microsoft connectors. But there’s a catch – the event_params column comes through as raw JSON. You know what I’m talking about, right?

    I’ve been elbow deep unpivoting some other arrays from various nooks and crannies with the OPENJSON function and CROSS APPLY, but boy oh boy, this one’s proving a tough nut to crack!
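
    For reference, here’s the general shape I’ve been using on the other arrays – table and column names are simplified, but event_params stores GA4’s usual array of {key, value} objects:

    ```sql
    -- Shred the event_params array into one row per parameter
    SELECT e.event_date,
           e.event_timestamp,
           p.param_key,
           p.param_value            -- still a JSON object like {"string_value": "..."}
    FROM dbo.ga4_events AS e
    CROSS APPLY OPENJSON(e.event_params)
        WITH (
            param_key   nvarchar(200) '$.key',
            param_value nvarchar(max) '$.value' AS JSON
        ) AS p;
    ```

    That gets me one row per param, but it’s the pivot back into columns that’s stumping me.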

    There are some how-to guides floating around the interwebs, but they’re all about doing it in BigQuery, which is as much use as a chocolate teapot with our built-in connectors. So, I thought I’d ask you guys!

    Here’s what I’m aiming for – I want to pivot each param and its non-null value into a neat little column of its own. I’m expecting around 10 custom columns in the end.

    I’ve included my current table layout below for easier visualization – I’ve stripped it back to just the basics for now: event_date, event_timestamp, page_location, ga_session_number. You know, stuff like that.

    Now, to give you a sense of where we’re starting out, I’ve also included a sample of my data. It’s a mix of dates, timestamps, and that pesky event_params JSON. The event_params include all sorts, like ‘page_location’, ‘ga_session_number’, and ‘page_title’.

    Don’t know about you, but I’m pretty stoked to get this sorted! Any hints on how I can work around this and get the data neatly spread out in columns?

  • 2 Replies
  • Pham

    7 May 2023 at 11:36 am

    Hey, I totally feel you! I’ve had the exact same issue with GA4 data. After a ton of research, what I found is that you can actually select the column, go to the Transform tab, and use Parse → JSON. That converts the column into JSON records and gives you an option to expand the column.

    But, I won’t lie – despite doing this, dealing with the resulting header columns can be a bit tricky and might not offer the best solution you’re looking for.

  • Levi

    30 June 2023 at 4:56 am

    Yes, handling JSON data can indeed be a bit tricky, but you’re on the right track with your approach. Using BigQuery’s TO_JSON_STRING function to serialize each event to JSON before importing into your SQL database is a smart move. Once the data is in SQL Server, JSON_QUERY will extract nested objects and arrays from the JSON, while JSON_VALUE pulls out individual scalar values. The flexibility of JSON syntax can bring a bit of complexity, but it also allows you to create much more customized data structures. The downside, as you’ve discovered, is more manual work parsing the JSON objects up front. But once you’ve set up your parsing methods, you’ll have a very flexible and potentially powerful system in place for managing your data.
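
    To make that concrete, here’s a minimal sketch (table and column names are my assumptions, not your actual schema) of pulling a single param out of a GA4-style event_params array:

    ```sql
    -- event_params is assumed to hold an array like:
    -- [{"key":"page_location","value":{"string_value":"https://example.com/"}}, ...]
    -- Without a WITH clause, OPENJSON returns [key] (array index) and [value] (element JSON).
    SELECT e.event_timestamp,
           JSON_VALUE(p.[value], '$.value.string_value') AS page_location
    FROM dbo.ga4_events AS e
    CROSS APPLY OPENJSON(e.event_params) AS p
    WHERE JSON_VALUE(p.[value], '$.key') = 'page_location';
    ```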

    If you find yourself dealing with deeply nested JSON, just remember that you can nest one JSON_QUERY call inside another to drill down to the data you need. Similarly, when you’re dealing with JSON arrays rather than single JSON objects, OPENJSON is the tool for shredding them into rows (JSON_ARRAY, by contrast, constructs arrays and is only available from SQL Server 2022). It’s all about knowing which tool to reach for. It may not be the most elegant solution, but it certainly sounds like a workable one. Lastly, keep an eye on performance – parsing lots of JSON can be processor-intensive. Keep it up!
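
    And since you’re aiming for roughly 10 custom columns, one common pattern is to shred the array once with OPENJSON and then pivot with conditional aggregation – again a sketch with assumed table and column names, which you’d extend with one MAX(CASE ...) line per param you care about:

    ```sql
    -- Shred once, then pivot each param key into its own column.
    -- Assumes event_timestamp uniquely identifies an event.
    SELECT e.event_date,
           e.event_timestamp,
           MAX(CASE WHEN j.param_key = 'page_location'     THEN j.string_val END) AS page_location,
           MAX(CASE WHEN j.param_key = 'page_title'        THEN j.string_val END) AS page_title,
           MAX(CASE WHEN j.param_key = 'ga_session_number' THEN j.int_val    END) AS ga_session_number
    FROM dbo.ga4_events AS e
    CROSS APPLY OPENJSON(e.event_params)
        WITH (
            param_key  nvarchar(200) '$.key',
            string_val nvarchar(max) '$.value.string_value',
            int_val    bigint        '$.value.int_value'
        ) AS j
    GROUP BY e.event_date, e.event_timestamp;
    ```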
