The quickest way from AWS S3 to PowerBI?

Johan
3 min read · Oct 17, 2023

using Snowflake…

Sure, we need to do some infrastructure setup; for that, use the links below. I also had to use the Guy in a Cube blog to figure out how to use PowerBI with JSON. But after absorbing that PowerBI knowledge, it was actually pretty easy to run a PowerBI report against AWS S3 files.

Put your file on S3 and create an external table. Once the file is on S3, you can run a LIST <stage> to check it. If the LIST statement gives an error, your Snowflake-AWS infrastructure is not set up correctly.
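As a quick sanity check, the LIST statement looks like this (using the stage and folder names from the DDL further down):

```sql
-- List the staged files; an error here points at the SF-AWS setup,
-- not at the table definition.
list @sf_aws.public.my_s3_stage/nile/;
```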

Now, create the external table. The easiest way is to create a schema-on-read table, meaning that Snowflake stores the data in the VARIANT datatype, as JSON.

Snowflake needs a file format to be able to interpret the data; the code is included below. For the external table definition, put the files in their own folder on S3 so you can avoid PATTERN constructs in the DDL.

-- CSV file format so Snowflake can parse the staged files
create or replace file format sf_aws.public.ff_csv
type = csv;

-- External table over the S3 folder; rows are exposed as VARIANT
create or replace external table sf_aws.public.nile_ext
location=@sf_aws.public.my_s3_stage/nile/
file_format = 'sf_aws.public.ff_csv';

When we query the external table, we see the JSON content.
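For an external table over a CSV, the VALUE column holds each row as a VARIANT, with the CSV fields exposed as c1, c2, and so on. A minimal query to inspect it (the number of c-columns depends on your file):

```sql
-- VALUE is the full row as VARIANT; value:c1, value:c2 pick out fields
select value, value:c1, value:c2
from sf_aws.public.nile_ext
limit 10;
```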

You might ask: “if the input file is a csv, why do we then represent this as JSON in the external table?”. The answer is: “to do this faster, without requiring any metadata analysis on the csv”. But feel free to solve this differently, with a view or dynamic table if you wish.
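The view variant could look like the sketch below. The column names and types are assumptions for illustration; they depend on what is actually in your CSV:

```sql
-- Untested sketch: cast the VARIANT fields to proper types so
-- consumers see a normal relational shape instead of raw JSON.
create or replace view sf_aws.public.nile_vw as
select
    value:c1::string as col_1,  -- hypothetical column names
    value:c2::float  as col_2
from sf_aws.public.nile_ext;
```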

Now, let’s go to PowerBI. Connect to Snowflake and load the nile_ext table, which will need some transform logic (thanks to Guy in a Cube for pointing me there, link below).

Be aware: this JSON transform will ONLY work in import mode, so no DirectQuery here.
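For reference, the kind of Power Query (M) transform the Guy in a Cube video walks through boils down to parsing the VALUE column as JSON and expanding the record. This is a rough sketch, not the exact steps from the video; the field names c1 and c2 are assumptions matching the external table above:

```
let
    // Assumes the Snowflake table is already loaded as "nile_ext"
    // and VALUE arrives as a text column (import mode only)
    Parsed = Table.TransformColumns(nile_ext, {{"VALUE", Json.Document}}),
    Expanded = Table.ExpandRecordColumn(Parsed, "VALUE", {"c1", "c2"})
in
    Expanded
```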

After selecting the columns, the output is nicely formatted.

And then, build a report.

So, to conclude: this was just a quick attempt to go from S3 to PowerBI via Snowflake in the shortest way. It was not tested beyond what is described above, so you probably should not implement this in a workplace setup without proper testing. But it was still a nice setup to work with.

Links:

Guy in a Cube

AWS-SF Setup

Plenty of csv files to be found here


Johan

I am a freelance data engineer. My main focus is Snowflake, but I am always eager to learn new things. Python and AWS are my main side interests.