My case for CI/CD in Snowflake is very specific: every 30 days I register for a new Snowflake 30-day trial to run several experiments. Each freshly spun-up environment contains only the bare necessities: some roles (ACCOUNTADMIN, SYSADMIN, etc.) and some databases (DEMO_DB plus the shared SNOWFLAKE databases).
So my experiment can now start. It consists of a lot of SQL scripts that I need every time in my new Snowflake account.
For this I have chosen the most basic approach I can think of. Some people would call it an MVP :)
It consists of:
- a Python script
- a folder with SQL scripts
- a JSON file with Snowflake credentials (sketched below)
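The JSON file is just a flat object with the fields the script reads (user, password and account). A minimal sketch with placeholder values, stored as ../config/sf-config.json:

{
    "user": "YOUR_USER",
    "password": "YOUR_PASSWORD",
    "account": "YOUR_ACCOUNT"
}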
When you run the Python script, it opens the JSON file and sets up a connection with Snowflake:
import json
import snowflake.connector

def get_conn():
    # read the Snowflake credentials from the config file
    with open('../config/sf-config.json', 'r') as file:
        data = json.load(file)
    user = data['user']
    password = data['password']
    account = data['account']
    # open a connection using the Snowflake Python connector
    conn = snowflake.connector.connect(
        user=user,
        password=password,
        account=account
    )
    return conn
Then the folder with SQL scripts is walked, with the filenames sorted in alphabetical order:
import fnmatch
import os

def execute_sql_scripts(con):
    cur = con.cursor()
    # walk the sql folder and handle the .sql files in alphabetical order
    for root, dirnames, filenames in os.walk("./sql"):
        filenames = sorted(filenames)
        for filename in fnmatch.filter(filenames, '*.sql'):
            sql…
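To round it off, here is a minimal sketch of how the two functions could be wired together in the script's entry point. The actual driver code isn't shown above, so the structure here is an assumption:

# Hypothetical entry point: connect once, run every script, then clean up.
if __name__ == "__main__":
    conn = get_conn()
    try:
        execute_sql_scripts(conn)
    finally:
        conn.close()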