This is a neat script I dug up for one of our users who asked on support how to easily trigger snapshots for multiple tables. Please note this could be orchestrated (as a Python transformation), for instance in your weekly/monthly flows, to create snapshots of critical data that last longer than the automatic time-travel functionality.

Please note it is not ideal, since it exposes the token (but you can either create a dedicated one, or reset your personal one after use).
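One way to soften the token exposure (a hedged sketch, not part of the original script: the environment-variable name `KBC_STORAGE_TOKEN` is my own invention, so use whatever your setup provides) is to read the token from the environment instead of hard-coding it:

```python
import os

def storage_token():
    # Read the token from an environment variable so it never sits
    # in the script body (the variable name here is an assumption).
    token = os.environ.get("KBC_STORAGE_TOKEN")
    if not token:
        raise RuntimeError("Set the KBC_STORAGE_TOKEN environment variable first")
    return token
```

You would then call `storage_token()` wherever the script below uses TOKEN.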
All you have to do is use this in a Python transformation and replace the token and the table list:

table_names = ["out.c-bitbucket.repositories", "out.c-bitbucket.repository_stats"]

The whole script follows:

import requests

# Please enter token here:
TOKEN = "YOUR_STORAGE_API_TOKEN"
# List table names here: ["table1", "table2", "etc."]
table_names = ["out.c-bitbucket.repositories", "out.c-bitbucket.repository_stats"]

def send_request(table_name):
    # Create Table Snapshot
    # POST /v2/storage/tables/{tableId}/snapshots
    # (base URL below is the US stack; adjust it for your stack)
    call_url = ("https://connection.keboola.com/v2/storage/tables/"
                + str(table_name) + "/snapshots")
    try:
        response = requests.post(
            url=call_url,
            headers={
                "X-StorageApi-Token": TOKEN,
                "Content-Type": "application/x-www-form-urlencoded",
            },
            data={"description": "orchestration"},
        )
        print('Response HTTP Status Code: {status_code}'.format(
            status_code=response.status_code))
        print('Response HTTP Response Body: {content}'.format(
            content=response.content))
    except requests.exceptions.RequestException:
        print('HTTP Request failed')

for table in table_names:
    send_request(table)


If your project is hosted on another stack, such as PAYG in the N-EU stack (Azure), please use the corresponding base URL.
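For illustration, switching the base URL per stack could look roughly like this (a sketch: the dictionary keys are my own labels, so verify the host against the URL you use to log in to your project):

```python
# Sketch: pick the Storage API base URL per stack (keys are invented labels;
# double-check the host against your own project URL before using it).
BASE_URLS = {
    "aws-us": "https://connection.keboola.com",
    "azure-north-europe": "https://connection.north-europe.azure.keboola.com",
}

def snapshot_url(stack, table_id):
    # Build the snapshot endpoint for a given stack and table ID
    return BASE_URLS[stack] + "/v2/storage/tables/" + table_id + "/snapshots"
```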

Possible improvements:

  1. Create a dedicated component to protect the token (any volunteers?)
  2. Combine this with the Storage API to get the list of all tables, so you can just specify the bucket and call it a day...
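Improvement 2 could look roughly like this (a hedged sketch, not a tested implementation: it assumes the Storage API lists a bucket's tables at GET /v2/storage/buckets/{bucketId}/tables and that each returned entry carries an "id" field):

```python
import requests

BASE_URL = "https://connection.keboola.com"  # adjust for your stack
BUCKET_ID = "out.c-bitbucket"                # snapshot every table in this bucket

def bucket_tables_url(base_url, bucket_id):
    # Endpoint that lists the tables of one bucket
    return base_url + "/v2/storage/buckets/" + bucket_id + "/tables"

def list_bucket_tables(token):
    # Fetch the bucket's table metadata and return the full table IDs
    resp = requests.get(
        bucket_tables_url(BASE_URL, BUCKET_ID),
        headers={"X-StorageApi-Token": token},
    )
    resp.raise_for_status()
    return [table["id"] for table in resp.json()]

# Usage (needs a valid token; reuses send_request() from the script above):
# for table in list_bucket_tables(TOKEN):
#     send_request(table)
```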


Hope this helps!