Our use of Tidal is mostly file-event driven. We use it to manage our ingestion, processing, and loading of data. Tidal's file-event hooks kick off our ETL for us, and it also runs jobs and SQL against some of our database appliances, such as IIAS (the successor to Netezza) and Teradata.
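Conceptually, the file-event hook works something like the sketch below. This is only an illustration with a hypothetical landing directory and a stand-in trigger function, not our actual Tidal configuration; in production, Tidal does the watching and launches the ETL job itself.

```python
import time
from pathlib import Path

# Hypothetical landing directory the file gateway drops files into.
LANDING_DIR = Path("landing")
SEEN = set()

def trigger_etl(path: Path) -> None:
    # Stand-in for Tidal launching the downstream ETL job.
    print(f"File event: {path.name} arrived, triggering ETL job")

def watch(poll_seconds: float = 5.0) -> None:
    # Simple polling loop standing in for Tidal's file-event trigger.
    LANDING_DIR.mkdir(parents=True, exist_ok=True)
    while True:
        for path in LANDING_DIR.glob("*.dat"):
            if path not in SEEN:
                SEEN.add(path)
                trigger_etl(path)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    watch()
```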
We have a file gateway that receives a file and drops it in a location. A file event picks it up and hands it to the ETL tool. The ETL tool runs, aggregates a number of source files, and turns them into a properly formatted input file. That file then goes through data hygiene and data analysis, and then through a matching process. The output is picked back up and an ETL process loads it into a SQL database. Finally, a number of jobs run in the SQL database to manipulate that data, as in the sketch below.
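As a rough sketch of that flow (the stage names, file formats, and table are hypothetical, and in reality the ETL tool, the hygiene and matching steps, and the database jobs are separate systems that Tidal sequences), the chain looks something like this:

```python
import csv
import sqlite3
from pathlib import Path
from typing import Iterable

def aggregate_sources(source_files: Iterable[Path]) -> list[dict]:
    # ETL step: combine several source files into one properly formatted input.
    records = []
    for src in source_files:
        with src.open(newline="") as f:
            records.extend(csv.DictReader(f))
    return records

def hygiene_and_analysis(records: list[dict]) -> list[dict]:
    # Placeholder for the data-hygiene / data-analysis step.
    return [{k: (v or "").strip() for k, v in r.items()} for r in records]

def match(records: list[dict]) -> list[dict]:
    # Placeholder for the matching process.
    return records

def load_to_sql(records: list[dict], db_path: str = "pipeline.db") -> None:
    # Final ETL step: load the matched output into a SQL database,
    # then run a follow-on SQL job against it.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS matched (name TEXT, value TEXT)")
    con.executemany(
        "INSERT INTO matched (name, value) VALUES (?, ?)",
        [(r.get("name"), r.get("value")) for r in records],
    )
    # One of the "jobs in the SQL database" that manipulate the loaded data.
    con.execute("UPDATE matched SET name = UPPER(name)")
    con.commit()
    con.close()

if __name__ == "__main__":
    sources = sorted(Path("landing").glob("*.csv"))
    load_to_sql(match(hygiene_and_analysis(aggregate_sources(sources))))
```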
We don't have a lot of calendared events or scheduled windows.
We have a central Tidal installation in our data center, and we also run smaller, client-hosted instances of Tidal in the cloud, across AWS, Azure, and GCP.