To give you some idea of the business use: I have one table with Locations and another table with Jobs. Jobs are performed at Locations, and they repeat on a somewhat unpredictable schedule. Crews are assigned a batch of jobs, and when they are nearing completion of that batch, I enter criteria in the search form for the next locations and then trigger these two actions and a workflow, which assigns the next batch of jobs.
I have a dashboard containing two views:
- SEARCH FORM - A detail view of the search form, with a few criteria drop-down fields.
- SLICED DATA - A table view of a slice of the Locations table. The slice is filtered based on the criteria entered into the search form.
The search form also has a button that sets a DateTime field to NOW(). Updating the DateTime field increments a ChangeCounter field whose Accumulator is set to Reset.
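For reference, the button is roughly configured like this (the column names `TriggerTime` and the table name `SearchForm` are placeholders, not my actual names):

```
Action name     : Trigger Batch
For a record of : SearchForm
Do this         : Data: set the values of some columns in this row
Set column      : TriggerTime = NOW()
```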
Each update to the search form table triggers a workflow which has a condition that checks the ChangeCounter value. If the condition is true, the workflow will execute an action.
The action executed is of type "Data: execute an action on a set of rows." The referenced rows are those contained in the slice mentioned above, i.e., the rows that meet the criteria entered in the search form. The referenced action then copies fields from each of those rows to another table. Pretty simple once you understand what's happening.
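In case it helps to see it concretely, the bulk action looks roughly like this (the slice name `NextLocations` and the referenced action name `CopyToJobs` are placeholders):

```
Action name       : Assign Batch
For a record of   : SearchForm
Do this           : Data: execute an action on a set of rows
Referenced Table  : Locations
Referenced Rows   : FILTER("NextLocations", TRUE)
Referenced Action : CopyToJobs
```

`FILTER("NextLocations", TRUE)` returns the keys of every row in the slice, so the referenced action runs once per matching Location row.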
The problem is, if the slice contains a large number of rows (400 or so), this can take 5 minutes or longer to fully execute.
I set this up initially about a week ago and while it wasn’t super fast, I don’t recall it being quite this slow. Today, a frozen snail seems like it would be faster…
If you’ve read this far and have some ideas about how I could improve this process, I would love to hear them.