This article provides the information and resources needed to deploy a data flow in Fuuz.
The following standards and requirements must be met before deploying a data flow in Fuuz.
Depending on the version, most of these validations may be checked automatically and will prevent deployment.
Violations of the standards and requirements are acceptable while a flow is still being worked on, but always resolve them before requesting a production deployment.
These standards are established to:
Improve the consistency of flow designs
Provide inline documentation for future reference
Make it easier for others to provide support
Track all changes for visibility
Properly naming the flow is vital.
Use the following format:
{#1 Primary Fuuz Object} {#2 Direction or Verb(s)} {#3 External Platform} {#4 Name of Primary External Object} {#5 Event Type}
#1 Primary Fuuz Object: the Fuuz object that data is being pulled from or pushed into.
#2 Direction or Verb(s): From, To, or verbs like Badge In/Out or Workcenter Mode on Data Change.
#3 External Platform: NetSuite, Plex, Plex (API), QuickBooks, RedZone, etc. Note that Plex and Plex (API) are very different sources.
#4 Name of Primary External Object: for example, Bin, Location, or Ledger. This is only required if it is not the same as the Fuuz object.
#5 Event Type: on Data Change. This is the only event type that is currently required.
Locations From NetSuite Bins
This represents a standard integration that sends locations to Fuuz from NetSuite's list of Bins
Locations To NetSuite Bins on Data Change
This represents a standard integration that sends locations to NetSuite's list of Bins from Fuuz when updates are captured with data change events
Facilities From NetSuite Locations
This is another integration from NetSuite; it highlights that Fuuz objects do not always line up with NetSuite's (here, Fuuz Facilities map to NetSuite Locations)
WorkOrder Process Defaults From Fuuz Default Product Strategy Processes on Data Change
This flow prepopulates the WorkOrder Process records with a copy from the default Product Strategy Process tables on a data change event (specifically the add event)
Renaming a flow does not change the flow's ID. If you need to rename a flow, export its contents into an empty flow with the new name.
Every flow must have a meaningful description of what it does, including the connections used and an overall summary.
Every version must have a description of the changes being made.
When making additional changes after setting the initial description, update the description to match.
Every node must have a meaningful description.
There are no exceptions!
Every flow must have a walkthrough.
Every walkthrough must include all nodes in a logical order.
The walkthrough is used during the review and deployment process.
The configured walkthrough is stored directly inside the flow. Save the flow after configuring the walkthrough.
All nodes must have a unique and meaningful name (so that future error log messages are useful).
Nodes cannot use the default name even if that name is unique.
All nodes with a connection on the left side must be connected to another node.
Descriptions are important; use them to highlight specific test cases.
Test cases are required to prove that testing was performed prior to deployment.
Be descriptive in the Conditionals.
Avoid double negatives (for example, $not(false)) in the Conditionals.
Use If/Else in most situations, even if the else path goes nowhere.
If following the If/Else recommendation would require a double negative to continue on the true path, consider using the else path instead.
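For example (illustrative; the field name orderStatus is an assumption), to continue work only when an order is not closed, avoid placing that work on the true path of $not(orderStatus = "Closed").
Instead, use the condition orderStatus = "Closed" and route the continuing work down the else path, or test orderStatus != "Closed" directly.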
Label everything in context with a wrapping variable.
Never set or merge context with a transform that is just "$".
Use "{plexResponse:$}" instead of “$”.
If the context has unnecessary data, prune it prior to storage.
If the context is large, and no longer needed, remove it.
Warning: Context is shared with every node downstream. Forks and broadcasts multiply memory usage by the number of concurrent transactions. It is very important to keep a clean context, containing only what you need, in these situations.
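For example (illustrative; the field names are assumptions), if only the PO number and status from a large Plex response are needed downstream, set context with a pruned transform such as {plexResponse: {poNumber: $.poNumber, status: $.status}} rather than {plexResponse:$}, and remove the variable once the last node that references it has run.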
Broadcasts are dangerous. In most cases, there is another way to solve the problem.
Due to the nesting limitations of a flow, broadcasting quickly consumes resources and will cause errors.
All flows with a Request node must have at least one Response node.
Data Changes can trigger a lot of flows; they should be followed immediately by a conditional statement to reduce unnecessary executions.
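For example (illustrative; the field name is an assumption), a Locations To NetSuite Bins on Data Change flow might begin with a conditional such as $exists(changes.binNumber), so that executions triggered by unrelated field updates end immediately instead of calling NetSuite.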
Topics, publishing, and subscribing are the preferred methods of handling pagination for large data sets.
The name of the node must indicate what it is calling.
Example: "Calling Plex DS - POs_Get (key:12345)"
The description of the node must include the parameters and a description of the intent.
Example: "Pulling POs with status of Open"
Use of $integrate() in transforms is highly discouraged.
It will most likely be deprecated.
The intent is for all outbound calls to appear as pink nodes for easier reference.
It is highly recommended that a native node be used over the HTTP node.
Example: Avoid using HTTP to call SalesForce unless the SalesForce node does not meet the need
Communication with the Device Gateway can be time-consuming. It is a best practice to verify that tag reads triggered by a flow read multiple tags at once.