Data Flow Design Standards

This article provides the information and resources to support the task of deploying a data flow in Fuuz.

The following standards and requirements must be met prior to deploying a Data Flow in Fuuz.

Depending on the version, most of these validations may be checked automatically and prevent deployment.

Violations of the standards and requirements are acceptable while working on flows. Always resolve violations before requesting a production deployment.

  1. Purpose
  2. Naming
    1. Examples
  3. Documentation
  4. General
  5. Debug Nodes
  6. Conditionals
  7. Context
  8. Flow Control
  9. Events
  10. Integration
  11. IIOT

Purpose

These standards are established to:

  • Improve the consistency of flow designs

  • Provide inline documentation for future reference

  • Increase the efficiency of support by others

  • Track all changes for visibility

Naming

Properly naming the flow is vital.

Use the following format:

  • {#1 Primary Fuuz Object} {#2 direction or verb(s)} {#3 External Platform} {#4 Name of Primary External Object} {#5 Event Type}

    • #1: The primary Fuuz Object that data is being pulled from or pushed into

    • #2: From, To, or verbs like Badge In/Out or Workcenter Mode on Data Change

    • #3: NetSuite, Plex, Plex (API), QuickBooks, RedZone, etc.

      • Plex and Plex (API) are very different sources

    • #4: Bin, Location, Ledger

      • This is only required if the external object's name is not the same as the Fuuz Object

    • #5: on Data Change

      • This is the only event type that is currently required

Examples

  • Locations From NetSuite Bins

    • This represents a standard integration that sends locations to Fuuz from NetSuite's list of Bins

  • Locations To NetSuite Bins on Data Change

    • This represents a standard integration that sends locations to NetSuite's list of Bins from Fuuz when updates are captured with data change events

  • Facilities From NetSuite Locations

    • This is another integration from NetSuite and highlights that our objects do not line up with NetSuite

  • WorkOrder Process Defaults From Fuuz Default Product Strategy Processes on Data Change

    • This flow prepopulates the WorkOrder Process records with a copy from the default Product Strategy Process tables on a data change event (specifically the add event)
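
  • As a worked breakdown of the format above, Locations To NetSuite Bins on Data Change splits into its five parts as follows:

    • #1 Primary Fuuz Object: Locations

    • #2 Direction or verb: To

    • #3 External Platform: NetSuite

    • #4 Name of Primary External Object: Bins

    • #5 Event Type: on Data Change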

Renaming a flow doesn't change the flow's ID. If you need to rename a flow, export its contents and import them into an empty flow that has the new name.

Documentation

Every flow must have a meaningful description of what it does, including the connections used and an overall summary.

Every version must have a description of the changes being made. 

  • When making additional changes after setting the initial description, update it.

Every node must have a meaningful description.

  • There are no exceptions!

Every flow must have a walkthrough.

Every walkthrough must include all nodes in a logical order.

The walkthrough is used during the review and deployment process.

The configured walkthrough is stored directly inside the flow. Save the flow after configuring the walkthrough.

General

All nodes must have a unique and meaningful name (so that future error log messages are useful).

Nodes cannot use the default name even if that name is unique.

All nodes with a connection on the left side must be connected to another node.

Debug Nodes

Descriptions are important; use them to highlight specific test cases.

Test cases are required to demonstrate testing prior to deployment.

Conditionals

Be descriptive in the Conditionals.

Avoid double negatives (for example, $not(false)) in the Conditionals.

Use If/Else in most situations, even if the else condition goes nowhere. 

If following the If/Else recommendation would require a double negative to continue on the true path, use the else path instead.
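
A minimal sketch, assuming JSONata-style transform expressions like those referenced in this article (the plexResponse.error path is purely illustrative): rather than continuing the main logic on a true path guarded by a double negative such as

  $not($exists(plexResponse.error))

configure the condition positively as

  $exists(plexResponse.error)

and route the main logic through the else path, with the true path handling the error case.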

Context

Label everything in context with a wrapping variable.

  • Never set or merge context with a transform that is just "$".

  • Use "{plexResponse:$}" instead of "$".

If the context has unnecessary data, prune it prior to storage.

If the context is large and no longer needed, remove it.
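
As a minimal sketch in the same JSONata-style notation used above (the status and rows field names are purely illustrative), a set/merge context transform can label and prune the data in a single step:

  {plexResponse: {status: status, rows: rows}}

This keeps the response under a named wrapper while dropping everything downstream nodes do not need, instead of storing the entire raw payload with "$".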

 

Warning: Context is shared with each node downstream. Forks and broadcasts will increase memory usage by a factor equal to the number of concurrent transactions. It is very important to keep a clean context with only what you need in these situations.

Flow Control

Broadcasts are dangerous. In most cases, there is another way to solve the problem.

Due to the nesting limitations of a flow, broadcasting quickly consumes resources and will cause errors.

Events

All flows with a Request node must have at least one Response node.

Data Changes can trigger a lot of flows, so they should be followed immediately by a conditional statement to reduce executions.
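
As a hypothetical sketch (the status field and "Open" value are illustrative and depend on the object being watched), a conditional placed immediately after the data change event can act as a guard:

  status = "Open"

Records that fail the condition exit the flow at the conditional, so only relevant changes execute the remaining nodes.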

Topics, publishing, and subscribing are the preferred methods of handling pagination for large data sets.

Integration

The name of the node must indicate what it is calling.

  • Example: "Calling Plex DS - POs_Get (key:12345)"

The description of the node must include the parameters and a description of the intent.

  • Example: "Pulling POs with status of Open"

Use of $integrate() is highly discouraged in transforms

  • It will most likely be deprecated

  • The intent is to make all outbound calls pink in color for easier reference

It is highly recommended that a native node be used over the HTTP node.

  • Example: Avoid using HTTP to call SalesForce unless the SalesForce node does not meet the need

IIOT

Communication with the Device Gateway can be time-consuming. It is a best practice to verify that tag reads triggered by a flow read multiple tags at once.

Related Articles

  • Data Flow Event Nodes

  • Getting Started With Data Flow Nodes

  • Getting Started With Flow Designer

  • Orchestration

  • Data Flow Models