How to on Best Practices for Designing Flows in Fuuz

About

The following standards are requirements that must be met before deploying a Data Flow in Fuuz.
Violations are acceptable while you are working on a flow; just resolve them all prior to requesting a production deployment.

The intent of all these standards is to:
  1. Improve the consistency of flow designs
  2. Provide inline documentation for future reference
  3. Increase the efficiency of support by others who haven't been involved
  4. Track all changes for visibility

Naming

Naming the flow is very important.  The format is as follows:
{#1 Primary Fuuz Object} {#2 direction or verb(s)} {#3 External Platform} {#4 Name of Primary External Object}{#5 Event Type}
  1. The primary Fuuz Object data is being pulled from or pushed into
  2. "From", "To", or verbs like "Badge In/Out" or "Workcenter Mode on Data Change"
  3. "NetSuite", "Plex", "Plex (API)", "QuickBooks", "RedZone", etc. (Note: Plex and Plex (API) are very different sources)
  4. "Bin", "Location", "Ledger" (This is only required if the Fuuz Object name is not the same as the external object's)
  5. "on Data Change" (This is the only event type that is currently required)

Examples:
  1. Locations From NetSuite Bins (This represents a standard integration that sends locations to Fuuz from NetSuite's list of Bins)
  2. Locations To NetSuite Bins on Data Change (This represents a standard integration that sends locations to NetSuite's list of Bins from Fuuz when updates are captured with data change events)
  3. Facilities From NetSuite Locations (This is another integration from NetSuite and highlights that our objects don't line up with NetSuite's)
  4. WorkOrder Process Defaults From Fuuz Default Product Strategy Processes on Data Change (This flow prepopulates WorkOrder Process records with a copy from the default Product Strategy Process tables on a data change event, specifically the "add" event)
Note: Renaming a flow doesn't change the flow's ID.  If you need to rename the flow, you will need to export the contents into an empty flow with the new name.
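
The naming convention above can also be checked mechanically. Below is a minimal Python sketch of such a check; it is purely illustrative (Fuuz does not ship this validator) and only enforces a capitalized first token, space-separated name parts, and therefore also accepts the optional "on Data Change" suffix:

```python
import re

# Illustrative checker for the naming convention described above:
# {Fuuz Object} {direction/verb} {External Platform} [{External Object}] [on Data Change]
NAME_PATTERN = re.compile(
    r"^[A-Z][A-Za-z]*"     # primary Fuuz Object, e.g. "Locations"
    r"(?: [A-Za-z/()]+)+"  # direction/verb, platform, external object, event type
    r"$"
)

def is_valid_flow_name(name: str) -> bool:
    """Return True if the flow name loosely matches the convention."""
    return bool(NAME_PATTERN.match(name))
```

For example, "Locations From NetSuite Bins" passes, while a default title such as "my flow" fails the capitalized-token check.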

Documentation

  1. Every flow needs a meaningful description of what it does.  This includes connections used and an overall summary.
  2. Every version needs a description of the changes being made.  If you make additional changes after setting the initial description, please update it.
  3. Every node needs a meaningful description.  Period.  No exceptions.
  4. Every flow must have a walkthrough.
  5. Every walkthrough must include all nodes in logical order.
  6. This walkthrough will be used during the review and deployment process.
  7. Take Note: The configured walkthrough is stored directly inside the flow.  You must save the flow after configuring the walkthrough.

General

  1. All nodes must have a unique and meaningful name (Think about error log messages in the future)
  2. Nodes cannot use the default name even if that name is unique
  3. All nodes with a connection on the left-hand side must be connected to another node

Debug Nodes

  1. Descriptions are important.  Use them to highlight specific test cases
  2. Test cases are required to prove your testing prior to deployment

Conditionals

  1. Conditionals should be very descriptive
  2. Conditionals cannot use double negatives (ex: $not(false))
  3. It is strongly recommended to use If/Else in most situations, even if the else condition goes nowhere. 
    1. (Note: If this recommendation results in a double negative to continue on a true path, consider using the else path instead.)
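
The double-negative rule can be pictured outside of Fuuz. The sketch below is plain Python, not Fuuz conditional syntax, and the "skipped" field is a hypothetical example:

```python
def route_record(record: dict) -> str:
    """Illustrates the conditional rules above in plain Python.

    Rather than a double negative such as $not(skipped = false), which a
    reader must mentally un-negate twice, state the positive condition
    and use the else branch for the opposite path.
    """
    if record.get("skipped"):   # positive, readable condition
        return "skip-path"
    else:                       # explicit else, even if it only continues the flow
        return "process-path"
```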

Context

  1. Label everything in context with a wrapping variable.  Never set or merge context with a transform that is just "$"; instead use something like "{plexResponse:$}"
  2. If your context has a lot of unnecessary data, prune it prior to storage.
  3. If your context is large and no longer needed, remove it.
  4. Warning: Context is shared with each node downstream.  Forks and broadcasts will significantly increase memory usage by a factor equal to the number of concurrent transactions.  It is very important to keep a clean context with only what you need during these situations. 
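
The context rules above can be pictured as ordinary dictionary handling. The following is a Python analogy only, not Fuuz transform syntax; "plexResponse" mirrors the example above, and the pruned field names are hypothetical:

```python
def merge_into_context(context: dict, response: dict) -> dict:
    """Analogy for the context rules above in plain Python.

    1. Wrap the payload under a label ("plexResponse") instead of merging
       it at the top level, so downstream nodes know where it came from.
    2. Prune to the fields downstream nodes actually need ("id" and
       "status" here are hypothetical), keeping the shared context small
       across forks and broadcasts.
    """
    pruned = {k: response[k] for k in ("id", "status") if k in response}
    return {**context, "plexResponse": pruned}
```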

Flow Control

  1. Broadcasts are dangerous; in most cases there is another way to solve the problem
  2. Keep in mind the nesting limitations of a flow: broadcasting more broadcasts quickly burns up your resources and will cause errors
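
One common alternative to nested broadcasts is an explicit work queue: instead of each branch broadcasting its own children (which nests and multiplies concurrent transactions), a single loop drains a queue. A Python sketch of the idea, illustrative only and not Fuuz node configuration; the "name"/"children" shape is hypothetical:

```python
from collections import deque

def process_tree(root: dict) -> list:
    """Processes a nested structure with one queue instead of
    broadcast-inside-broadcast, so resource usage stays flat no
    matter how deeply the data nests.
    """
    results = []
    queue = deque([root])
    while queue:
        node = queue.popleft()
        results.append(node["name"])             # per-item work happens here
        queue.extend(node.get("children", ()))   # enqueue instead of re-broadcasting
    return results
```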

Events

  1. All flows with a Request node must have at least one Response node
  2. Data Changes can trigger a lot of flows; they should be followed immediately by a conditional statement to reduce executions
  3. Topics, publishing, and subscribing are the preferred method of handling pagination of large data sets
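
The topic-based pagination pattern can be sketched in plain Python. In this illustration, `fetch_page(offset, limit)` is a hypothetical stand-in for an external data source and a `Queue` stands in for a Fuuz topic; the publishing flow emits one message per page so a subscribing flow can process each page as a separate, small execution instead of one huge payload:

```python
import queue

def paginate_to_topic(fetch_page, topic, page_size=100):
    """Publish a large data set to a topic one page at a time.

    `fetch_page(offset, limit)` and the topic are hypothetical
    stand-ins; returns the number of pages published.
    """
    offset, pages = 0, 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:                # no more data: stop publishing
            break
        topic.put(page)             # "publish" the page for the subscriber flow
        offset += page_size
        pages += 1
    return pages
```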

Integration

IIOT

  1. Communication with the Device Gateway can be time consuming; it is recommended that tag reads triggered by a flow read multiple tags at once.
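
The batching advice above boils down to one round trip for N tags instead of N round trips. A Python sketch, where `gateway_read` is a hypothetical stand-in for the real Device Gateway API that takes a list of tag names and returns (name, value) pairs:

```python
def read_tags_batched(gateway_read, tag_names):
    """Read many tags in a single gateway round trip.

    `gateway_read` is a hypothetical callable standing in for the real
    gateway API; since communication dominates the cost, one batched
    call is far cheaper than one call per tag.
    """
    return dict(gateway_read(tag_names))
```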

Related Articles

  1. How to Create APIs Using Data Flows in Fuuz
  2. How To General E-Commerce Integrations using Data Flows in Fuuz iPaaS
  3. Python and Fuuz Data Flows
  4. How to Integrate with an HR system like ADP using Fuuz iPaaS
  5. Debugging and Testing our Fuuz Data Flows