Automatic Error Reporting

You can follow along with this article through our Feature Spotlight video.

Being able to monitor, audit, and control data at every stage is essential for minimizing data cost while maximizing data value. Setting up flows that automatically retry errors and promptly send notifications, so that issues are found and resolved faster, should be a fundamental part of data governance. This article walks through an example of how it can be done; follow along with the video above.

There are three main aspects to making errors easy to manage and monitor, even at production scale for large companies:

  1. Global Notifications and Monitoring
  2. Flow Level Errors and Notifications
  3. Validation Rules and Filters

Let’s take a deeper look at each of these aspects.

1. Global Notifications and Monitoring [0:54]

Being able to see errors at a glance is a central part of error reporting. Nexla’s dashboard shows daily read/write status, any current errors, and the overall health of data flows.

From this view, you can easily click into any specific data set, source, or sync to analyze different parts of a flow or to find and address specific errors. You can also set global notification settings to send error notifications to the app, email, Slack channels, or webhooks.

These error notifications can also be exported and summarized into enterprise monitoring tools such as Datadog or PagerDuty. Data Alerts can also be configured to send notifications on data changes, schema changes, or errors.
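As a rough sketch of what routing into an incident tool can look like, the snippet below relays incoming webhook notifications to PagerDuty. The PagerDuty Events API v2 call is that tool's standard trigger endpoint, but the payload fields the handler reads (resource_type, resource_id, message) are assumptions about the notification shape; check your actual payloads before using anything like this.

```python
# Hypothetical relay: receive an error-notification webhook and forward
# it to PagerDuty's Events API v2. The incoming field names are assumed,
# not a documented Nexla payload.
import os

import requests
from flask import Flask, request

app = Flask(__name__)
PAGERDUTY_URL = "https://events.pagerduty.com/v2/enqueue"
ROUTING_KEY = os.environ["PAGERDUTY_ROUTING_KEY"]  # from your PD service

@app.route("/nexla-errors", methods=["POST"])
def forward_error():
    event = request.get_json(force=True)
    # Map the incoming notification onto a PagerDuty alert.
    alert = {
        "routing_key": ROUTING_KEY,
        "event_action": "trigger",
        "payload": {
            "summary": event.get("message", "Data flow error"),
            "source": f'{event.get("resource_type", "flow")}/{event.get("resource_id", "unknown")}',
            "severity": "error",
        },
    }
    requests.post(PAGERDUTY_URL, json=alert, timeout=10)
    return {"status": "forwarded"}, 200

if __name__ == "__main__":
    app.run(port=8080)
```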

Seeing all current errors at a glance makes error handling faster, and automatic notifications can significantly reduce downtime.

2. Flow Level Errors and Notifications [1:56]

Errors can also be checked in specific flows and individual data sets. For example, you can click into any data flow to see exactly where an error is occurring, and within any flow, you can click into specific data syncs.

Then click further into the sync to see where the error is occurring: in this case, the data is failing to upload into the source.

Whatever the cause, whether an operations error, a dropped server, or anything else, it will be reported and documented. The error messages can then be exported in bulk to triage errors.
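As a rough illustration of that triage step, here is a small sketch that groups a bulk error export by error type and resource so the biggest offenders surface first. The CSV column names (error_type, resource_id, message) and file name are assumptions; adjust them to your export's actual schema.

```python
# Sketch: triage a bulk error export by counting (error_type, resource)
# pairs. One root cause often explains thousands of failed records, so
# the most frequent pairs are usually the best place to start.
import csv
from collections import Counter

def triage(path: str, top: int = 10) -> None:
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[(row["error_type"], row["resource_id"])] += 1
    for (error_type, resource_id), n in counts.most_common(top):
        print(f"{n:>8}  {error_type}  on  {resource_id}")

triage("error_export.csv")  # hypothetical export file
```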

Nexla will automatically retry in case the disruption is temporary, then quarantine any records that still fail, without disrupting the flow. The error message is displayed next to the quarantined records.
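For readers who want the mechanics spelled out, the sketch below shows the generic retry-then-quarantine pattern this describes: transient failures are retried with backoff, and records that still fail are set aside with their error message so the rest of the flow keeps moving. It is an illustration of the concept, not Nexla's internal implementation; the retry count and backoff schedule are arbitrary.

```python
# Generic retry-then-quarantine pattern (illustrative only).
import time

MAX_RETRIES = 3  # arbitrary; tune to your tolerance for transient faults

def process_with_quarantine(records, write):
    """Write each record, retrying transient failures; return quarantined ones."""
    quarantined = []
    for record in records:
        for attempt in range(MAX_RETRIES):
            try:
                write(record)
                break  # success, move on to the next record
            except Exception as exc:  # e.g. a dropped connection
                if attempt == MAX_RETRIES - 1:
                    # Keep the error message next to the failed record
                    # so it can be triaged (and replayed) later.
                    quarantined.append({"record": record, "error": str(exc)})
                else:
                    time.sleep(2 ** attempt)  # exponential backoff
    return quarantined
```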

Being able to accurately and rapidly diagnose what is failing, where, and why is critical when dealing with billions or trillions of records.

3. Validation Rules and Filters [3:04]

Validation is also an important part of data monitoring, and Nexla makes it easy to set specific validations to ensure data is monitored as necessary.

This quick demo will demonstrate the process; follow along with the video above.


Nexla makes it easy to set validations and individual rules on flows and data sets to make sure they’re running correctly.

Select the flow and click into the Transform screen, then select the inputs you want to validate.

Let's say email is a required field in this example, so we will set a validation requiring it to be complete.

We will add a secondary validation to make sure the email looks like an email string.

We'll also add a validation that the ID is a positive number.
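Spelled out as code, the three rules in this example (email must be present, must look like an email string, and the ID must be a positive number) reduce to something like the sketch below. It assumes records arrive as Python dictionaries with email and id fields and uses a deliberately loose email regex; Nexla configures the equivalent rules in the UI.

```python
# Sketch of the three validation rules as plain functions; field names
# and the loose email regex are illustrative assumptions.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate(record: dict) -> list[str]:
    errors = []
    email = record.get("email")
    if not email:
        errors.append("email is required but missing")           # completeness
    elif not EMAIL_RE.match(email):
        errors.append(f"email {email!r} is not a valid address")  # format
    record_id = record.get("id")
    if not isinstance(record_id, (int, float)) or record_id <= 0:
        errors.append(f"id {record_id!r} is not a positive number")
    return errors

print(validate({"email": "a@example.com", "id": 42}))  # []
print(validate({"email": "not-an-email", "id": -1}))   # two errors
```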

Now that the rules are set, Nexla will monitor, validate, and notify when data does not pass them. You can then drill down to an individual level to triage and address any errors that occur. When a validation error occurs, it is visible from the flow, and the report on the Error Data tab of the Transform screen lists each error as well as why it occurred.

Once errors are detected, you can set up an error data export directly in the tool. Simply click on the link and set a destination.

With automatic notification and easy reporting, Nexla makes error management a fast and simple process.

Conclusion

Keeping your data flows running and accurate is an essential part of data monitoring and data governance. Being able to set global notifications and batch-export error files prevents downtime, and data validation not only improves current data usage but also increases ingestion quality.

If you’re ready to discuss thorough governance and error reporting as part of a unified modern data solution, get a demo or book your free data mesh consultation today and learn how much more your data can do when everyone can use it securely. For more on data, check out the other articles on Nexla’s blog.
