Complex extraction and transformation of data from two disparate line-of-business applications, neither with an API, loaded into Oracle NetSuite.
The client, a large law firm with offices throughout Australia, approached Wrive to help solve a business problem. The firm had recently deployed Oracle NetSuite because its growth was outpacing what its existing finance system could offer, and it now faced the challenge of extracting financial data from the incumbent systems and loading it into NetSuite on an ongoing basis.
Two line-of-business applications were in use, one of which was ActionStep (a practice management tool widely used in the Australian legal industry). Both applications had their own timesheet, billing and CRM modules. The client required customer, invoice, payment, credit, item and GL code records to be extracted and posted to new ledgers in NetSuite. A solution was needed to extract the current period's data, and all future data, on a scheduled basis in 10-minute increments and post it to NetSuite, mapping fields from source to destination endpoints. This would enable a seamless transition to the new system while enriching the client's master data repository.
While NetSuite has a fairly extensive set of APIs, the two line-of-business applications did not. In fact, neither exposed an invoice API, so the critical data could not be accessed programmatically.
To further complicate matters, there was a third system: a data lake hosted in AWS that held the master record for client data, and which also lacked an API.
As the client had already begun migrating to Azure, Wrive architected a solution on Azure using serverless compute functions to perform the extraction, transformation and loading of data (ETL), together with an event fabric for message brokering. An Azure Event Grid topic exchanged data between systems in a publisher/subscriber model to ensure reliable delivery.
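To illustrate the publisher side of this pattern, the sketch below builds an event in the Azure Event Grid event schema (the `id`, `subject`, `eventType`, `eventTime`, `data` and `dataVersion` fields Event Grid expects). The subject path and event type names are hypothetical; the production system was implemented in .NET, and this Python sketch only shows the shape of the payload a publisher would POST to the topic endpoint.

```python
import json
import uuid
from datetime import datetime, timezone

def make_invoice_event(invoice: dict, source: str) -> dict:
    """Build an event envelope in the Azure Event Grid event schema.

    The subject convention and event type below are illustrative,
    not the names used in the production integration.
    """
    return {
        "id": str(uuid.uuid4()),
        "subject": f"/finance/{source}/invoices/{invoice['invoiceNumber']}",
        "eventType": "Finance.Invoice.Extracted",  # hypothetical event type
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "dataVersion": "1.0",
        "data": invoice,
    }

event = make_invoice_event({"invoiceNumber": "INV-1001", "total": 2500.00}, "actionstep")
payload = json.dumps([event])  # Event Grid topics accept a JSON array of events
```

Because every publisher emits the same envelope, subscribers can filter on `eventType` or `subject` prefixes without knowing anything about the source system.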
With APIs limited or, in some instances, entirely absent, Wrive engaged with ActionStep to enable access to a replica of the live database, on top of which the Hasura GraphQL engine was deployed. This provided an API endpoint over the data and allowed GraphQL queries, relationships and mutations to be built to access the data required for extraction.
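A Hasura endpoint of this kind accepts standard GraphQL-over-HTTP requests: a JSON body containing a `query` string and a `variables` object, POSTed to the `/v1/graphql` route. The table and field names below are hypothetical (the real names depend on the replicated ActionStep schema); the sketch shows how an extraction function might build an incremental query for invoices changed since the last run.

```python
import json

# Hypothetical table and field names; the actual schema is whatever
# the replicated ActionStep database exposes through Hasura.
INVOICES_QUERY = """
query InvoicesSince($since: timestamptz!) {
  invoices(where: {updated_at: {_gte: $since}}) {
    invoice_number
    client_id
    total
    line_items { gl_code amount }
  }
}
"""

def build_graphql_request(since: str) -> dict:
    """Build the JSON body POSTed to the Hasura /v1/graphql endpoint."""
    return {"query": INVOICES_QUERY, "variables": {"since": since}}

body = json.dumps(build_graphql_request("2022-01-01T00:00:00Z"))
```

Filtering on an `updated_at` watermark like this is one way a 10-minute scheduled extraction can pull only the rows that changed since the previous cycle.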
For the secondary line-of-business application, no such privilege was available. Wrive recommended and implemented Microsoft Power Automate, part of the Microsoft 365 ecosystem, to automatically log in to the SaaS application and download the required data from the UI.
Once the data was obtained, it was stored in a blob storage container in Azure, where the first of six Azure Functions was executed to commence transformation and load.
Once the data was accessible in Azure, Wrive developed a series of .NET function apps to perform the transformation and load services, mapping the raw data into a JSON schema.
This schema was then adopted as the organisation's gold-standard integration schema for future automations. It allowed for consistent mapping of fields regardless of the source system, ensuring that future publishers could be added to the event fabric with minimal transformation, if any at all. Once the data was transformed and mapped into the state NetSuite required, an egress function posted it to the NetSuite APIs.
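The value of a canonical schema is that every source system gets its own small mapping function, while everything downstream sees one shape. The sketch below uses hypothetical field names (the real gold-standard schema was defined by the client); the point is that mappers for different sources all emit the same keys.

```python
# Hypothetical canonical fields; the client's actual gold-standard
# schema will differ.
CANONICAL_FIELDS = ("externalId", "customerRef", "amount", "currency", "glCode")

def map_actionstep_invoice(row: dict) -> dict:
    """Map a raw ActionStep row into the canonical integration schema."""
    return {
        "externalId": row["invoice_number"],
        "customerRef": row["client_id"],
        "amount": row["total"],
        "currency": row.get("currency", "AUD"),
        "glCode": row["gl_code"],
    }

def map_secondary_app_invoice(row: dict) -> dict:
    """Map a row exported from the second application's UI download."""
    return {
        "externalId": row["InvoiceNo"],
        "customerRef": row["ClientCode"],
        "amount": float(row["Amount"]),
        "currency": row.get("Currency", "AUD"),
        "glCode": row["GLAccount"],
    }

a = map_actionstep_invoice(
    {"invoice_number": "INV-1", "client_id": "C-9", "total": 100.0, "gl_code": "4000"})
b = map_secondary_app_invoice(
    {"InvoiceNo": "7001", "ClientCode": "C-9", "Amount": "250.50", "GLAccount": "4000"})
```

Because both mappers produce identical keys, the egress function and any future subscriber only ever needs to understand `CANONICAL_FIELDS`.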
Because the .NET functions scale elastically, parallel executions presented challenges with API throttling and load. Wrive implemented rate limiting in code and adopted an asynchronous connectivity pattern to ensure that the receiving APIs did not reject or drop any requests. In addition, extensive error handling and reporting was enabled to confirm that every message posted by the egress functions was received by the endpoint, with any errors or connection issues transmitted to Azure Monitor Application Insights.
Integration with Application Insights as the application performance monitoring (APM) solution formed a critical component of the end-state solution, enabling real-time monitoring of processes, threads, connections, load, performance, errors and retransmissions across the functions and the event fabric.