Data Migration Overview
Data Migration is the process of selecting, preparing, extracting, and transforming data, then permanently transferring it from one computer storage system to another. It typically comes into scope for Fenergo clients when either (or both) of the following scenarios applies:
- An existing system is being retired or replaced with the Fenergo SaaS platform, and the data on that system is required to support ongoing operations.
- The Fenergo SaaS platform is required to begin Day 1 of its operations with an initial seed of data to support operations.
Data Migration is not required if the SaaS platform is being introduced in a greenfield scenario, or if a client wishes to begin operations on the new platform by creating only new data.
Data Migration Strategy
The end state of a Data Migration is to have existing data from a source client data store loaded into the Fenergo SaaS platform. The strategy pursued is unique to each client, assessed on a client-by-client basis, and can be shaped by factors such as:
- Where the data is stored: Clients will need to have the ability to extract the data from its current location.
- The quality / accuracy / completeness of that data: Clients need to be able to analyze the data and determine if any pre-migration data clean-up or remediation is required.
- Compatibility with future data model: The structure and formatting of the existing data may no longer meet business needs and a future data model will potentially contain data elements which do not yet exist or else exist in a legacy structure / format.
- Accessibility of that data: The data could be in use and constantly changing, so the timing of a migration along with any pre-migration activity requires planning to support business operations throughout and after the migration process.
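The quality and completeness checks described above can be automated ahead of a migration. The sketch below is illustrative only: the required fields, date format, and record shape are assumptions, not a FenX data model.

```python
from datetime import datetime

# Assumed set of mandatory fields for a migration-ready record.
REQUIRED_FIELDS = {"legal_name", "country_of_incorporation", "registration_date"}

def assess_records(records):
    """Flag records needing pre-migration clean-up.

    Returns a list of (record_id, issues) pairs; a record with an
    empty issue list is considered migration-ready.
    """
    report = []
    for rec in records:
        issues = []
        # Completeness: every required field must be present and non-empty.
        for field in REQUIRED_FIELDS:
            if not rec.get(field):
                issues.append(f"missing: {field}")
        # Accuracy: dates must parse in the expected format.
        reg_date = rec.get("registration_date")
        if reg_date:
            try:
                datetime.strptime(reg_date, "%Y-%m-%d")
            except ValueError:
                issues.append(f"bad date format: {reg_date}")
        report.append((rec.get("id"), issues))
    return report
```

A report like this can feed a remediation backlog, so clean-up happens in the source system before any extract is taken.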
Fenergo technical specialists can advise on planning and preparation when engaging with clients, ensuring the chosen strategy is compatible with and aligned to client operational needs. The FenX platform offers API services which clients can use to design a Data Migration Strategy and to perform the migration itself.
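An API-driven migration typically maps each legacy record into the payload shape the target platform expects, then pushes it through the API. The sketch below shows the mapping step; the field names, endpoint path, and `migrationSource` tag are illustrative assumptions, not the actual FenX API contract.

```python
# Hypothetical mapping from a legacy schema to a FenX-style entity payload.
FIELD_MAP = {
    "cust_name": "legalName",
    "ctry": "countryOfIncorporation",
    "inc_dt": "registrationDate",
}

def to_fenx_payload(legacy_record):
    """Translate one legacy row into the shape an entity-creation API expects."""
    payload = {target: legacy_record[source]
               for source, target in FIELD_MAP.items()
               if source in legacy_record}
    # Tag the record so migrated data can be distinguished post-cutover.
    payload["migrationSource"] = "legacy-crm"
    return payload

def migrate(records, post):
    """Push each mapped record via an injected `post` callable.

    Injecting the transport keeps the sketch testable without a live
    endpoint; in practice `post` would wrap an authenticated HTTP client.
    """
    return [post("/api/entities", to_fenx_payload(r)) for r in records]
```

Keeping the mapping table separate from the transport makes it easy to review with business stakeholders and to rerun against successive data extracts during dry runs.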
Data Mastering
When assessing client needs, Fenergo does so in the context of a broad technical landscape: not only the SaaS platform is considered, but also the suite of software platforms which make up a client's full technical landscape.
To ensure data consistency across the technical landscape, clients typically adhere to a Data Mastering strategy (which may be referred to in other terms). At its simplest, this means one of the systems is treated as the golden source, or authoritative system of record, also known as the Data Master. This concept introduces several scenarios which need to be considered from a design perspective when deploying a new platform into a technical landscape, and it will dictate how interactions between systems should be implemented. Some typical scenarios and their considerations are:
Data Mastering Scenarios
SCENARIO: Fenergo as DATA MASTER
DESIGN CONSIDERATION:
With FenX implemented as a Data Master, client systems within a landscape will need to read from FenX for an authoritative version of a Client Record before making any decisions or performing any actions based on that data set.
If those systems persist local copies of data, they will need to either synchronize those copies in real time or refresh on demand before working on the data.
Once an update or action is performed on a data record, that update needs to be sent to FenX to ensure it has the most up to date version of data.
Data conflict scenarios must be examined with appropriate compensation strategies in place to ensure no dirty reads / writes occur with data.
As an event-driven platform, FenX can communicate when changes to the state of data occur, and this can be used to inform interested downstream systems that an update to a data record has happened.
OUTCOME:
FenX stores data in two distinct states, "Verified" and "Draft". Our perspective is that the Verified record is an authoritative source, while a Draft record represents an "in-flight", not-yet-approved version of the data.
Clients need to examine their use cases to determine which updates are pertinent in which circumstances and make sure they push / pull information at appropriate times.
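The event-driven pattern described in this scenario can be sketched as a downstream consumer that refreshes its local copy only when the authoritative state changes. The event shape, handler, and cache below are illustrative assumptions; only the "Verified"/"Draft" distinction comes from the platform description above.

```python
class LocalCache:
    """A downstream system's local copy of records mastered in FenX."""

    def __init__(self):
        self.records = {}

    def refresh(self, entity_id, fetch):
        # Re-read the authoritative copy rather than trusting the event body,
        # avoiding dirty reads if events arrive out of order.
        self.records[entity_id] = fetch(entity_id)

def handle_event(event, cache, fetch):
    """Refresh the local copy only for Verified (authoritative) changes.

    Draft events represent in-flight, not-yet-approved data, so a system
    reading for decision-making can safely ignore them.
    """
    if event["state"] == "Verified":
        cache.refresh(event["entityId"], fetch)
        return True
    return False
```

Whether Draft events should also be consumed depends on the use case analysis the outcome above calls for; some systems may want early visibility of in-flight changes.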
SCENARIO: Fenergo as DATA SLAVE
DESIGN CONSIDERATION:
With FenX implemented as a Data Slave, FenX will need to read from an authoritative version of a Client Record before performing any actions based on that data record.
FenX does not directly monitor changes in client systems, so clients will need to create a programmatic mechanism to identify when data changes in the Data Master and push that change to FenX, or potentially refresh data on demand when FenX is about to update data (via a journey).
In this scenario, the FenX copy of Verified Data might not need to maintain a fully accurate sync with the Data Master, because it is not considered a source of truth; instead, it can update local data as part of how it creates journeys.
Once FenX has updated data and saved that update to its Verified store, it can raise an event which can be used as a trigger to push that update to the Data Master.
Data conflict scenarios will need to be considered; facilities are in place to cancel journeys or restart tasks in FenX, but analysis is needed to ensure such scenarios are catered for.
OUTCOME:
The two states in FenX can offer a lot of flexibility for clients to examine their use cases and determine where data refreshes should happen.
As journeys themselves can be open for extended periods of time, clients will need to consider how updates are sequenced and where those updates are applied, e.g. as a journey is created, while it is in flight, or as direct updates to Verified Data.
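The sequencing concern above (journeys open for long periods, updates arriving out of order at the Data Master) can be guarded against with a simple version check on the master side. The version field and store interface below are assumptions for illustration, not a FenX or master-system API.

```python
class MasterStore:
    """Illustrative Data Master that accepts pushed updates from FenX."""

    def __init__(self):
        self.records = {}

    def apply(self, entity_id, data, version):
        """Apply an update only if it is newer than what is already held.

        A journey may have been open a long time, so an event carrying an
        older version can arrive after a newer one; stale updates are
        ignored rather than allowed to overwrite fresher data.
        """
        current = self.records.get(entity_id)
        if current and current["version"] >= version:
            return False
        self.records[entity_id] = {"version": version, "data": data}
        return True
```

Monotonic versions (or event timestamps) are one compensation strategy for the conflict scenarios noted above; clients should confirm what ordering guarantees their eventing infrastructure actually provides.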
SCENARIO: Fenergo as PARTIAL DATA MASTER (HYBRID)
DESIGN CONSIDERATION:
FenX is legal-entity-centric, meaning it focuses on the data record related to the Legal Entity and the KYC/AML data related to that entity. Other client systems might focus on (and be a master of) different properties of a record, such as account balances, billing, or transaction monitoring. Clients may need to derive a way to ensure correctness of parts of a data record across multiple systems.
FenX offers a lot of flexibility around the creation of Policies (the data) and Journeys (where data is collected), so a strategy could potentially be put in place that masters a subset of the data while operating as a slave for a different subset.
OUTCOME:
Hybrid strategies can be more complex to support as they often introduce more corner cases or situations where parts of data can fall out of sync.
Care must be taken not to impede operations focused on one part of a data record because of changes which occur in an unrelated area.
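One way to keep a hybrid strategy tractable is an explicit field-level ownership map, so each system's updates are accepted only for the fields it masters. The field names and system identifiers below are illustrative assumptions; the ownership split (entity/KYC data in FenX, balances elsewhere) mirrors the example in this scenario.

```python
# Assumed ownership map: FenX masters entity/KYC fields, while a
# hypothetical core-banking system masters account balances.
OWNERSHIP = {
    "legalName": "fenx",
    "kycStatus": "fenx",
    "accountBalance": "core-banking",
}

def merge_record(updates_by_system):
    """Build a composite view, accepting each field only from its owner.

    Updates to fields a system does not own are dropped, so a change
    pushed from one area cannot clobber data mastered elsewhere.
    """
    merged = {}
    for system, fields in updates_by_system.items():
        for field, value in fields.items():
            if OWNERSHIP.get(field) == system:
                merged[field] = value
    return merged
```

Making the ownership map explicit also gives a single place to review when the corner cases mentioned above arise, rather than ownership rules being scattered across integrations.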