2015-10-22 MarkLogic User Group

Next Generation Data Models – How Data Point Modeling Really Works – Presentation to MarkLogic User Group London, 2015-10-22

This presentation to the MarkLogic User Group in London will focus on what the Data Point Model (DPM) is and how to exploit it within a semantic-technology-enabled environment.


Next Generation Data Models – How Data Point Modeling Really Works – Greg Soulsby, ModelDR

• How to develop an enterprise data model without complex extract, transform and load (ETL). You will learn how to eliminate the need for a data warehouse. We will use the OpenGamma trading platform as an example of a complex source system.

• Why an enterprise-wide data model built on industry standards like FIBO and Data Point Modeling creates an agile enterprise architecture, using MiFID II trade lifecycle reporting as the use case.

• How to run real-time queries across your entire systems estate. We will use the Large Trade report as the example.

• How to support change, such as new reports, new viewpoints for new actors and changing data schemas.

• How to manage data quality by exploiting the Data Point Model approach to concerns such as bitemporality, governance, timeliness, completeness and accuracy.

A Turning Point for your Data: how to think about your data in a NoSQL world

Thursday, Oct 22, 2015, 5:00 PM

Etc Venues, Norton Folgate – Liverpool Street,
Bishopsgate Court London, E1 6DQ, GB

45 Members Attending

MarkLogic’s schema-agnostic design lets you load data as-is and use it right away, without having to design a master schema that can accommodate all your data in advance. And you can get great value from MarkLogic without ever doing any data modeling at all. But if you want to get the most out of your data, it helps to understand what’s in it. Much…


Sustainable compliance with latest financial regulations – Webinar with MarkLogic

Pulling together accurate reports that meet regulators’ changing demands is a clarion call for bodies and resources to “unwind data” and aggregate its metadata. Failure to do so could mean penalties, fines and loss of trust.

But more resources and endless ETL can actually create more data uncertainties and inaccuracies. In this 60-minute webinar, ModelDR’s Greg Soulsby and Simon Roberts join MarkLogic colleagues Chris Atkinson, Rupert Brown and Diane Burley to discuss a new and sustainable data aggregation process.

Designed for data architects and managers, the webinar will equip you for further discussion with interest groups in your organization. You will learn key terms such as:

  • semantics
  • data point modeling
  • bitemporal
  • congruent panoply

We hope you can join us for this “show, don’t tell” event.

Dodd Frank Record Keeping Rules – a lot of data for banks to untangle!


Bank systems have thousands of data attributes, delivered by hundreds of internal and external sources, all stored in dozens of unconnected databases. This fragmentation results in a continual challenge of mapping, cross-referencing and manual reconciliation, further exacerbated by the problem of common terms that have different meanings, common meanings that use different terms and vague definitions that are not captured in recognised financial taxonomies.

The Dodd Frank record keeping rules are directed towards full ‘end-to-end’ transaction monitoring and require an unambiguous data management strategy with full data ownership and accountability.

Dodd Frank record keeping requirements can only be met by a data architecture that untangles firms’ silo-based data systems and harmonizes the data aggregation process. Systems must be ‘wired up’ to a data architecture, designed in Data Point Model format, that has been updated with the latest regulatory taxonomies.

The Data Point Model ‘wiring up’ process is depicted in a static example form below.

ModelDR untangles the data, imports the regulatory taxonomies and builds a new design architecture. It creates a clear and complete view of ‘in house’ data and allows data quality testing consistent with current regulatory standards.

Dodd Frank Data Point Model