IngeniBridge bridges your Time Series Historian Data Lake to intelligence

You Model

You begin by modeling your asset descriptions, the relations between asset elements, the measures, and the assets that carry them.

You do not build the full target model in a single pass: the modeling process is progressive, proceeding by iterations. You decide how much content goes into each iteration; that content is generally guided by the functionality to be delivered to users in the next version. Deagital will accompany you during this phase with the help of one of its data architects.
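As a hypothetical illustration of such a first iteration (the class and field names below are our own sketch, not IngeniBridge's actual metamodel API), a model can describe assets, the relations between them, and the measures they carry:

```python
from dataclasses import dataclass, field

@dataclass
class Measure:
    """A time-series measure carried by an asset (e.g. a flow rate)."""
    name: str
    unit: str
    historian_tag: str  # tag identifying the series in the historian

@dataclass
class Asset:
    """An asset element of the model, e.g. a pump or a station."""
    asset_id: str
    kind: str
    measures: list = field(default_factory=list)
    children: list = field(default_factory=list)  # "is part of" relations

    def add_child(self, child: "Asset") -> None:
        self.children.append(child)

# First iteration: only the content needed for the next delivered functionality.
station = Asset("ST-01", "pumping_station")
pump = Asset("ST-01-P1", "pump",
             measures=[Measure("flow", "m3/h", "HIST.ST01.P1.FLOW")])
station.add_child(pump)
```

Later iterations would then extend this structure with new asset kinds, measures, and relations as new user functionality is delivered.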

You Consolidate

Next, you consolidate the information about the assets and measures that already exists in the enterprise. This information may be found in the Geographic Information System or in any inventory medium created by field staff. If the information is not yet digitized, the IT staff will help choose an inventory medium, such as Excel worksheets. The actual consolidation into the model is carried out by an integration script. Deagital will accompany you during this phase with the help of one of its data architects.
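A minimal sketch of such an integration script, assuming a hypothetical CSV export of the inventory worksheet (the column names and the dictionary model are our assumptions, not a fixed IngeniBridge format):

```python
import csv
import io

def consolidate(csv_text: str) -> dict:
    """Load an inventory export (asset_id, kind, parent_id) into a model dict."""
    assets = {}
    parents = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        assets[row["asset_id"]] = {"kind": row["kind"], "children": []}
        parents[row["asset_id"]] = row["parent_id"] or None
    # Resolve parent/child relations once all assets are registered.
    for asset_id, parent_id in parents.items():
        if parent_id:
            assets[parent_id]["children"].append(asset_id)
    return assets

# Example: a tiny inventory worksheet exported as CSV.
inventory = """asset_id,kind,parent_id
ST-01,pumping_station,
ST-01-P1,pump,ST-01
"""
model = consolidate(inventory)
```

In practice the script would read the real export file and validate each row against the model before loading it.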

You Bring Intelligence into Your System, to the Level of Your Smart Ambitions

The multidimensional query engine is automatically fed with all the functional dimensions created from your metamodel, so your data lake is now functionally described in a 360° view across the enterprise. You can now bring intelligence functionality into your system, such as:

- Key performance indicators

- Reports and balances

- Dashboards contextualized on the GIS (Geographic Information System)

- Governance of timed data acquisition from autonomous sensors (IoT devices) in the network

- Detection of bad values

- Field operations on assets

- Correlation of threshold values across different points in the network

- And more...
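As one hypothetical example of the bad-value detection mentioned above (the threshold rule is our own sketch, not IngeniBridge's built-in algorithm), out-of-range samples in a historian time series can be flagged with a simple plausibility check:

```python
def flag_bad_values(samples, low, high):
    """Return the indices of samples outside the plausible [low, high] range."""
    return [i for i, value in enumerate(samples) if not low <= value <= high]

# Flow-rate samples from the historian; -999.0 is a common sensor error code.
flow = [120.5, 121.0, -999.0, 119.8, 5000.0]
bad = flag_bad_values(flow, low=0.0, high=500.0)  # indices 2 and 4
```

Real deployments would draw the plausible range per measure from the model itself, so the same rule applies across every sensor of a given kind.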

IngeniBridge becomes the bridge that links the data coming from your production tool and drives its operational performance.


How do you get started? Contact us and we will propose a demonstration of our tools. They come with a methodology framework and data architecture expertise, all within a global and consistent offer.