Manufacturing Process Historian

Case Study


Manufacturing plants use historians, such as GE Historian, to display process data, such as energy consumption, on dashboards, or to feed that data into analytical tools to build models.

These models help analyze the plant's operating conditions, for example by finding patterns of energy consumption that correlate with known incidents of product defects.

In a nutshell, what Quasar delivered

  • Completely removed the limits historians impose on history depth, data volume, and query complexity
  • Virtually unlimited history
  • No change to the reporting tools: Quasar replaces the backend transparently
  • Storage footprint divided by 20
  • Future-proof solution thanks to built-in scalability


Historians struggle quickly as the depth of history grows. To build an accurate model, one may need to pull more than a year of data, a volume historians cannot manage even for smaller plants.

That means users must choose between history depth and sampling accuracy to stay within the constraints of the historian.

That makes it very difficult for data scientists to analyze energy consumption in detail, resulting in inaccurate or incorrect models.

The solution

Quasar connects to the historian through its high-speed ingestion API, capturing all the data as it arrives without any loss of quality or precision.
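The ingestion pattern described above can be sketched in plain Python. This is an illustrative sketch only: `historian_feed` and `write_batch` are hypothetical stand-ins (not the real historian or Quasar client APIs), showing the batching pattern a high-speed ingestion API relies on to keep pace with the incoming stream.

```python
import random

def historian_feed(n_samples):
    """Hypothetical stand-in for a historian's streaming output:
    a feed of (timestamp, sensor_id, value) samples."""
    t = 1_700_000_000.0
    for _ in range(n_samples):
        t += 0.1
        yield (t, f"sensor_{random.randint(1, 4)}", random.uniform(0.0, 100.0))

def write_batch(batch):
    # Placeholder: a real deployment would hand the batch to the
    # database client here, in a single call.
    return len(batch)

def ingest(feed, batch_size=500):
    """Accumulate samples into batches and flush each batch in one call,
    so per-sample overhead never becomes the bottleneck."""
    written = 0
    batch = []
    for sample in feed:
        batch.append(sample)
        if len(batch) >= batch_size:
            written += write_batch(batch)
            batch.clear()
    if batch:  # flush the final, partial batch
        written += write_batch(batch)
    return written

print(ingest(historian_feed(1_234)))  # 1234 — every sample written, no loss
```

The point of the sketch is that batching, not per-sample round trips, is what lets ingestion keep up with the feed without dropping or downsampling data.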

Quasar uses its unique algorithm to compress the data faster than it arrives, and its micro-indexes to reduce future lookup and feature-extraction time. This results in significantly reduced disk usage without impacting the queryability of the data.

Each sensor is stored in a dedicated table holding timeseries data. Tables can be tagged at will, enabling a flexible querying mechanism based on those tags, for example retrieving all the sensors of an asset or of a specific area of the plant.
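A tag-based lookup like the one described above can be expressed as an SQL-like query. The sketch below builds such a query string; the `find(tag=...)` selector and `IN RANGE` clause are modeled on Quasar's query language, but the exact syntax varies by version, so treat this as an illustration rather than a verbatim API reference (the tag names and dates are invented).

```python
def sensors_query(tags, start, end):
    """Build an SQL-like query selecting every sensor table that carries
    all of the given tags, restricted to a time range."""
    selector = " AND ".join(f"tag='{t}'" for t in tags)
    return f"SELECT * FROM find({selector}) IN RANGE({start}, {end})"

# Hypothetical example: all energy sensors tagged as belonging to press line 3.
q = sensors_query(["press_line_3", "energy"], "2024-01-01", "2024-02-01")
print(q)
# SELECT * FROM find(tag='press_line_3' AND tag='energy') IN RANGE(2024-01-01, 2024-02-01)
```

Because the tables themselves carry the tags, adding a sensor to an asset or area is a metadata change, not a schema change, which is what makes the querying mechanism flexible.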

Once Quasar has collected all the information, it serves every query at the highest resolution available.

Energy data, for example, can be shown at full resolution, something that was not possible with a historian without compromising dashboard responsiveness.


  • Build accurate models to improve process quality
  • Complete data precision to improve the accuracy of the models
  • Circumvent historian limitations without replacing them, minimizing IT infrastructure disruption while enabling next-generation AI
  • Outstanding TCO: feature extraction requires no custom programs and uses less CPU; up to 20X less storage space is needed than with a typical data warehousing solution, and feature extraction is up to 1,000X faster than with data lakes
