Asset health: doctor blade

Context

Doctor blades are used in printing to remove excess ink from a cylinder, leaving a uniform layer of ink.

A blade that is incorrectly sharpened, poorly adjusted, or damaged can cause millions of dollars in asset damage.

In a nutshell, what Quasar delivered

  • Greatly increased accuracy of prediction and models by capturing the full waveform data and performing spectral analysis.
  • A shorter model-to-production loop that empowers data analysts and reliability engineers
  • 1,000X faster queries compared to a data warehousing solution while using fewer cloud resources
  • Storage footprint divided by 10
  • Future-proof solution thanks to built-in scalability

Challenge

Today, the blade is changed or adjusted based on a technician's hunch, experience, or current practice.

However, collecting the vibration data coming out of the sensors makes it possible to identify common patterns present before problems happen.

The goal is to have a dashboard with precise information regarding the imminence of a problem and its type. In addition, it is possible with this approach to find the optimal time to change the doctor blade, preventing asset damage and optimizing parts inventory.

However, achieving this level of accuracy requires working on the entire waveform of the vibration data and performing a frequency analysis. Complete waveform data typically runs to several thousand points per second per sensor, resulting in massive data sets.
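To give a sense of scale, here is a back-of-the-envelope sizing in Python. The 25.6 kHz sample rate, the sensor count, and the bytes-per-sample figure are illustrative assumptions, not numbers from this case study, which only states "several thousand points per second per sensor":

```python
# Back-of-the-envelope sizing of raw waveform data.
# Assumptions (not from the case study): 25.6 kHz sampling, 100 sensors,
# 8 bytes per raw sample before compression.
sample_rate = 25_600            # points per second per sensor
sensors = 100
bytes_per_point = 8

points_per_day = sample_rate * 86_400 * sensors
gb_per_day = points_per_day * bytes_per_point / 1e9

print(f"{points_per_day:,} points/day, ~{gb_per_day:,.0f} GB/day raw")
```

Even with these modest assumptions, a single plant produces hundreds of billions of points and well over a terabyte of raw data per day, which is why compression and write throughput dominate the design.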

The other challenge is that problems are sometimes identified just minutes before they happen, which means it is imperative to keep end-to-end data update latency below one minute.

That update latency and write volume disqualify data warehousing technology immediately.

An approach that solves the write volume problem is to store the data in files and then have ad-hoc programs perform the feature extraction. This, however, means that each analysis may have a high CPU time cost, and it requires writing new programs if the feature extraction needs to change.

Finally, these programs must be highly efficient to not add to the data latency. If it takes 15 minutes to extract the feature, the information may come too late to take action.

The solution

Using Quasar, firms can capture the complete waveform data using dedicated connectors. The capture can happen at the edge or via MQTT.

Quasar will use its unique algorithm to compress the data faster than it arrives and use its micro-indexes to reduce future lookup and feature extraction time. This results in significantly reduced disk usage without impacting the queryability of the data.

Each sensor's data is stored in a dedicated time series table. Tables can be tagged at will, enabling a flexible querying mechanism based on those tags: for example, retrieving all the sensors of an asset, or all the sensors of a mill.

Quasar has built-in feature extraction capabilities that leverage the SIMD capabilities of the processor. It can, in real time, perform complex statistical analysis, pivot the data, compute FFTs, and much more.

This means that feature extraction does not require ad-hoc programs and can be tailored by writing a simple SQL query.
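Inside Quasar this extraction is expressed as a SQL query; the sketch below shows, in plain NumPy, the kind of spectral features such a query computes. The function name, the 25.6 kHz sample rate, and the 120 Hz test tone are illustrative assumptions:

```python
import numpy as np

def spectral_features(signal: np.ndarray, sample_rate: float) -> dict:
    """Extract simple frequency-domain features from one waveform window."""
    n = len(signal)
    # Windowed magnitude spectrum of the real-valued signal
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)
    peak = np.argmax(spectrum[1:]) + 1          # skip the DC bin
    return {
        "rms": float(np.sqrt(np.mean(signal ** 2))),
        "peak_freq_hz": float(freqs[peak]),     # dominant vibration frequency
        "peak_amplitude": float(spectrum[peak]),
    }

# One second of synthetic vibration: a 120 Hz tone plus noise, sampled at 25.6 kHz
fs = 25_600
t = np.arange(fs) / fs
wave = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.default_rng(0).normal(size=fs)
features = spectral_features(wave, fs)
```

A shift in the dominant frequency or in band energy over time is the kind of signal a dashboard would surface as an early warning.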

Quasar can keep a deep history, meaning that the same system is used for operational reporting and building the strategy. This significantly reduces the time between devising a new approach and deploying it in production.

Benefits

  • Firms can perform advanced data analysis to protect their asset health and optimize their parts inventory, increasing competitiveness
  • Capture the entire waveform without any limitation in history depth, increasing accuracy and thus potential gains
  • Devising and deploying new strategies is extremely fast since Quasar contains both the history and the live data, increasing the productivity of scarce data science resources
  • Feature extraction does not require writing custom programs, increasing flexibility
  • Outstanding TCO: extracting features does not require writing custom programs and uses less CPU. Up to 20X less storage space is needed compared to a typical data warehousing solution, and feature extraction is up to 1,000X faster than with data lakes


Asset health: ball bearing

Context

The papermaking process uses spherical roller bearings. These bearings require proper lubrication and maintenance; failure to provide it can result in millions of dollars in damage to assets.

In a nutshell, what Quasar delivered

  • Greatly increased accuracy of prediction and models by capturing the full waveform data and performing spectral analysis.
  • A shorter model-to-production loop that empowers data analysts and reliability engineers
  • 1,000X faster queries compared to a data warehousing solution while using fewer cloud resources
  • Storage footprint divided by 10
  • Future-proof solution thanks to built-in scalability

Challenge

Technicians need to inspect the machine visually and check its behavior to know if the bearing needs to be changed. When an asset breaks down, it can take a very long time for the technician to understand the root cause.

By collecting the vibration data coming out of the sensors, it's possible to identify typical patterns before a problem happens. The patterns can also help distinguish between possible root causes of a problem.

The goal is to have a dashboard with precise information regarding the imminence of a problem and its type. In addition, it is possible with this approach to anticipate wear and tear long enough to schedule a downtime instead of dealing with a forced interruption.

However, achieving this level of accuracy requires working on the entire waveform of the vibration data and performing a frequency analysis. Complete waveform data typically runs to several thousand points per second per sensor, resulting in massive data sets.

The other challenge is that problems are sometimes identified just minutes before they happen, which means it is imperative to keep end-to-end data update latency below one minute.

That update latency and write volume disqualify data warehousing technology immediately.

An approach that solves the write volume problem is to store the data in files and then have ad-hoc programs perform the feature extraction. This, however, means that each analysis may have a high CPU time cost, and it requires writing new programs if the feature extraction needs to change.

Finally, these programs must be highly efficient to not add to the data latency. If it takes 15 minutes to extract the feature, the information may come too late to take action.

The solution

Using Quasar, firms can capture the complete waveform data using dedicated connectors. The capture can happen at the edge or via MQTT.

Quasar will use its unique algorithm to compress the data faster than it arrives and use its micro-indexes to reduce future lookup and feature extraction time. This results in significantly reduced disk usage without impacting the queryability of the data.

Each sensor's data is stored in a dedicated time series table. Tables can be tagged at will, enabling a flexible querying mechanism based on those tags: for example, retrieving all the sensors of an asset, or all the sensors of a mill.

Quasar has built-in feature extraction capabilities that leverage the SIMD capabilities of the processor. It can, in real time, perform complex statistical analysis, pivot the data, compute FFTs, and much more.

This means that feature extraction does not require ad-hoc programs and can be tailored by writing a simple SQL query.
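Distinguishing root causes from the spectrum rests on classic bearing diagnostics: each failing part produces a characteristic defect frequency. The formulas below (BPFO and BPFI, the ball pass frequencies of the outer and inner race) are textbook results, not anything Quasar-specific, and the geometry numbers in the example are invented:

```python
import math

def bearing_fault_freqs(n_balls: int, shaft_hz: float, ball_d: float,
                        pitch_d: float, contact_deg: float = 0.0) -> dict:
    """Classic defect frequencies for a rolling-element bearing.

    A vibration-spectrum peak near one of these frequencies points to a
    specific failing component (outer race vs. inner race).
    """
    ratio = (ball_d / pitch_d) * math.cos(math.radians(contact_deg))
    return {
        "bpfo_hz": (n_balls / 2) * shaft_hz * (1 - ratio),  # outer-race defect
        "bpfi_hz": (n_balls / 2) * shaft_hz * (1 + ratio),  # inner-race defect
    }

# Example: 9 rollers, 30 Hz shaft speed, 8 mm balls on a 40 mm pitch circle
freqs = bearing_fault_freqs(9, 30.0, 8.0, 40.0)
```

Comparing the extracted spectral peaks against these computed frequencies is how a dashboard can report not just that a problem is imminent, but which part is failing.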

Benefits

  • Firms can perform advanced data analysis to protect their asset health and schedule maintenance, resulting in millions of dollars in savings per year
  • Capture the entire waveform without any limitation in history depth, increasing accuracy and thus potential gains
  • Devising and deploying new strategies is extremely fast since Quasar contains both the history and the live data, increasing the productivity of scarce data science resources
  • Feature extraction does not require writing custom programs, increasing flexibility
  • Outstanding TCO: extracting features does not require writing custom programs and uses less CPU. Up to 20X less storage space is needed compared to a typical data warehousing solution, and feature extraction is up to 1,000X faster than with data lakes

ITCH data: order book building

Context

Market data mainly consists of orders and trades.

The order book is necessary to evaluate how orders may be filled and is a powerful tool to identify actors' intentions.

On top of that, reconciling orders with trades is a legal requirement: firms must demonstrate best execution.

The order book is the list of buys and sells organized by price levels. It provides valuable trading information, which helps traders and improves market transparency.
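For readers unfamiliar with the structure, a price-level book can be sketched minimally as follows; the class and method names are ours, not part of any Quasar or Nasdaq API:

```python
from collections import defaultdict

class OrderBook:
    """Minimal price-level order book: aggregates resting shares by price."""

    def __init__(self):
        self.bids = defaultdict(int)   # price -> total resting shares (buys)
        self.asks = defaultdict(int)   # price -> total resting shares (sells)
        self.orders = {}               # order_ref -> (side, price, shares)

    def add(self, ref, side, price, shares):
        book = self.bids if side == "B" else self.asks
        book[price] += shares
        self.orders[ref] = (side, price, shares)

    def remove(self, ref, shares):
        """Execution or cancellation of `shares` against an existing order."""
        side, price, remaining = self.orders[ref]
        book = self.bids if side == "B" else self.asks
        book[price] -= shares
        if book[price] <= 0:
            del book[price]
        remaining -= shares
        if remaining <= 0:
            del self.orders[ref]
        else:
            self.orders[ref] = (side, price, remaining)

    def best_bid(self):
        return max(self.bids) if self.bids else None

    def best_ask(self):
        return min(self.asks) if self.asks else None
```

Rebuilding the book at any instant means replaying every add, cancel, and execution message since the market open up to that instant, which is precisely why replay speed over the raw message history matters.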

In a nutshell, what Quasar delivered

  • Deployment of complete Nasdaq market data capture, from start to finish, in less than a week
  • Instant order book rebuilding at any point in the day while using less cloud resources
  • Storage footprint divided by 10
  • Future-proof solution thanks to built-in scalability

Challenge

Nasdaq enables subscribers to track the status of each order, from the moment it is entered until it is either executed or canceled, via the ITCH protocol. Consuming this feed thus requires software capable of decoding the protocol.

Once the data is decoded, one needs to store that data in a form that can be later processed to rebuild the order book, which means processing every order since the opening of the markets.

Lastly, it is necessary to match orders with executed trades to ensure that best execution has been respected. This means parsing every order and finding the best match at the time of the trade.

Nasdaq data volumes are in the region of 2 TB per day.

These volumes make it impossible to rely on standard data warehousing technologies.

Typically, firms will store data in files processed by custom-made software for historical data analysis and keep one or two days of data in a dedicated operational database.

The solution

Quasar has an ITCH data importer that can decode and capture the data at very high speed without any intermediary conversion. This ensures data capture is as fast and efficient as possible.
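To illustrate what decoding involves, here is a minimal parser for one ITCH message type, the 'A' (Add Order, no MPID attribution) message. The field widths and the 1/10000-dollar price scale follow the public Nasdaq TotalView-ITCH 5.0 specification; the function name and dictionary keys are ours:

```python
import struct

def parse_add_order(payload: bytes) -> dict:
    """Decode a Nasdaq ITCH 5.0 'A' (Add Order, no MPID) message: 36 bytes, big-endian."""
    if payload[0:1] != b"A" or len(payload) != 36:
        raise ValueError("not an Add Order message")
    locate, tracking = struct.unpack(">HH", payload[1:5])
    timestamp_ns = int.from_bytes(payload[5:11], "big")      # ns since midnight
    ref, side, shares, stock, price = struct.unpack(">QcI8sI", payload[11:36])
    return {
        "locate": locate,
        "timestamp_ns": timestamp_ns,
        "order_ref": ref,
        "side": side.decode(),                               # 'B' buy, 'S' sell
        "shares": shares,
        "symbol": stock.decode().strip(),                    # right-padded with spaces
        "price": price / 10_000,                             # price is in 1/10000 USD
    }
```

A full capture loop would additionally read framed messages off the feed and dispatch on the type byte; this sketch covers only the decoding of a single message type.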

Data is compressed using Quasar's optimized compression algorithm, significantly reducing the disk footprint of the history.

Market data is organized as two time series tables per security: one for the trades and one for the orders. Using tags, it is possible to group different tables in any query. This organization, made possible by Quasar's tagging capability, increases compression efficiency and significantly boosts querying speed.

Benefits

  • Equity trading firms get instant vision on the order book and best execution without relying on custom development or third-party tools, increasing alpha
  • All market data (history and current day) is contained in the same system, significantly reducing errors and increasing productivity and, thus, competitiveness
  • Compatibility with all standard analytical tool suites for maximum flexibility
  • Outstanding TCO thanks to the combination of data compression and rapid order book rebuilding, saving on disk storage and CPU power