
Exporting Large Volumes of Raw Historian Data Efficiently via Canary API

I'm working on exporting a large amount of raw historian data using the Canary API (/getData2) over a long time range. Because of the per-call tag count limit, the number of calls required is very high, which makes the overall process slow.

While I’m considering parallelizing the API requests, I’m a bit concerned about whether the on-premises server running Canary can handle the additional load from the Python script running alongside it.

To reduce the export runtime, I’m exploring the following options:

  1. Increase the interval per API call – Instead of requesting one day of data per tag per call, I could request larger time spans.

  2. Parallelize API requests – Distribute the calls across multiple threads or processes.

  3. Combine both approaches – Increase time intervals and run requests in parallel.
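As a rough sketch of options 1 and 3 combined (all names here are hypothetical: `fetch_window` is a stand-in for the real /getData2 request, and the window size and worker count are assumptions you'd tune against your server):

```python
from concurrent.futures import ThreadPoolExecutor
from datetime import datetime, timedelta

def make_windows(start, end, days=7):
    """Split [start, end) into larger time windows (option 1)."""
    windows, cur = [], start
    while cur < end:
        nxt = min(cur + timedelta(days=days), end)
        windows.append((cur, nxt))
        cur = nxt
    return windows

def fetch_window(tag, window):
    """Placeholder for the real /getData2 call for one tag and window."""
    start, end = window
    return f"{tag}: {start:%Y-%m-%d} -> {end:%Y-%m-%d}"

def export(tags, start, end, max_workers=4):
    """Run the calls in parallel (option 2) with a small, capped pool
    so the on-prem server isn't flooded."""
    jobs = [(t, w) for t in tags for w in make_windows(start, end)]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(lambda j: fetch_window(*j), jobs))

results = export(["Tag.A", "Tag.B"],
                 datetime(2024, 1, 1), datetime(2024, 1, 15))
print(len(results))  # 2 tags x 2 seven-day windows = 4 calls
```

Keeping `max_workers` small is the key knob: it bounds how many requests hit the server at once regardless of how many (tag, window) jobs the export generates.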

I've gone through this post, which is very helpful, but my main concern is avoiding overloading Canary on the production server.
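One simple guard against overloading the production server is to put a shared rate limiter in front of the worker threads. This is only a sketch under assumptions: `rate_per_sec` is a made-up cap to tune, and `throttled_call` stands in for the real API request:

```python
import threading
import time

class RateLimiter:
    """Minimal token-bucket-style limiter: at most rate_per_sec
    acquisitions per second, shared across all worker threads."""
    def __init__(self, rate_per_sec):
        self.interval = 1.0 / rate_per_sec
        self.lock = threading.Lock()
        self.next_ok = time.monotonic()

    def acquire(self):
        with self.lock:
            now = time.monotonic()
            wait = self.next_ok - now
            # reserve the next slot before releasing the lock
            self.next_ok = max(now, self.next_ok) + self.interval
        if wait > 0:
            time.sleep(wait)

limiter = RateLimiter(rate_per_sec=5)  # assumed cap; tune per server load

def throttled_call(tag):
    limiter.acquire()   # blocks until a request slot is free
    return tag          # placeholder for the real /getData2 call

start = time.monotonic()
out = [throttled_call(f"Tag{i}") for i in range(10)]
elapsed = time.monotonic() - start
print(f"{len(out)} calls in {elapsed:.1f}s")  # ~1.8 s at 5 calls/sec
```

Because the limiter is shared, it caps the total request rate no matter how many threads or processes you add, so you can scale the parallelism up or down independently of the load ceiling.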

 

I'd much appreciate any suggestions or experiences others have had with something similar.
