Most efficient way to export large volumes of data via the API
I’m working with a customer who needs about six months of historical data at a 10-second resolution for approximately 1,200 tags. I'm writing a script using continuation to retrieve the data, but I’d like advice on the most efficient approach from a performance and time standpoint.
Specifically, I’m wondering about:
- Whether to use getTagData or getTagData2
- Optimal maxSize setting
- Querying one tag at a time vs. multiple tags in a batch
- Querying the entire range at once vs. splitting the range into chunks (e.g., one day at a time)
1 reply
Hi,
You would definitely want to use /getTagData2 over /getTagData, as it lets you take advantage of a larger maximum return size (which is the reason that function was updated in the first place). The Max Tvqs Per Request setting in Views > Configuration > Settings controls how many data items are returned in a single request across ALL tags. The default is 1 million, versus the 10,000 limit in /getTagData.
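As a rough illustration, here is a minimal Python sketch of the kind of continuation loop you described, calling /getTagData2 for one tag over the full range. The base URL, the auth token field, and the response field names ("data", "continuation") are assumptions for illustration; only the endpoint name and the maxSize/continuation parameters come from this thread, so check them against your Views API documentation.

```python
# Hedged sketch of a continuation loop against /getTagData2.
# Endpoint URL, auth field, and response shape are assumptions.
import requests

VIEWS_URL = "http://localhost:55235/api/v2"   # hypothetical endpoint
API_TOKEN = "your-api-token"                   # hypothetical auth token

def read_tag_history(tag, start, end, max_size=1_000_000):
    """Pull all TVQs for one tag between start and end, following continuation tokens."""
    tvqs = []
    continuation = None
    while True:
        body = {
            "apiToken": API_TOKEN,        # assumed field name
            "tags": [tag],
            "startTime": start,
            "endTime": end,
            "maxSize": max_size,
            "continuation": continuation,
        }
        resp = requests.post(f"{VIEWS_URL}/getTagData2", json=body, timeout=120)
        resp.raise_for_status()
        payload = resp.json()
        # Assumed response shape: {"data": {tag: [tvq, ...]}, "continuation": ...}
        tvqs.extend(payload.get("data", {}).get(tag, []))
        continuation = payload.get("continuation")
        if not continuation:              # no more pages for this tag
            return tvqs
```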
When it comes to querying the tags, I don't know that you will see a significant performance difference between per-tag and batched requests. If you query one tag at a time, each request can pull a deeper span of updates for that tag, which may mean opening and reading from multiple history files; if you request a small chunk of history across all tags instead, only one file may need to be opened. Your preference may depend more on how you are going to consume the data after it has all been exported.
If you do it in time range batches for ALL tags, as long as you stay under the Max Tvqs Per Request limit, you wouldn't need to worry about passing in maxSize and continuation parameters.
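To make the chunk sizing concrete: 1,200 tags at 10-second resolution produce roughly 432,000 TVQs per hour across all tags, so a 2-hour chunk stays under the default 1,000,000 Max Tvqs Per Request limit and no continuation handling is needed. Below is a hedged sketch of that approach; as above, the endpoint URL, auth field, and response shape are assumptions to verify against your installation.

```python
# Hedged sketch: export all tags in time-range chunks small enough to stay
# under Max Tvqs Per Request, so maxSize/continuation can be omitted.
from datetime import datetime, timedelta
import requests

VIEWS_URL = "http://localhost:55235/api/v2"   # hypothetical endpoint
API_TOKEN = "your-api-token"                   # hypothetical auth token

def export_all_tags(tags, start: datetime, end: datetime,
                    chunk: timedelta = timedelta(hours=2)):
    """Yield (chunk_start, data) for each time chunk across all tags."""
    chunk_start = start
    while chunk_start < end:
        chunk_end = min(chunk_start + chunk, end)
        body = {
            "apiToken": API_TOKEN,                      # assumed field name
            "tags": tags,                               # all tags in one request
            "startTime": chunk_start.isoformat(),
            "endTime": chunk_end.isoformat(),
        }
        resp = requests.post(f"{VIEWS_URL}/getTagData2", json=body, timeout=300)
        resp.raise_for_status()
        yield chunk_start, resp.json().get("data", {})  # assumed response shape
        chunk_start = chunk_end
```

Writing each chunk out (to CSV, Parquet, etc.) as it arrives rather than accumulating six months in memory is probably the safer pattern at this volume.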