Insert datapoints into multiple time series
- async AsyncCogniteClient.time_series.data.insert_multiple(datapoints: list[dict[str, str | int | list | Datapoints | DatapointsArray | NodeId]])
Insert datapoints into multiple time series.
Timestamps can be represented as milliseconds since epoch or datetime objects. Note that naive datetimes are interpreted to be in the local timezone (not UTC), adhering to Python conventions for datetime handling.
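Because naive datetimes are read in the machine's local timezone, passing timezone-aware datetimes is the safest option. As a minimal sketch using only the standard library, this shows how an aware datetime maps to the milliseconds-since-epoch form the API also accepts:

```python
from datetime import datetime, timezone

# A timezone-aware datetime is unambiguous; a naive one would be
# interpreted in the local timezone, which varies between machines.
aware = datetime(2018, 1, 1, tzinfo=timezone.utc)

# Milliseconds since the Unix epoch, the other accepted timestamp form.
epoch_ms = int(aware.timestamp() * 1000)
print(epoch_ms)  # 1514764800000
```

Either representation may be used in the `datapoints` payloads shown below.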
Time series support status codes like Good, Uncertain and Bad. You can read more in the Cognite Data Fusion developer documentation on status codes.
- Parameters:
datapoints (list[dict[str, str | int | list | Datapoints | DatapointsArray | NodeId]]) – The datapoints you wish to insert along with the ids of the time series. See examples below.
Note
All datapoints inserted without a status code (or symbol) are assumed to be good (code 0). To mark a value, pass either the status code (int) or the status symbol (str); only one of the two is required. If both are given, they must match, or an API error will be raised.
Datapoints marked bad can take any of the following values: None (missing), NaN, and +/- Infinity. Bad datapoints are also not restricted to the normal numeric range [-1e100, 1e100], i.e. they may hold any valid float64.
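To illustrate, the special values a bad datapoint may carry are ordinary Python floats (or None), constructible with the standard library alone:

```python
import math

# Values permitted for datapoints marked Bad: missing, NaN, +/- Infinity,
# and any valid float64, even outside the normal range [-1e100, 1e100].
bad_values = [None, math.nan, math.inf, -math.inf, 1e200]

assert bad_values[0] is None          # missing value
assert math.isnan(bad_values[1])      # NaN
assert math.isinf(bad_values[2])      # +Infinity
assert bad_values[4] > 1e100          # beyond the normal numeric range
```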
Examples
Your datapoints can be a list of dictionaries, each containing datapoints for a (presumably) different time series. Each dictionary must have the key "datapoints" (containing the data) specified as a Datapoints object, a DatapointsArray object, or a list of either tuples (timestamp, value) or dictionaries, {"timestamp": ts, "value": value}. When passing tuples, the third element is optional and may contain the status code for the datapoint. To pass the status by symbol, the dictionary format must be used.
>>> from cognite.client import CogniteClient
>>> from cognite.client.data_classes.data_modeling import NodeId
>>> from cognite.client.data_classes import StatusCode
>>> from datetime import datetime, timezone
>>> client = CogniteClient()
>>> # async_client = AsyncCogniteClient() # another option
>>> to_insert = [
...     {
...         "id": 1,
...         "datapoints": [
...             (datetime(2018, 1, 1, tzinfo=timezone.utc), 1000),
...             (datetime(2018, 1, 2, tzinfo=timezone.utc), 2000, StatusCode.Good),
...         ],
...     },
...     {
...         "external_id": "foo",
...         "datapoints": [
...             (datetime(2018, 1, 3, tzinfo=timezone.utc), 3000),
...             (datetime(2018, 1, 4, tzinfo=timezone.utc), 4000, StatusCode.Uncertain),
...         ],
...     },
...     {
...         "instance_id": NodeId("my-space", "my-ts-xid"),
...         "datapoints": [
...             (datetime(2018, 1, 5, tzinfo=timezone.utc), 5000),
...             (datetime(2018, 1, 6, tzinfo=timezone.utc), None, StatusCode.Bad),
...         ],
...     },
... ]
Passing datapoints using the dictionary format with timestamp given in milliseconds since epoch:
>>> import math
>>> to_insert.append(
...     {
...         "external_id": "bar",
...         "datapoints": [
...             {"timestamp": 170000000, "value": 7000},
...             {
...                 "timestamp": 180000000,
...                 "value": 8000,
...                 "status": {"symbol": "Uncertain"},
...             },
...             {
...                 "timestamp": 190000000,
...                 "value": None,
...                 "status": {"code": StatusCode.Bad},
...             },
...             {
...                 "timestamp": 200000000,
...                 "value": math.inf,
...                 "status": {"code": StatusCode.Bad, "symbol": "Bad"},
...             },
...         ],
...     }
... )
If the Datapoints or DatapointsArray objects were fetched with status codes, these are automatically used in the insert:
>>> data_to_clone = client.time_series.data.retrieve(
...     external_id="bar", include_status=True, ignore_bad_datapoints=False
... )
>>> to_insert.append({"external_id": "bar-clone", "datapoints": data_to_clone})
>>> client.time_series.data.insert_multiple(to_insert)