Upload file content in multiple parts

async AsyncCogniteClient.files.multipart_upload_content_session(
parts: int,
external_id: str | None = None,
instance_id: NodeId | None = None,
) → FileMultipartUploadSession

Begin a multipart upload of content for a file whose metadata has already been created in CDF.

This allows uploading files larger than 5 GiB. Note that the size of each part may not exceed 4000 MiB, and the size of each part except the last must be greater than 5 MiB.

The file chunks may be uploaded in any order and in parallel, but the client must ensure the parts end up in the correct order by uploading each chunk to the upload URL for its part number.
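Given the limits above, a caller typically plans part sizes up front. Below is a minimal sketch of such planning in pure Python; `plan_parts` and its defaults are hypothetical helpers for illustration, not part of the SDK:

```python
# Documented limits: every part except the last must be at least 5 MiB,
# no part may exceed 4000 MiB, and there can be at most 250 parts.
MIN_PART = 5 * 1024 * 1024
MAX_PART = 4000 * 1024 * 1024
MAX_PARTS = 250

def plan_parts(total_size: int, part_size: int = 100 * 1024 * 1024) -> list[int]:
    """Return part sizes covering total_size, in upload order.

    Only the final part may be smaller than part_size (and smaller
    than 5 MiB), which the documented limits allow.
    """
    if not MIN_PART <= part_size <= MAX_PART:
        raise ValueError("part_size outside the allowed 5 MiB - 4000 MiB range")
    sizes = [part_size] * (total_size // part_size)
    remainder = total_size % part_size
    if remainder:
        sizes.append(remainder)
    if len(sizes) > MAX_PARTS:
        raise ValueError("file too large for the chosen part_size (max 250 parts)")
    return sizes
```

Each entry of the returned list maps to one part number, so the chunks can then be uploaded in any order, or concurrently, as long as part numbers are assigned from this ordered list.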

This method returns a context manager, which you must enter (using the with keyword) and then call upload_part for each part before exiting. Async usage is also supported: enter the session with async with and call await upload_part_async instead.

Parameters:
  • parts (int) – The number of parts to upload, must be between 1 and 250.

  • external_id (str | None) – The external ID provided by the client. Must be unique within the project.

  • instance_id (NodeId | None) – Instance ID of the file.

Returns:

Object containing metadata about the created file, and information needed to upload the file content. Use this object to manage the file upload, and exit it once all parts are uploaded.

Return type:

FileMultipartUploadSession

Examples

Upload binary data in two chunks:

>>> from cognite.client import CogniteClient, AsyncCogniteClient
>>> client = CogniteClient()
>>> # async_client = AsyncCogniteClient()  # another option
>>> with client.files.multipart_upload_content_session(
...     external_id="external-id", parts=2
... ) as session:
...     # Note that the minimum chunk size is 5 MiB.
...     session.upload_part(0, "hello" * 1_200_000)
...     session.upload_part(1, " world")
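The same flow in async form, per the async usage described above. This is a hedged sketch: it assumes a configured AsyncCogniteClient and the upload_part_async method mentioned in this page; two_parts is a hypothetical helper for splitting the payload:

```python
import asyncio  # drive the coroutine with e.g. asyncio.run(upload(data))

def two_parts(data: bytes, split_at: int) -> list[bytes]:
    # Split data into two ordered chunks. All parts except the last
    # must be greater than 5 MiB, so split_at should reflect that.
    return [data[:split_at], data[split_at:]]

async def upload(data: bytes) -> None:
    # Requires valid CDF credentials/configuration to actually run.
    from cognite.client import AsyncCogniteClient
    client = AsyncCogniteClient()
    chunks = two_parts(data, 6 * 1024 * 1024)
    async with client.files.multipart_upload_content_session(
        external_id="external-id", parts=len(chunks)
    ) as session:
        for part_no, chunk in enumerate(chunks):
            await session.upload_part_async(part_no, chunk)
```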