core ¤

Asset dataclass ¤

Asset(
    rid: str,
    name: str,
    description: str | None,
    properties: Mapping[str, str],
    labels: Sequence[str],
    _clients: _Clients,
)

Bases: HasRid

nominal_url property ¤

nominal_url: str

Returns a link to the page for this Asset in the Nominal app

add_attachments ¤

add_attachments(
    attachments: Iterable[Attachment] | Iterable[str],
) -> None

Add attachments that have already been uploaded to this asset.

attachments can be Attachment instances, or attachment RIDs.

add_connection ¤

add_connection(
    data_scope_name: str,
    connection: Connection | str,
    *,
    series_tags: Mapping[str, str] | None = None
) -> None

Add a connection to this asset.

Assets map a data scope name (the connection's name within the asset) to a Connection (or connection RID). The same type of connection should use the same data scope name across assets, since checklists and templates use data scope names to reference connections.

Parameters:

  • data_scope_name ¤

    (str) –

    logical name for the data scope within the asset

  • connection ¤

    (Connection | str) –

    connection to add to the asset

  • series_tags ¤

    (Mapping[str, str] | None, default: None ) –

    Key-value tags to pre-filter the connection with before adding to the asset.
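
For example, a minimal sketch (the asset handle, connection RID, and tag values here are hypothetical):

asset = client.get_asset("asset_rid")
asset.add_connection(
    "telemetry",                       # data scope name shared across assets
    "connection_rid",                  # a Connection instance or its RID
    series_tags={"vehicle": "sn-01"},  # pre-filter the connection before adding
)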

add_dataset ¤

add_dataset(
    data_scope_name: str,
    dataset: Dataset | str,
    *,
    series_tags: Mapping[str, str] | None = None
) -> None

Add a dataset to this asset.

Assets map "data_scope_name" (their name within the asset) to a Dataset (or dataset rid). The same type of datasets should use the same data scope name across assets, since checklists and templates use data scope names to reference datasets.

Parameters:

  • data_scope_name ¤

    (str) –

    logical name for the data scope within the asset

  • dataset ¤

    (Dataset | str) –

    dataset to add to the asset

  • series_tags ¤

    (Mapping[str, str] | None, default: None ) –

    Key-value tags to pre-filter the dataset with before adding to the asset.

add_log_set ¤

add_log_set(data_scope_name: str, log_set: LogSet | str) -> None

Add a log set to this asset.

Log sets map "ref names" (their name within the run) to a Log set (or log set rid).

add_video ¤

add_video(data_scope_name: str, video: Video | str) -> None

Add a video to this asset.

Assets map "data_scope_name" (name within the asset for the data) to a Video (or a video rid). The same type of videos (e.g., files from a given camera) should use the same data scope name across assets, since checklists and templates use data scope names to reference videos.

archive ¤

archive() -> None

Archive this asset. Archived assets are not deleted, but are hidden from the UI.

get_connection ¤

get_connection(data_scope_name: str) -> Connection

Retrieve a connection by data scope name, or raise ValueError if one is not found.

get_data_scope ¤

get_data_scope(data_scope_name: str) -> ScopeType

Retrieve a data scope by data scope name, or raise ValueError if one is not found.

get_dataset ¤

get_dataset(data_scope_name: str) -> Dataset

Retrieve a dataset by data scope name, or raise ValueError if one is not found.

get_or_create_dataset ¤

get_or_create_dataset(
    data_scope_name: str,
    *,
    name: str | None = None,
    description: str | None = None,
    labels: Sequence[str] = (),
    properties: Mapping[str, str] | None = None
) -> Dataset

Retrieve a dataset by data scope name, or create a new one if it does not exist.
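
A minimal sketch (the data scope name and metadata values are illustrative):

dataset = asset.get_or_create_dataset(
    "telemetry",
    name="Telemetry",
    description="Primary telemetry for this asset",
    labels=["flight-test"],
)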

get_video ¤

get_video(data_scope_name: str) -> Video

Retrieve a video by data scope name, or raise ValueError if one is not found.

list_connections ¤

list_connections() -> Sequence[tuple[str, Connection]]

List the connections associated with this asset. Returns (data_scope_name, connection) pairs for each connection.

list_data_scopes ¤

list_data_scopes() -> Sequence[tuple[str, ScopeType]]

List scopes associated with this asset. Returns (data_scope_name, scope) pairs, where scope can be a dataset, connection, video, or logset.
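
For example, a sketch that prints every scope attached to an asset (assuming an Asset handle named asset):

for data_scope_name, scope in asset.list_data_scopes():
    # scope is a Dataset, Connection, Video, or LogSet
    print(data_scope_name, type(scope).__name__)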

list_datasets ¤

list_datasets() -> Sequence[tuple[str, Dataset]]

List the datasets associated with this asset. Returns (data_scope_name, dataset) pairs for each dataset.

list_logsets ¤

list_logsets() -> Sequence[tuple[str, LogSet]]

List the logsets associated with this asset. Returns (data_scope_name, logset) pairs for each logset.

list_videos ¤

list_videos() -> Sequence[tuple[str, Video]]

List the videos associated with this asset. Returns (data_scope_name, video) pairs for each video.

remove_attachments ¤

remove_attachments(
    attachments: Iterable[Attachment] | Iterable[str],
) -> None

Remove attachments from this asset. Does not remove the attachments from Nominal.

attachments can be Attachment instances, or attachment RIDs.

remove_data_scopes ¤

remove_data_scopes(
    *,
    names: Sequence[str] | None = None,
    scopes: Sequence[ScopeType | str] | None = None
) -> None

Remove data scopes from this asset.

names are data scope names; scopes are scope objects or RIDs.

unarchive ¤

unarchive() -> None

Unarchive this asset, allowing it to be viewed in the UI.

update ¤

update(
    *,
    name: str | None = None,
    description: str | None = None,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] | None = None,
    links: Sequence[str] | Sequence[Link] | None = None
) -> Self

Replace asset metadata. Updates the current instance, and returns it. Only the metadata passed in will be replaced, the rest will remain untouched.

Links can be URLs or tuples of (URL, name).

Note: This replaces the metadata rather than appending it. To append to labels or properties, merge them before calling this method. E.g.:

new_labels = ["new-label-a", "new-label-b"]
for old_label in asset.labels:
    new_labels.append(old_label)
asset = asset.update(labels=new_labels)

Attachment dataclass ¤

Attachment(
    rid: str,
    name: str,
    description: str,
    properties: Mapping[str, str],
    labels: Sequence[str],
    _clients: _Clients,
)

Bases: HasRid

archive ¤

archive() -> None

Archive this attachment. Archived attachments are not deleted, but are hidden from the UI.

get_contents ¤

get_contents() -> BinaryIO

Retrieve the contents of this attachment. Returns a file-like object in binary mode for reading.
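
For example, a sketch that reads the raw bytes (see write below for saving directly to disk):

data = attachment.get_contents().read()
print(f"{attachment.name}: {len(data)} bytes")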

unarchive ¤

unarchive() -> None

Unarchive this attachment, allowing it to be viewed in the UI.

update ¤

update(
    *,
    name: str | None = None,
    description: str | None = None,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] | None = None
) -> Self

Replace attachment metadata. Updates the current instance, and returns it.

Only the metadata passed in will be replaced, the rest will remain untouched.

Note: This replaces the metadata rather than appending it. To append to labels or properties, merge them before calling this method. E.g.:

new_labels = ["new-label-a", "new-label-b", *attachment.labels]
attachment = attachment.update(labels=new_labels)

write ¤

write(path: Path, mkdir: bool = True) -> None

Write an attachment to the filesystem.

path should be the path you want to save to, i.e. a file, not a directory.

Channel dataclass ¤

Channel(
    name: str,
    data_source: str,
    data_type: ChannelDataType | None,
    unit: str | None,
    description: str | None,
    _clients: _Clients,
    _rid: str,
)

Metadata for working with channels.

rid property ¤

rid: str

Get the rid value with a deprecation warning.

update ¤

update(
    *,
    description: str | None = None,
    unit: UnitLike | _NotProvided = _NotProvided()
) -> Self

Replace channel metadata within Nominal, updating and returning the local instance.

Only the metadata passed in will be replaced, the rest will remain untouched.

Parameters:

  • description ¤

    (str | None, default: None ) –

    Human-readable description of data within the channel

  • unit ¤

    (UnitLike | _NotProvided, default: _NotProvided() ) –

    Unit symbol to apply to the channel. If unit is a string or a Unit, this will update the unit symbol for the channel. If unit is None, this will clear the unit symbol for the channel. If not provided (or _NotProvided), this will leave the unit unaffected. NOTE: this is in contrast to other fields in other update() calls where None is treated as a "no-op".
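
A sketch of the three unit behaviors described above (the unit symbol and description are illustrative):

channel = channel.update(unit="m/s")                  # set the unit symbol
channel = channel.update(unit=None)                   # clear the unit symbol
channel = channel.update(description="Ground speed")  # unit left unaffected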

Checklist dataclass ¤

Checklist(
    rid: str,
    name: str,
    description: str,
    properties: Mapping[str, str],
    labels: Sequence[str],
    _clients: _Clients,
)

Bases: HasRid

archive ¤

archive() -> None

Archive this checklist. Archived checklists are not deleted, but are hidden from the UI.

execute ¤

execute(run: Run | str, commit: str | None = None) -> DataReview

Execute a checklist against a run.

Parameters:

  • run ¤

    (Run | str) –

    Run (or its rid) to execute the checklist against

  • commit ¤

    (str | None, default: None ) –

    Commit hash of the version of the checklist to run, or None for the latest version

Returns:

  • DataReview

    Created datareview for the checklist execution
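
A minimal sketch combining execute with the DataReview methods documented below (run is assumed to be an existing Run handle):

review = checklist.execute(run)        # run may also be a run RID
review = review.poll_for_completion()  # block until checks have finished
for violation in review.get_violations():
    print(violation)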

execute_streaming ¤

execute_streaming(
    assets: Sequence[Asset | str],
    integration_rids: Sequence[str],
    *,
    evaluation_delay: timedelta = timedelta(),
    recovery_delay: timedelta = timedelta(seconds=15)
) -> None

Execute the checklist for the given assets.

assets: Can be Asset instances, or Asset RIDs.
integration_rids: Checklist violations will be sent to the specified integrations. At least one
    integration must be specified. See https://app.gov.nominal.io/settings/integrations for a
    list of available integrations.
evaluation_delay: Delays the evaluation of the streaming checklist. This is useful for when
    data is delayed.
recovery_delay: Specifies the minimum amount of time that must pass before a check can recover
    from a failure. Minimum value is 15 seconds.

reload_streaming ¤

reload_streaming() -> None

Reload the checklist.

stop_streaming ¤

stop_streaming() -> None

Stop the checklist.

stop_streaming_for_assets ¤

stop_streaming_for_assets(assets: Sequence[Asset | str]) -> None

Stop the checklist for the given assets.

unarchive ¤

unarchive() -> None

Unarchive this checklist, allowing it to be viewed in the UI.

Connection dataclass ¤

Connection(
    rid: str, _clients: _Clients, name: str, description: str | None
)

Bases: DataSource

archive ¤

archive() -> None

Archive this connection. Archived connections are not deleted, but are hidden from the UI.

get_channels ¤

get_channels(
    *, names: Iterable[str] | None = None
) -> Iterable[Channel]

Look up the metadata for all matching channels associated with this datasource


names: List of channel names to look up metadata for.

Yields a sequence of channel metadata objects which match the provided query parameters

get_write_stream ¤

get_write_stream(
    batch_size: int = 50000,
    max_wait: timedelta = timedelta(seconds=1),
    data_format: Literal["json", "protobuf", "experimental"] = "json",
) -> WriteStreamBase

Stream to write messages to a datasource.

Messages are written asynchronously.


batch_size: How big the batch can get before writing to Nominal.
max_wait: How long a batch can exist before being flushed to Nominal.
data_format: Serialized data format to use during upload.
    NOTE: selecting 'protobuf' or 'experimental' requires that `nominal` was installed
          with `protos` extras.

Write stream object configured to send data to nominal. This may be used as a context manager
(so that resources are automatically released upon exiting the context), or if not used as a context
manager, should be explicitly `close()`-ed once no longer needed.
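
A minimal sketch of the context-manager form (channel name, timestamp, and tags are illustrative):

with connection.get_write_stream() as stream:
    stream.enqueue(
        "temperature",
        "2024-01-01T00:00:00Z",
        42.0,
        tags={"sensor": "a"},
    )
# the stream is closed (and pending batches flushed) on exiting the context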

search_channels ¤

search_channels(
    exact_match: Sequence[str] = (), fuzzy_search_text: str = ""
) -> Iterable[Channel]

Look up channels associated with a datasource.

Parameters:

  • exact_match ¤

    (Sequence[str], default: () ) –

    Filter the returned channels to those whose names match all provided strings (case insensitive).

  • fuzzy_search_text ¤

    (str, default: '' ) –

    Filters the returned channels to those whose names fuzzily match the provided string.

Yields:

  • Channel

    Channel metadata objects which match the provided query parameters.

set_channel_prefix_tree ¤

set_channel_prefix_tree(delimiter: str = '.') -> None

Index channels hierarchically by a given delimiter.

Primarily, the result of this operation is to prompt the frontend to represent channels in a tree-like manner that allows folding channels by common roots.

set_channel_units ¤

set_channel_units(
    channels_to_units: UnitMapping,
    validate_schema: bool = False,
    allow_display_only_units: bool = False,
) -> None

Set units for channels based on a provided mapping of channel names to units.


channels_to_units: A mapping of channel names to unit symbols.
    NOTE: any existing units may be cleared from a channel by providing None as a symbol.
validate_schema: If true, raises a ValueError if non-existent channel names are provided in
    `channels_to_units`. Default is False.
allow_display_only_units: If true, allow units that would be treated as display-only by Nominal.

Raises:

ValueError: Unsupported unit symbol provided.
conjure_python_client.ConjureHTTPError: Error completing requests.
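
For example, a sketch (channel names are hypothetical; "m/s" is a unit symbol of the kind shown elsewhere on this page):

connection.set_channel_units(
    {
        "vehicle_speed": "m/s",  # set a unit symbol
        "status_flag": None,     # clear any existing unit
    },
    validate_schema=True,  # raise if a channel name does not exist
)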

unarchive ¤

unarchive() -> None

Unarchive this connection, making it visible in the UI.

DataReview dataclass ¤

DataReview(
    rid: str,
    run_rid: str,
    checklist_rid: str,
    checklist_commit: str,
    completed: bool,
    _clients: _Clients,
)

Bases: HasRid

nominal_url property ¤

nominal_url: str

Returns a link to the page for this Data Review in the Nominal app

archive ¤

archive() -> None

Archive this data review. Archived data reviews are not deleted, but are hidden from the UI.

NOTE: it is not currently possible to unarchive a data review once archived.

get_violations ¤

get_violations() -> Sequence[CheckViolation]

Retrieves the list of check violations for the data review.

poll_for_completion ¤

poll_for_completion(
    interval: timedelta = timedelta(seconds=2),
) -> DataReview

Polls the data review until it is completed.

reload ¤

reload() -> DataReview

Reloads the data review from the server.

DataReviewBuilder dataclass ¤

DataReviewBuilder(
    _integration_rids: list[str],
    _requests: list[CreateDataReviewRequest],
    _clients: _Clients,
)

initiate ¤

Initiates a batch data review process.

Parameters:

  • wait_for_completion ¤

    (bool, default: True ) –

    If True, waits for the data review process to complete before returning. Default is True.

Dataset dataclass ¤

Dataset(
    rid: str,
    _clients: _Clients,
    name: str,
    description: str | None,
    properties: Mapping[str, str],
    labels: Sequence[str],
    bounds: DatasetBounds | None,
)

Bases: DataSource

nominal_url property ¤

nominal_url: str

Returns a URL to the page in the Nominal app containing this dataset

add_ardupilot_dataflash ¤

add_ardupilot_dataflash(path: Path | str) -> None

Add a Dataflash file to an existing dataset.

add_from_io ¤

add_from_io(
    dataset: BinaryIO,
    timestamp_column: str,
    timestamp_type: _AnyTimestampType,
    file_type: tuple[str, str] | FileType = CSV,
    file_name: str | None = None,
    tag_columns: Mapping[str, str] | None = None,
    tags: Mapping[str, str] | None = None,
) -> None

Append to a dataset from a file-like object.

Parameters:

  • dataset ¤

    (BinaryIO) –

    a file-like object containing the data to append to the dataset.

  • timestamp_column ¤

    (str) –

    the column in the dataset that contains the timestamp data.

  • timestamp_type ¤

    (_AnyTimestampType) –

    the type of timestamp data in the dataset.

  • file_type ¤

    (tuple[str, str] | FileType, default: CSV ) –

    a (extension, mimetype) pair describing the type of file.

  • file_name ¤

    (str | None, default: None ) –

    the name of the file to upload.

  • tag_columns ¤

    (Mapping[str, str] | None, default: None ) –

    a dictionary mapping tag keys to column names.

  • tags ¤

    (Mapping[str, str] | None, default: None ) –

    key-value pairs to apply as tags to all data uniformly in the file
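
A minimal sketch uploading an in-memory CSV (the column names and 'epoch_seconds' timestamp type are illustrative):

import io

csv_data = b"timestamp,temperature\n1700000000,42.0\n"
dataset.add_from_io(
    io.BytesIO(csv_data),
    timestamp_column="timestamp",
    timestamp_type="epoch_seconds",
)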

add_journal_json ¤

add_journal_json(path: Path | str) -> None

Add a journald jsonl file to an existing dataset.

add_mcap ¤

add_mcap(
    path: Path | str,
    include_topics: Iterable[str] | None = None,
    exclude_topics: Iterable[str] | None = None,
) -> None

Add an MCAP file to an existing dataset.


path: Path to the MCAP file to add to this dataset
include_topics: If present, list of topics to restrict ingestion to.
    If not present, defaults to all protobuf-encoded topics present in the MCAP.
exclude_topics: If present, list of topics to not ingest from the MCAP.

add_mcap_from_io ¤

add_mcap_from_io(
    mcap: BinaryIO,
    include_topics: Iterable[str] | None = None,
    exclude_topics: Iterable[str] | None = None,
    file_name: str | None = None,
) -> None

Add data to this dataset from an MCAP file-like object.

The mcap must be a file-like object in binary mode, e.g. open(path, "rb") or io.BytesIO. If the file is not in binary-mode, the requests library blocks indefinitely.


mcap: Binary file-like MCAP stream
include_topics: If present, list of topics to restrict ingestion to.
    If not present, defaults to all protobuf-encoded topics present in the MCAP.
exclude_topics: If present, list of topics to not ingest from the MCAP.
file_name: If present, name to use when uploading file. Otherwise, defaults to dataset name.

add_tabular_data ¤

add_tabular_data(
    path: Path | str,
    timestamp_column: str,
    timestamp_type: _AnyTimestampType,
    tag_columns: Mapping[str, str] | None = None,
    tags: Mapping[str, str] | None = None,
) -> None

Append to a dataset from tabular data on-disk.

Currently, the supported filetypes are:

  • .csv / .csv.gz

  • .parquet / .parquet.gz

  • .parquet.tar / .parquet.tar.gz / .parquet.zip

Parameters:

  • path ¤

    (Path | str) –

    Path to the file on disk to add to the dataset.

  • timestamp_column ¤

    (str) –

    Column within the file containing timestamp information. NOTE: this is omitted as a channel from the data added to Nominal, and is instead used to set the timestamps for all other uploaded data channels.

  • timestamp_type ¤

    (_AnyTimestampType) –

    Type of timestamp data contained within the timestamp_column e.g. 'epoch_seconds'.

  • tag_columns ¤

    (Mapping[str, str] | None, default: None ) –

    a dictionary mapping tag keys to column names.

  • tags ¤

    (Mapping[str, str] | None, default: None ) –

    key-value pairs to apply as tags to all data uniformly in the file
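
For example, a sketch (the file path, column name, and tags are illustrative):

dataset.add_tabular_data(
    "flight_001.csv",
    timestamp_column="timestamp",
    timestamp_type="epoch_seconds",
    tags={"flight": "001"},  # applied uniformly to every row in the file
)
dataset.poll_until_ingestion_completed()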

archive ¤

archive() -> None

Archive this dataset. Archived datasets are not deleted, but are hidden from the UI.

get_channels ¤

get_channels(
    exact_match: Sequence[str] = (),
    fuzzy_search_text: str = "",
    *,
    names: Iterable[str] | None = None
) -> Iterable[Channel]

Look up the metadata for all matching channels associated with this dataset.


exact_match: Filter the returned channels to those whose names match all provided strings
    (case insensitive).
    For example, a channel named 'engine_turbine_rpm' would match against ['engine', 'turbine', 'rpm'],
    whereas a channel named 'engine_turbine_flowrate' would not!
fuzzy_search_text: Filters the returned channels to those whose names fuzzily match the provided string.
names: List of channel names to look up metadata for. This parameter is preferred over
    exact_match and fuzzy_search_text, which are deprecated.

Yields a sequence of channel metadata objects which match the provided query parameters

get_write_stream ¤

get_write_stream(
    batch_size: int = 50000,
    max_wait: timedelta = timedelta(seconds=1),
    data_format: Literal["json", "protobuf", "experimental"] = "json",
) -> WriteStreamBase

Stream to write messages to a datasource.

Messages are written asynchronously.


batch_size: How big the batch can get before writing to Nominal.
max_wait: How long a batch can exist before being flushed to Nominal.
data_format: Serialized data format to use during upload.
    NOTE: selecting 'protobuf' or 'experimental' requires that `nominal` was installed
          with `protos` extras.

Write stream object configured to send data to nominal. This may be used as a context manager
(so that resources are automatically released upon exiting the context), or if not used as a context
manager, should be explicitly `close()`-ed once no longer needed.

list_files ¤

list_files(*, successful_only: bool = True) -> Iterable[DatasetFile]

List files ingested to this dataset.

If successful_only, yields files with a 'success' ingest status only.

poll_until_ingestion_completed ¤

poll_until_ingestion_completed(
    interval: timedelta = timedelta(seconds=1),
) -> Self

Block until dataset file ingestion has completed. This method polls Nominal for ingest status after uploading a file to a dataset on an interval.


Raises:

NominalIngestFailed: if the ingest failed
NominalIngestError: if the ingest status is not known

search_channels ¤

search_channels(
    exact_match: Sequence[str] = (), fuzzy_search_text: str = ""
) -> Iterable[Channel]

Look up channels associated with a datasource.

Parameters:

  • exact_match ¤

    (Sequence[str], default: () ) –

    Filter the returned channels to those whose names match all provided strings (case insensitive).

  • fuzzy_search_text ¤

    (str, default: '' ) –

    Filters the returned channels to those whose names fuzzily match the provided string.

Yields:

  • Channel

    Channel metadata objects which match the provided query parameters.

set_channel_prefix_tree ¤

set_channel_prefix_tree(delimiter: str = '.') -> None

Index channels hierarchically by a given delimiter.

Primarily, the result of this operation is to prompt the frontend to represent channels in a tree-like manner that allows folding channels by common roots.

set_channel_units ¤

set_channel_units(
    channels_to_units: UnitMapping,
    validate_schema: bool = False,
    allow_display_only_units: bool = False,
) -> None

Set units for channels based on a provided mapping of channel names to units.


channels_to_units: A mapping of channel names to unit symbols.
    NOTE: any existing units may be cleared from a channel by providing None as a symbol.
validate_schema: If true, raises a ValueError if non-existent channel names are provided in
    `channels_to_units`. Default is False.
allow_display_only_units: If true, allow units that would be treated as display-only by Nominal.

Raises:

ValueError: Unsupported unit symbol provided.
conjure_python_client.ConjureHTTPError: Error completing requests.

unarchive ¤

unarchive() -> None

Unarchives this dataset, allowing it to show up in the 'All Datasets' pane in the UI.

update ¤

update(
    *,
    name: str | None = None,
    description: str | None = None,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] | None = None
) -> Self

Replace dataset metadata. Updates the current instance, and returns it.

Only the metadata passed in will be replaced, the rest will remain untouched.

Note: This replaces the metadata rather than appending it. To append to labels or properties, merge them before calling this method. E.g.:

new_labels = ["new-label-a", "new-label-b"]
for old_label in dataset.labels:
    new_labels.append(old_label)
dataset = dataset.update(labels=new_labels)

write_logs ¤

write_logs(
    logs: Iterable[LogPoint],
    channel_name: str = "logs",
    batch_size: int = 1000,
) -> None

Stream logs to the datasource.

This method executes synchronously, i.e. it blocks until all logs are sent to the API. Logs are sent in batches. The logs can be any iterable of LogPoints, including a generator.

Parameters:

  • logs ¤

    (Iterable[LogPoint]) –

    LogPoints to stream to Nominal.

  • channel_name ¤

    (str, default: 'logs' ) –

    Name of the channel to stream logs to.

  • batch_size ¤

    (int, default: 1000 ) –

    Number of logs to send to the API at a time.

Example
from typing import Iterable

from nominal.core import LogPoint

def parse_logs_from_file(file_path: str) -> Iterable[LogPoint]:
    # 2025-04-08T14:26:28.679052Z [INFO] Sent ACTUATE_MOTOR command
    with open(file_path, "r") as f:
        for line in f:
            timestamp, message = line.removesuffix("\n").split(maxsplit=1)
            yield LogPoint.create(timestamp, message)

dataset = client.get_dataset("dataset_rid")
logs = parse_logs_from_file("logs.txt")
dataset.write_logs(logs)

Event dataclass ¤

Event(
    uuid: str,
    asset_rids: Sequence[str],
    name: str,
    start: IntegralNanosecondsUTC,
    duration: IntegralNanosecondsDuration,
    properties: Mapping[str, str],
    type: EventType,
    created_by_rid: str | None,
    _clients: _Clients,
)

update ¤

update(
    *,
    name: str | None = None,
    assets: Iterable[Asset | str] | None = None,
    start: datetime | IntegralNanosecondsUTC | None = None,
    duration: timedelta | IntegralNanosecondsDuration | None = None,
    properties: Mapping[str, str] | None = None,
    labels: Iterable[str] | None = None,
    type: EventType | None
) -> Self

Replace event metadata. Updates the current instance, and returns it. Only the metadata passed in will be replaced, the rest will remain untouched.

Note: This replaces the metadata rather than appending it. To append to labels or properties, merge them before calling this method. E.g.:

new_labels = ["new-label-a", "new-label-b"]
for old_label in event.labels:
    new_labels.append(old_label)
event = event.update(labels=new_labels)

Log dataclass ¤

Log(timestamp: IntegralNanosecondsUTC, body: str)

A single log in a LogSet. LogSets are deprecated.

LogPoint dataclass ¤

LogPoint(
    timestamp: IntegralNanosecondsUTC,
    message: str,
    args: Mapping[str, str],
)

LogPoint is a single, timestamped log entry.

LogPoints are added to a Dataset using Dataset.write_logs.

LogSet dataclass ¤

LogSet(
    rid: str,
    name: str,
    timestamp_type: LogTimestampType,
    description: str | None,
    _clients: _Clients,
)

Bases: HasRid

LogSet is a collection of logs. LogSets are deprecated.

stream_logs ¤

stream_logs() -> Iterable[Log]

Iterate over the logs.

NominalClient dataclass ¤

NominalClient(_clients: ClientsBunch, _profile: str | None = None)

create classmethod ¤

create(
    base_url: str,
    token: str | None,
    trust_store_path: str | None = None,
    connect_timeout: timedelta | float = DEFAULT_CONNECT_TIMEOUT,
    *,
    workspace_rid: str | None = None
) -> Self

Create a connection to the Nominal platform.

base_url: The URL of the Nominal API platform, e.g. "https://api.gov.nominal.io/api".
token: An API token to authenticate with. If None, the token will be looked up in ~/.nominal.yml.
trust_store_path: path to a trust store CA root file to initiate SSL connections.
    If not provided, certifi's trust store is used.
connect_timeout: Timeout for any single request to the Nominal API.
workspace_rid: The workspace RID to use for all API calls that require it. If not provided,
    the default workspace will be used (if one is configured for the tenant).
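
A minimal sketch (the token is a placeholder; the base URL is the example from above):

from nominal.core import NominalClient

client = NominalClient.create(
    base_url="https://api.gov.nominal.io/api",
    token="your-api-token",  # or None to read the token from ~/.nominal.yml
)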

create_asset ¤

create_asset(
    name: str,
    description: str | None = None,
    *,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] = ()
) -> Asset

Create an asset.

create_attachment_from_io ¤

create_attachment_from_io(
    attachment: BinaryIO,
    name: str,
    file_type: tuple[str, str] | FileType = BINARY,
    description: str | None = None,
    *,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] = ()
) -> Attachment

Upload an attachment. The attachment must be a file-like object in binary mode, e.g. open(path, "rb") or io.BytesIO. If the file is not in binary-mode, the requests library blocks indefinitely.

create_dataset ¤

create_dataset(
    name: str,
    *,
    description: str | None = None,
    labels: Sequence[str] = (),
    properties: Mapping[str, str] | None = None,
    prefix_tree_delimiter: str | None = None
) -> Dataset

Create an empty dataset.

Parameters:

  • name ¤

    (str) –

    Name of the dataset to create in Nominal.

  • description ¤

    (str | None, default: None ) –

    Human readable description of the dataset.

  • labels ¤

    (Sequence[str], default: () ) –

    Text labels to apply to the created dataset

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

Key-value properties to apply to the created dataset

  • prefix_tree_delimiter ¤

    (str | None, default: None ) –

    If present, the delimiter to represent tiers when viewing channels hierarchically.

Returns:

  • Dataset

    Reference to the created dataset in Nominal.

create_empty_video ¤

create_empty_video(
    name: str,
    *,
    description: str | None = None,
    labels: Sequence[str] = (),
    properties: Mapping[str, str] | None = None
) -> Video

Create an empty video to append video files to.

Parameters:

  • name ¤

    (str) –

    Name of the video to create in Nominal

  • description ¤

    (str | None, default: None ) –

    Description of the video to create in nominal

  • labels ¤

    (Sequence[str], default: () ) –

    Labels to apply to the video in nominal

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

    Properties to apply to the video in nominal

Returns:

  • Video

    Handle to the created video

create_log_set ¤

create_log_set(
    name: str,
    logs: (
        Iterable[Log]
        | Iterable[tuple[datetime | IntegralNanosecondsUTC, str]]
    ),
    timestamp_type: LogTimestampType = "absolute",
    description: str | None = None,
) -> LogSet

Create an immutable log set with the given logs.

The logs are attached during creation and cannot be modified afterwards. Logs can either be of type Log or a tuple of a timestamp and a string. Timestamp type must be either 'absolute' or 'relative'.

create_run ¤

create_run(
    name: str,
    start: datetime | IntegralNanosecondsUTC,
    end: datetime | IntegralNanosecondsUTC | None,
    description: str | None = None,
    *,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] = (),
    attachments: Iterable[Attachment] | Iterable[str] = (),
    asset: Asset | str | None = None
) -> Run

Create a run.
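
A minimal sketch (the name, times, and labels are illustrative):

from datetime import datetime, timezone

run = client.create_run(
    "flight-test-001",
    start=datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc),
    end=datetime(2024, 1, 1, 13, 0, tzinfo=timezone.utc),
    labels=["flight-test"],
)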

create_secret ¤

create_secret(
    name: str,
    decrypted_value: str,
    description: str | None = None,
    labels: Sequence[str] = (),
    properties: Mapping[str, str] | None = None,
) -> Secret

Create a secret for the current user

Parameters:

  • name ¤

    (str) –

    Name of the secret

  • decrypted_value ¤

    (str) –

    Plain text value of the secret

  • description ¤

    (str | None, default: None ) –

    Description of the secret

  • labels ¤

    (Sequence[str], default: () ) –

    Labels for the secret

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

    Properties for the secret

create_video_from_mcap ¤

create_video_from_mcap(
    path: Path | str,
    topic: str,
    name: str | None = None,
    description: str | None = None,
    *,
    labels: Sequence[str] = (),
    properties: Mapping[str, str] | None = None
) -> Video

Create a video from an MCAP file containing H264 or H265 video data.

If name is None, the name of the file will be used.

See create_video_from_mcap_io for more details.

create_video_from_mcap_io ¤

create_video_from_mcap_io(
    mcap: BinaryIO,
    topic: str,
    name: str,
    description: str | None = None,
    file_type: tuple[str, str] | FileType = MCAP,
    *,
    labels: Sequence[str] = (),
    properties: Mapping[str, str] | None = None,
    file_name: str | None = None
) -> Video

Create video from topic in a mcap file.

Mcap must be a file-like object in binary mode, e.g. open(path, "rb") or io.BytesIO.

If name is None, the name of the file will be used.

from_profile classmethod ¤

from_profile(
    profile: str,
    *,
    trust_store_path: str | None = None,
    connect_timeout: timedelta | float = DEFAULT_CONNECT_TIMEOUT
) -> Self

Create a connection to the Nominal platform from a named profile in the Nominal config.

Parameters:

  • profile ¤

    (str) –

    profile name in the Nominal config.

  • trust_store_path ¤

    (str | None, default: None ) –

    path to a trust store certificate chain to initiate SSL connections. If not provided, certifi's trust store is used.

  • connect_timeout ¤

    (timedelta | float, default: DEFAULT_CONNECT_TIMEOUT ) –

    Request connection timeout.

from_token classmethod ¤

from_token(
    token: str,
    base_url: str = DEFAULT_API_BASE_URL,
    *,
    workspace_rid: str | None = None,
    trust_store_path: str | None = None,
    connect_timeout: timedelta | float = DEFAULT_CONNECT_TIMEOUT,
    _profile: str | None = None
) -> Self

Create a connection to the Nominal platform from a token.

Parameters:

  • token ¤

    (str) –

    Authentication token to use for the connection.

  • base_url ¤

    (str, default: DEFAULT_API_BASE_URL ) –

    The URL of the Nominal API platform.

  • workspace_rid ¤

    (str | None, default: None ) –

    The workspace RID to use for all API calls that require it. If not provided, the default workspace will be used (if one is configured for the tenant).

  • trust_store_path ¤

    (str | None, default: None ) –

    path to a trust store certificate chain to initiate SSL connections. If not provided, certifi's trust store is used.

  • connect_timeout ¤

    (timedelta | float, default: DEFAULT_CONNECT_TIMEOUT ) –

    Request connection timeout.

get_all_units ¤

get_all_units() -> Sequence[Unit]

Retrieve list of metadata for all supported units within Nominal

get_asset ¤

get_asset(rid: str) -> Asset

Retrieve an asset by its RID.

get_attachment ¤

get_attachment(rid: str) -> Attachment

Retrieve an attachment by its RID.

get_attachments ¤

get_attachments(rids: Iterable[str]) -> Sequence[Attachment]

Retrieve attachments by their RIDs.

get_channel ¤

get_channel(rid: str) -> Channel

Get metadata for a given channel by looking up its rid.

rid: Identifier for the channel to look up

Returns:

Resolved metadata for the requested channel

Raises:

conjure_python_client.ConjureHTTPError: An error occurred while looking up the channel.
    This typically occurs when there is no such channel for the given RID.

get_commensurable_units ¤

get_commensurable_units(unit_symbol: str) -> Sequence[Unit]

Get the list of units that are commensurable (convertible to/from) the given unit symbol.

get_connection ¤

get_connection(rid: str) -> Connection

Retrieve a connection by its RID.

get_dataset ¤

get_dataset(rid: str) -> Dataset

Retrieve a dataset by its RID.

get_datasets ¤

get_datasets(rids: Iterable[str]) -> Sequence[Dataset]

Retrieve datasets by their RIDs.

get_log_set ¤

get_log_set(log_set_rid: str) -> LogSet

Retrieve a log set along with its metadata given its RID.

get_run ¤

get_run(rid: str) -> Run

Retrieve a run by its RID.

get_secret ¤

get_secret(rid: str) -> Secret

Retrieve a secret by RID.

get_unit ¤

get_unit(unit_symbol: str) -> Unit | None

Get details of the given unit symbol, or None if the symbol is not recognized by Nominal.

Parameters:

  • unit_symbol ¤

    (str) –

    Symbol of the unit to get metadata for. NOTE: This currently requires that units are formatted as laid out in the latest UCUM standards (see https://ucum.org/ucum)


Returns:

Resolved unit metadata if the symbol is valid and supported by Nominal, or None
if no such unit symbol matches.

get_user ¤

get_user(user_rid: str | None = None) -> User

Retrieve the specified user.

Parameters:

  • user_rid ¤

    (str | None, default: None ) –

    Rid of the user to retrieve

Returns:

  • User

    Details on the requested user, or the current user if no user rid is provided.

get_video ¤

get_video(rid: str) -> Video

Retrieve a video by its RID.

get_videos ¤

get_videos(rids: Iterable[str]) -> Sequence[Video]

Retrieve videos by their RIDs.

get_workspace ¤

get_workspace(workspace_rid: str | None = None) -> Workspace

Get workspace via given RID, or the default workspace if no RID is provided.

Parameters:

  • workspace_rid ¤

    (str | None, default: None ) –

    If provided, the RID of the workspace to retrieve. If None, retrieves the default workspace.

Returns:

  • Workspace

    Returns details about the requested workspace.

Raises:

  • RuntimeError

    Raises a RuntimeError if a workspace is not provided, but there is no configured default workspace for the current user.

list_streaming_checklists ¤

list_streaming_checklists(
    asset: Asset | str | None = None,
) -> Iterable[str]

List all Streaming Checklists.

Parameters:

  • asset ¤

    (Asset | str | None, default: None ) –

    if provided, only return checklists associated with the given asset.

list_workspaces ¤

list_workspaces() -> Sequence[Workspace]

Return all workspaces visible to the current user

search_assets ¤

search_assets(
    search_text: str | None = None,
    *,
    labels: Sequence[str] | None = None,
    properties: Mapping[str, str] | None = None
) -> Sequence[Asset]

Search for assets meeting the specified filters. Filters are ANDed together, e.g. (asset.label == label) AND (asset.search_text =~ field)

Parameters:

  • search_text ¤

    (str | None, default: None ) –

    case-insensitive search for any of the keywords in all string fields

  • labels ¤

    (Sequence[str] | None, default: None ) –

A sequence of labels that must ALL be present on an asset to be included.

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

A mapping of key-value pairs that must ALL be present on an asset to be included.

Returns:

  • Sequence[Asset]

    All assets which match all of the provided conditions
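
For example, a sketch (the search text, label, and property are hypothetical):

assets = client.search_assets(
    "engine",                        # matched against all string fields
    labels=["flight-test"],          # every listed label must be present
    properties={"site": "edwards"},  # every key-value pair must be present
)
for asset in assets:
    print(asset.rid, asset.name)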

search_checklists ¤

search_checklists(
    search_text: str | None = None,
    labels: Sequence[str] | None = None,
    properties: Mapping[str, str] | None = None,
) -> Sequence[Checklist]

Search for checklists meeting the specified filters. Filters are ANDed together, e.g. (checklist.label == label) AND (checklist.search_text =~ field)

Parameters:

  • search_text ¤

    (str | None, default: None ) –

    case-insensitive search for any of the keywords in all string fields

  • labels ¤

    (Sequence[str] | None, default: None ) –

    A sequence of labels that must ALL be present on a checklist to be included.

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

    A mapping of key-value pairs that must ALL be present on a checklist to be included.

Returns:

  • Sequence[Checklist]

    All checklists which match all of the provided conditions

search_data_reviews ¤

search_data_reviews(
    assets: Sequence[Asset | str] | None = None,
    runs: Sequence[Run | str] | None = None,
) -> Sequence[DataReview]

Search for any data reviews present within a collection of runs and assets.

search_events ¤

search_events(
    *,
    search_text: str | None = None,
    after: datetime | IntegralNanosecondsUTC | None = None,
    before: datetime | IntegralNanosecondsUTC | None = None,
    assets: Iterable[Asset | str] | None = None,
    labels: Iterable[str] | None = None,
    properties: Mapping[str, str] | None = None,
    created_by: User | str | None = None
) -> Sequence[Event]

Search for events meeting the specified filters. Filters are ANDed together, e.g. (event.label == label) AND (event.start > before)

Parameters:

  • search_text ¤

    (str | None, default: None ) –

    Searches for a string in the event's metadata.

  • after ¤

    (datetime | IntegralNanosecondsUTC | None, default: None ) –

    Filters to end times after this time, exclusive.

  • before ¤

    (datetime | IntegralNanosecondsUTC | None, default: None ) –

    Filters to start times before this time, exclusive.

  • assets ¤

    (Iterable[Asset | str] | None, default: None ) –

    List of assets that must ALL be present on an event to be included.

  • labels ¤

    (Iterable[str] | None, default: None ) –

    A list of labels that must ALL be present on an event to be included.

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

    A mapping of key-value pairs that must ALL be present on an event to be included.

  • created_by ¤

    (User | str | None, default: None ) –

    A User (or rid) of the author that must be present on an event to be included.

Returns:

  • Sequence[Event]

    All events which match all of the provided conditions

search_runs ¤

search_runs(
    start: str | datetime | IntegralNanosecondsUTC | None = None,
    end: str | datetime | IntegralNanosecondsUTC | None = None,
    name_substring: str | None = None,
    *,
    labels: Sequence[str] | None = None,
    properties: Mapping[str, str] | None = None
) -> Sequence[Run]

Search for runs meeting the specified filters. Filters are ANDed together, e.g. (run.label == label) AND (run.end <= end)

Parameters:

  • start ¤

    (str | datetime | IntegralNanosecondsUTC | None, default: None ) –

    Inclusive start time for filtering runs.

  • end ¤

    (str | datetime | IntegralNanosecondsUTC | None, default: None ) –

    Inclusive end time for filtering runs.

  • name_substring ¤

    (str | None, default: None ) –

    Searches for a (case-insensitive) substring in the name.

  • labels ¤

    (Sequence[str] | None, default: None ) –

    A sequence of labels that must ALL be present on a run to be included.

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

    A mapping of key-value pairs that must ALL be present on a run to be included.

Returns:

  • Sequence[Run]

    All runs which match all of the provided conditions

search_secrets ¤

search_secrets(
    search_text: str | None = None,
    labels: Sequence[str] | None = None,
    properties: Mapping[str, str] | None = None,
) -> Sequence[Secret]

Search for secrets meeting the specified filters. Filters are ANDed together, e.g. (secret.label == label) AND (secret.property == property)

Parameters:

  • search_text ¤

    (str | None, default: None ) –

    Searches for a (case-insensitive) substring across all text fields.

  • labels ¤

    (Sequence[str] | None, default: None ) –

    A sequence of labels that must ALL be present on a secret to be included.

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

    A mapping of key-value pairs that must ALL be present on a secret to be included.

Returns:

  • Sequence[Secret]

    All secrets which match all of the provided conditions

search_users ¤

search_users(
    exact_match: str | None = None, search_text: str | None = None
) -> Sequence[User]

Search for users meeting the specified filters. Filters are ANDed together, e.g., if exact_match and search_text are both provided, then both must match.

Parameters:

  • exact_match ¤

    (str | None, default: None ) –

    Searches for an exact substring across display name and email

  • search_text ¤

    (str | None, default: None ) –

    Searches for a (case-insensitive) substring across display name and email

Returns:

  • Sequence[User]

    All users which match all of the provided conditions

set_channel_units ¤

set_channel_units(rids_to_types: UnitMapping) -> Iterable[Channel]

Sets the units for a set of channels based on user-provided unit symbols.

rids_to_types: Mapping of channel RIDs -> unit symbols (e.g. 'm/s').
    NOTE: Providing None as the unit symbol clears any existing units for the channels.


Returns:

A sequence of metadata for all updated channels.

Raises:

conjure_python_client.ConjureHTTPError: An error occurred while setting metadata on the channel.
    This typically occurs when either the units are invalid, or there are no channels with the
    given RIDs present.

Run dataclass ¤

Run(
    rid: str,
    name: str,
    description: str,
    properties: Mapping[str, str],
    labels: Sequence[str],
    start: IntegralNanosecondsUTC,
    end: IntegralNanosecondsUTC | None,
    run_number: int,
    assets: Sequence[str],
    _clients: _Clients,
)

Bases: HasRid

nominal_url property ¤

nominal_url: str

Returns a link to the page for this Run in the Nominal app

add_attachments ¤

add_attachments(
    attachments: Iterable[Attachment] | Iterable[str],
) -> None

Add attachments that have already been uploaded to this run.

attachments can be Attachment instances, or attachment RIDs.

add_connection ¤

add_connection(
    ref_name: str,
    connection: Connection | str,
    *,
    series_tags: Mapping[str, str] | None = None,
    offset: timedelta | IntegralNanosecondsDuration | None = None
) -> None

Add a connection to this run.

Ref_name maps "ref name" (the name within the run) to a Connection (or connection rid). The same type of connection should use the same ref name across runs, since checklists and templates use ref names to reference connections.

Parameters:

  • ref_name ¤

    (str) –

    Logical name for the connection to add to the run

  • connection ¤

    (Connection | str) –

    Connection to add to the run

  • series_tags ¤

    (Mapping[str, str] | None, default: None ) –

    Key-value tags to pre-filter the connection with before adding to the run.

  • offset ¤

    (timedelta | IntegralNanosecondsDuration | None, default: None ) –

    Add the connection to the run with a pre-baked offset

add_dataset ¤

add_dataset(
    ref_name: str,
    dataset: Dataset | str,
    *,
    series_tags: Mapping[str, str] | None = None,
    offset: timedelta | IntegralNanosecondsDuration | None = None
) -> None

Add a dataset to this run.

Datasets map "ref names" (their name within the run) to a Dataset (or dataset rid). The same type of datasets should use the same ref name across runs, since checklists and templates use ref names to reference datasets.

Parameters:

  • ref_name ¤

    (str) –

    Logical name for the data scope within the run

  • dataset ¤

    (Dataset | str) –

    Dataset to add to the run

  • series_tags ¤

    (Mapping[str, str] | None, default: None ) –

    Key-value tags to pre-filter the dataset with before adding to the run.

  • offset ¤

    (timedelta | IntegralNanosecondsDuration | None, default: None ) –

    Add the dataset to the run with a pre-baked offset
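
A minimal sketch (the ref name and offset are illustrative; dataset is an existing Dataset handle or RID):

from datetime import timedelta

run.add_dataset(
    "telemetry",
    dataset,
    offset=timedelta(seconds=5),  # apply a fixed time offset within the run
)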

add_datasets ¤

add_datasets(
    datasets: Mapping[str, Dataset | str],
    *,
    series_tags: Mapping[str, str] | None = None,
    offset: timedelta | IntegralNanosecondsDuration | None = None
) -> None

Add multiple datasets to this run.

Datasets map "ref names" (their name within the run) to a Dataset (or dataset rid). The same type of datasets should use the same ref name across runs, since checklists and templates use ref names to reference datasets.

Parameters:

  • datasets ¤

    (Mapping[str, Dataset | str]) –

    Mapping of ref names to the datasets (or dataset RIDs) to add to the run

  • series_tags ¤

    (Mapping[str, str] | None, default: None ) –

    Key-value tags to pre-filter the datasets with before adding to the run.

  • offset ¤

    (timedelta | IntegralNanosecondsDuration | None, default: None ) –

    Add the datasets to the run with a pre-baked offset

add_log_set ¤

add_log_set(ref_name: str, log_set: LogSet | str) -> None

Add a log set to this run.

Log sets map "ref names" (their name within the run) to a Log set (or log set rid).

add_log_sets ¤

add_log_sets(log_sets: Mapping[str, LogSet | str]) -> None

Add multiple log sets to this run.

Log sets map "ref names" (their name within the run) to a Log set (or log set rid).

add_video ¤

add_video(ref_name: str, video: Video | str) -> None

Add a video to a run via video object or RID.

archive ¤

archive() -> None

Archive this run. Archived runs are not deleted, but are hidden from the UI.

NOTE: it is not currently possible to unarchive a run once archived.

list_assets ¤

list_assets() -> Sequence[Asset]

List assets associated with this run.

list_attachments ¤

list_attachments() -> Sequence[Attachment]

List a sequence of Attachments associated with this Run.

list_connections ¤

list_connections() -> Sequence[tuple[str, Connection]]

List the connections associated with this run. Returns (ref_name, connection) pairs for each connection

list_datasets ¤

list_datasets() -> Sequence[tuple[str, Dataset]]

List the datasets associated with this run. Returns (ref_name, dataset) pairs for each dataset.

list_log_sets ¤

list_log_sets() -> Sequence[tuple[str, LogSet]]

List the log sets associated with this run. Returns (ref_name, log_set) pairs for each log set.

list_videos ¤

list_videos() -> Sequence[tuple[str, Video]]

List the videos associated with this run. Returns (ref_name, video) pairs for each video.

remove_attachments ¤

remove_attachments(
    attachments: Iterable[Attachment] | Iterable[str],
) -> None

Remove attachments from this run. Does not remove the attachments from Nominal.

attachments can be Attachment instances, or attachment RIDs.

remove_data_sources ¤

remove_data_sources(
    *,
    ref_names: Sequence[str] | None = None,
    data_sources: (
        Sequence[Connection | Dataset | Video | str] | None
    ) = None
) -> None

Remove data sources from this run.

The data_sources list can contain Connection, Dataset, or Video instances, or RIDs as strings.

update ¤

update(
    *,
    name: str | None = None,
    start: datetime | IntegralNanosecondsUTC | None = None,
    end: datetime | IntegralNanosecondsUTC | None = None,
    description: str | None = None,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] | None = None,
    links: Sequence[str] | Sequence[Link] | None = None
) -> Self

Replace run metadata. Updates the current instance, and returns it. Only the metadata passed in will be replaced, the rest will remain untouched.

Links can be URLs or tuples of (URL, name).

Note: This replaces the metadata rather than appending it. To append to labels or properties, merge them before calling this method. E.g.:

new_labels = ["new-label-a", "new-label-b"]
for old_label in run.labels:
    new_labels.append(old_label)
run = run.update(labels=new_labels)

Secret dataclass ¤

Secret(
    rid: str,
    name: str,
    description: str,
    properties: Mapping[str, str],
    labels: Sequence[str],
    _clients: _Clients,
)

Bases: HasRid

archive ¤

archive() -> None

Archive the secret, hiding it from users.

delete ¤

delete() -> None

Permanently delete the secret, removing it from the database entirely.

unarchive ¤

unarchive() -> None

Unarchive the secret, allowing it to appear to users.

update ¤

update(
    *,
    name: str | None = None,
    description: str | None = None,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] | None = None
) -> Self

Update the secret in-place.

Parameters:

  • name ¤

    (str | None, default: None ) –

    New name of the secret

  • description ¤

    (str | None, default: None ) –

New description of the secret

  • properties ¤

    (Mapping[str, str] | None, default: None ) –

    New properties for the secret

  • labels ¤

    (Sequence[str] | None, default: None ) –

    New labels for the secret

Returns:

  • Self

    Updated secret metadata.

Video dataclass ¤

Video(
    rid: str,
    name: str,
    description: str | None,
    properties: Mapping[str, str],
    labels: Sequence[str],
    _clients: _Clients,
)

Bases: HasRid

add_file ¤

add_file(
    path: Path | str,
    start: datetime | IntegralNanosecondsUTC | None = None,
    frame_timestamps: Sequence[IntegralNanosecondsUTC] | None = None,
    description: str | None = None,
) -> VideoFile

Append to a video from a file-path to H264-encoded video data.

Parameters:

  • path ¤

    (Path | str) –

    Path to the video file to add to an existing video within Nominal

  • start ¤

    (datetime | IntegralNanosecondsUTC | None, default: None ) –

    Starting timestamp of the video file in absolute UTC time

  • frame_timestamps ¤

    (Sequence[IntegralNanosecondsUTC] | None, default: None ) –

Per-frame absolute nanosecond timestamps. Most use cases should instead use the 'start' parameter, unless precise per-frame metadata is available and desired.

  • description ¤

    (str | None, default: None ) –

    Description of the video file. NOTE: this is currently not displayed to users and may be removed in the future.

Returns:

  • VideoFile

    Reference to the created video file.

add_from_io ¤

add_from_io(
    video: BinaryIO,
    name: str,
    start: datetime | IntegralNanosecondsUTC | None = None,
    frame_timestamps: Sequence[IntegralNanosecondsUTC] | None = None,
    description: str | None = None,
    file_type: tuple[str, str] | FileType = MP4,
) -> VideoFile

Append to a video from a file-like object containing video data encoded in H264 or H265.

Parameters:

  • video ¤

    (BinaryIO) –

    File-like object containing video data encoded in H264 or H265.

  • name ¤

    (str) –

    Name of the file to use when uploading to S3.

  • start ¤

    (datetime | IntegralNanosecondsUTC | None, default: None ) –

    Starting timestamp of the video file in absolute UTC time

  • frame_timestamps ¤

    (Sequence[IntegralNanosecondsUTC] | None, default: None ) –

Per-frame absolute nanosecond timestamps. Most use cases should instead use the 'start' parameter, unless precise per-frame metadata is available and desired.

  • description ¤

    (str | None, default: None ) –

    Description of the video file. NOTE: this is currently not displayed to users and may be removed in the future.

  • file_type ¤

    (tuple[str, str] | FileType, default: MP4 ) –

    Metadata about the type of video file, e.g., MP4 vs. MKV.

Returns:

  • VideoFile

    Reference to the created video file.

add_mcap ¤

add_mcap(
    path: Path, topic: str, description: str | None = None
) -> VideoFile

Append to a video from a file-path to an MCAP file containing video data.

Parameters:

  • path ¤

    (Path) –

    Path to the video file to add to an existing video within Nominal

  • topic ¤

    (str) –

    Topic pointing to video data within the MCAP file.

  • description ¤

    (str | None, default: None ) –

    Description of the video file. NOTE: this is currently not displayed to users and may be removed in the future.

Returns:

  • VideoFile

    Reference to the created video file.

add_mcap_from_io ¤

add_mcap_from_io(
    mcap: BinaryIO,
    name: str,
    topic: str,
    description: str | None = None,
    file_type: tuple[str, str] | FileType = MCAP,
) -> VideoFile

Append to a video from a binary file-like stream of MCAP data containing video.

Parameters:

  • mcap ¤

    (BinaryIO) –

    File-like binary object containing MCAP data to upload.

  • name ¤

    (str) –

    Name of the file to create in S3 during upload

  • topic ¤

    (str) –

    Topic pointing to video data within the MCAP file.

  • description ¤

    (str | None, default: None ) –

    Description of the video file. NOTE: this is currently not displayed to users and may be removed in the future.

  • file_type ¤

    (tuple[str, str] | FileType, default: MCAP ) –

    Metadata about the type of video (e.g. MCAP).

Returns:

  • VideoFile

    Reference to the created video file.

archive ¤

archive() -> None

Archive this video. Archived videos are not deleted, but are hidden from the UI.

list_files ¤

list_files() -> Sequence[VideoFile]

List all video files associated with the video.

poll_until_ingestion_completed ¤

poll_until_ingestion_completed(
    interval: timedelta = timedelta(seconds=1),
) -> None

Block until video ingestion has completed. This method polls Nominal for ingest status after uploading a video on an interval.


Raises:

NominalIngestFailed: if the ingest failed
NominalIngestError: if the ingest status is not known

unarchive ¤

unarchive() -> None

Unarchives this video, allowing it to show up in the 'All Videos' pane in the UI.

update ¤

update(
    *,
    name: str | None = None,
    description: str | None = None,
    properties: Mapping[str, str] | None = None,
    labels: Sequence[str] | None = None
) -> Self

Replace video metadata. Updates the current instance, and returns it.

Only the metadata passed in will be replaced, the rest will remain untouched.

Note: This replaces the metadata rather than appending it. To append to labels or properties, merge them before calling this method. E.g.:

new_labels = ["new-label-a", "new-label-b"]
for old_label in video.labels:
    new_labels.append(old_label)
video = video.update(labels=new_labels)

Workbook dataclass ¤

Workbook(
    rid: str,
    title: str,
    description: str,
    run_rid: str | None,
    _clients: _Clients,
)

Bases: HasRid

nominal_url property ¤

nominal_url: str

Returns a link to the page for this Workbook in the Nominal app

archive ¤

archive() -> None

Archive this workbook. Archived workbooks are not deleted, but are hidden from the UI.

unarchive ¤

unarchive() -> None

Unarchive this workbook, allowing it to be viewed in the UI.

WriteStream dataclass ¤

WriteStream(
    batch_size: int,
    max_wait: timedelta,
    _process_batch: Callable[[Sequence[BatchItem]], None],
    _executor: ThreadPoolExecutor,
    _thread_safe_batch: ThreadSafeBatch,
    _stop: Event,
    _pending_jobs: BoundedSemaphore,
)

Bases: WriteStreamBase

close ¤

close(wait: bool = True) -> None

Close the Nominal Stream.

Stops the batch timeout thread and flushes any remaining batches.

create classmethod ¤

create(
    batch_size: int,
    max_wait: timedelta,
    process_batch: Callable[[Sequence[BatchItem]], None],
) -> Self

Create the stream.

enqueue ¤

enqueue(
    channel_name: str,
    timestamp: str | datetime | IntegralNanosecondsUTC,
    value: float | str,
    tags: Mapping[str, str] | None = None,
) -> None

Add a message to the queue after normalizing the timestamp to IntegralNanosecondsUTC.

The message is added to the thread-safe batch and flushed if the batch size is reached.

enqueue_batch ¤

enqueue_batch(
    channel_name: str,
    timestamps: Sequence[str | datetime | IntegralNanosecondsUTC],
    values: Sequence[float | str],
    tags: Mapping[str, str] | None = None,
) -> None

Add a sequence of messages to the queue to upload to Nominal.

Messages are added one-by-one (with timestamp normalization) and flushed based on the batch conditions.

Parameters:

  • channel_name ¤

    (str) –

    Name of the channel to upload data for.

  • timestamps ¤

    (Sequence[str | datetime | IntegralNanosecondsUTC]) –

    Absolute timestamps of the data being uploaded.

  • values ¤

    (Sequence[float | str]) –

    Values to write to the specified channel.

  • tags ¤

    (Mapping[str, str] | None, default: None ) –

    Key-value tags associated with the data being uploaded. NOTE: This must include all required_tags used when creating a Connection to Nominal.

enqueue_from_dict ¤

enqueue_from_dict(
    timestamp: str | datetime | IntegralNanosecondsUTC,
    channel_values: Mapping[str, float | str],
    tags: Mapping[str, str] | None = None,
) -> None

Write multiple channel values at a given timestamp using a flattened dictionary.

Each key in the dictionary is treated as a channel name and the corresponding value is enqueued with the given timestamp.

Parameters:

  • timestamp ¤

    (str | datetime | IntegralNanosecondsUTC) –

    The shared timestamp to use for all items to enqueue.

  • channel_values ¤

    (Mapping[str, float | str]) –

    A dictionary mapping channel names to their respective values.

  • tags ¤

    (Mapping[str, str] | None, default: None ) –

    Key-value tags associated with the data being uploaded. NOTE: This should include all required_tags used when creating a Connection to Nominal.
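
A minimal sketch (the channel names, values, and tags are illustrative; the stream comes from get_write_stream on a dataset or connection):

with dataset.get_write_stream() as stream:
    stream.enqueue_from_dict(
        "2024-01-01T00:00:00Z",
        {"temperature": 42.0, "status": "nominal"},
        tags={"sensor": "a"},
    )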

flush ¤

flush(wait: bool = False, timeout: float | None = None) -> None

Flush the current batch of records to Nominal in a background thread.


wait: If true, wait for the batch to complete uploading before returning
timeout: If wait is true, the time to wait for flush completion in seconds.
         NOTE: If none, waits indefinitely.

poll_until_ingestion_completed ¤

poll_until_ingestion_completed(
    datasets: Iterable[Dataset],
    interval: timedelta = timedelta(seconds=1),
) -> None

Block until all dataset ingestions have completed (succeeded or failed).

This method polls Nominal for ingest status on each of the datasets on an interval. No specific ordering is guaranteed, but all datasets will be checked at least once.


Raises:

NominalIngestMultiError: if any of the datasets failed to ingest