write_governance_classifications

Public callable

Persist governance classifications to a metadata destination.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `spark` | `SparkSession` | Active Spark session used to build and write the metadata DataFrame. | required |
| `classifications` | `list[dict]` | Classification entries to persist. | required |
| `table_name` | `str` | Identifier of the destination metadata table. | required |
| `dataset_name` | `str` or `None` | Dataset the classifications belong to; recorded as `"unknown"` when omitted. | `None` |
| `source_table` | `str` or `None` | Table the classifications describe; falls back to `table_name`. | `None` |
| `run_id` | `str` or `None` | Identifier of the run that produced the classifications. | `None` |
| `status` | `str` | Lifecycle status stamped on each record. | `'suggested'` |
| `generated_by` | `str` | Origin stamped on each record. | `'framework'` |
| `mode` | `str` | Spark write mode used when saving to the destination table. | `'append'` |

Returns:

| Type | Description |
|---|---|
| `list[dict]` | The classification records that were written. |
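The returned records carry the stamped metadata fields. A plain-Python sketch of the fallback behavior (`stamp_records` is a hypothetical stand-in for the library's internal record builder, not its actual API):

```python
def stamp_records(classifications, table_name, dataset_name=None,
                  source_table=None, run_id=None,
                  status="suggested", generated_by="framework"):
    # Fallbacks mirror the documented defaults:
    # dataset -> "unknown", source -> the destination table name.
    dataset = dataset_name or "unknown"
    source = source_table or table_name
    return [
        {**entry,
         "dataset_name": dataset,
         "table_name": source,
         "run_id": run_id,
         "status": status,
         "generated_by": generated_by}
        for entry in classifications
    ]

records = stamp_records(
    [{"column": "email", "classification": "PII"}],
    table_name="governance.classification_metadata",
)
# records[0]["dataset_name"] is "unknown"; records[0]["status"] is "suggested"
```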

Source code in src/fabricops_kit/governance.py
def write_governance_classifications(
    spark,
    classifications: list[dict],
    table_name: str,
    dataset_name: str | None = None,
    source_table: str | None = None,
    run_id: str | None = None,
    status: str = "suggested",
    generated_by: str = "framework",
    mode: str = "append",
) -> list[dict]:
    """Persist governance classifications to a metadata destination.

        Parameters
        ----------
        spark : SparkSession
            Active Spark session used to build and write the metadata DataFrame.
        classifications : list[dict]
            Classification entries to persist.
        table_name : str
            Identifier of the destination metadata table.
        dataset_name : str, optional
            Dataset the classifications belong to; recorded as "unknown" when omitted.
        source_table : str, optional
            Table the classifications describe; falls back to ``table_name``.
        run_id : str, optional
            Identifier of the run that produced the classifications.
        status : str
            Lifecycle status stamped on each record.
        generated_by : str
            Origin stamped on each record.
        mode : str
            Spark write mode used when saving to the destination table.

        Returns
        -------
        list[dict]
            The classification records that were written.
    """
    dataset = dataset_name or "unknown"
    source = source_table or table_name
    records = build_governance_classification_records(classifications=classifications, dataset_name=dataset, table_name=source, run_id=run_id, status=status, generated_by=generated_by)

    def _spark_writer(rows, table_identifier, mode="append", **_):
        df = _spark_create_governance_metadata_dataframe(spark, rows)
        if df is None:
            return None
        df.write.mode(mode).saveAsTable(table_identifier)
        return df

    write_metadata_records(records=records, table_identifier=table_name, writer=_spark_writer, mode=mode)
    return records
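The final call above delegates persistence to `write_metadata_records`, which takes a writer callback. The injection pattern can be sketched in plain Python (the `_sketch` function and `fake_writer` below are hypothetical illustrations, not the library's actual implementations):

```python
def write_metadata_records_sketch(records, table_identifier, writer, mode="append"):
    # Hypothetical sketch: skip empty batches, otherwise hand the rows
    # to whatever writer callback the caller injected.
    if not records:
        return None
    return writer(records, table_identifier, mode=mode)

captured = {}

def fake_writer(rows, table_identifier, mode="append", **_):
    # Stand-in for the Spark writer: capture what would be written
    # instead of touching a real table.
    captured.update(rows=rows, table=table_identifier, mode=mode)
    return rows

out = write_metadata_records_sketch(
    [{"column": "email", "classification": "PII"}],
    "governance.classification_metadata",
    writer=fake_writer,
    mode="overwrite",
)
# captured["mode"] is "overwrite"; out echoes the input rows
```

Swapping the writer this way is what lets the same record-building logic target Spark tables in production and an in-memory fake in tests.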