
warehouse_write

Public callable

Write a Spark DataFrame to a Microsoft Fabric warehouse table.

This uses Fabric Spark's synapsesql connector to write to a warehouse configured in the framework CONFIG mapping. Use this near the end of the Product step when publishing serving tables.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `df` | `DataFrame` | Spark DataFrame to write. | *required* |
| `env` | `str` | Environment name in the config mapping, for example `"Sandbox"` or `"DE"`. | *required* |
| `target` | `str` | Warehouse target name under the selected environment, for example `"Warehouse"` or `"wh_Bronze"`. | *required* |
| `schema` | `str` | Warehouse schema name, for example `"dbo"`. | *required* |
| `table` | `str` | Warehouse table name. | *required* |
| `mode` | `str` | Spark write mode, for example `"append"` or `"overwrite"`. | `"append"` |
| `config` | `dict` | Config mapping from the config notebook. Expected shape: `config[environment][target] = Housepath(...)`. | `None` |
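The expected `config` shape can be sketched in plain Python. Note that `Housepath` below is a hypothetical stand-in for the framework's class (the real one comes from the config notebook), and the IDs are placeholders; the only assumption is that each entry carries a workspace ID, a warehouse ID, and the warehouse name used to build the three-part table path:

```python
from dataclasses import dataclass

# Hypothetical stand-in for the framework's Housepath class; the real one
# lives in the config notebook and carries the same three attributes that
# warehouse_write reads (workspace_id, house_id, house_name).
@dataclass
class Housepath:
    workspace_id: str
    house_id: str
    house_name: str

# Sketch of the expected CONFIG shape: environment -> target -> Housepath.
CONFIG = {
    "Sandbox": {
        "wh_Bronze": Housepath(
            workspace_id="00000000-0000-0000-0000-000000000001",
            house_id="00000000-0000-0000-0000-000000000002",
            house_name="wh_Bronze",
        ),
    },
}

p = CONFIG["Sandbox"]["wh_Bronze"]
print(p.house_name)  # warehouse name used in the "house.schema.table" path
```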

Returns:

| Type | Description |
| --- | --- |
| `None` | The DataFrame is written to the target warehouse table. |

Notes

Side effect: performs a write operation to the target warehouse object via Fabric runtime connector APIs.

Raises:

| Type | Description |
| --- | --- |
| `RuntimeError` | If the Microsoft Fabric Spark connector is unavailable. |
| `ValueError` | If the selected environment or target is missing from the config. |

Examples:

>>> warehouse_write(
...     df,
...     env="EDLH",
...     target="wh_Bronze",
...     schema="dbo",
...     table="Customer",
...     mode="append",
...     config=CONFIG,
... )
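The `ValueError` path can be exercised without a Fabric runtime. The sketch below is an assumption about how the config lookup behaves (the actual validation lives in `get_path`, whose source is not shown here); `resolve_target` and the sample mapping are hypothetical:

```python
def resolve_target(config, env, target):
    # Hypothetical helper mirroring the lookup warehouse_write depends on:
    # fail fast with a clear ValueError when the environment or target is
    # missing, instead of a bare KeyError deep inside the write.
    if config is None or env not in config:
        raise ValueError(
            f"Unknown environment {env!r}; expected one of {sorted(config or {})}"
        )
    if target not in config[env]:
        raise ValueError(
            f"Unknown target {target!r} under {env!r}; "
            f"expected one of {sorted(config[env])}"
        )
    return config[env][target]

# Sample mapping with a placeholder standing in for Housepath(...).
sample = {"Sandbox": {"wh_Bronze": "placeholder-housepath"}}
print(resolve_target(sample, "Sandbox", "wh_Bronze"))

try:
    resolve_target(sample, "Sandbox", "wh_Silver")
except ValueError as exc:
    print(exc)
```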
Source code in src/fabricops_kit/fabric_io.py
def warehouse_write(df, env, target, schema, table, mode="append", config=None):
    """Write a Spark DataFrame to a Microsoft Fabric warehouse table.

    This uses Fabric Spark's `synapsesql` connector to write to a warehouse
    configured in the framework `CONFIG` mapping. Use this near the end of the
    Product step when publishing serving tables.

    Parameters
    ----------
    df : pyspark.sql.DataFrame
        Spark DataFrame to write.
    env : str
        Environment name in the config mapping, for example `"Sandbox"` or `"DE"`.
    target : str
        Warehouse target name under the selected environment, for example
        `"Warehouse"` or `"wh_Bronze"`.
    schema : str
        Warehouse schema name, for example `"dbo"`.
    table : str
        Warehouse table name.
    mode : str, default "append"
        Spark write mode, for example `"append"` or `"overwrite"`.
    config : dict, optional
        Config mapping from the config notebook. Expected shape:
        `config[environment][target] = Housepath(...)`.

    Returns
    -------
    None
        The DataFrame is written to the target warehouse table.

    Notes
    -----
    Side effect: performs a write operation to the target warehouse object via
    Fabric runtime connector APIs.

    Raises
    ------
    RuntimeError
        If the Microsoft Fabric Spark connector is unavailable.
    ValueError
        If the selected environment or target is missing from the config.

    Examples
    --------
    >>> warehouse_write(
    ...     df,
    ...     env="EDLH",
    ...     target="wh_Bronze",
    ...     schema="dbo",
    ...     table="Customer",
    ...     mode="append",
    ...     config=CONFIG,
    ... )
    """
    p = get_path(env, target, config=config)

    try:
        import com.microsoft.spark.fabric
        from com.microsoft.spark.fabric.Constants import Constants
    except Exception as exc:
        raise RuntimeError(
            "This function must run inside Microsoft Fabric Spark with "
            "com.microsoft.spark.fabric available."
        ) from exc

    (
        df.write.mode(mode)
        .option(Constants.WorkspaceId, p.workspace_id)
        .option(Constants.DatawarehouseId, p.house_id)
        .synapsesql(f"{p.house_name}.{schema}.{table}")
    )