warehouse_write
Write a Spark DataFrame to a Microsoft Fabric warehouse table.
This uses Fabric Spark's `synapsesql` connector to write to a warehouse
configured in the framework CONFIG mapping. Use this near the end of the
Product step when publishing serving tables.
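The connector addresses the destination as a three-part name, `<warehouse>.<schema>.<table>`. As a minimal sketch (the helper below is hypothetical, not part of `fabricops_kit`; the commented write call follows the Fabric connector's documented `df.write.synapsesql(...)` pattern but the exact call inside `warehouse_write` is an assumption):

```python
def qualified_table(target: str, schema: str, table: str) -> str:
    """Hypothetical helper: build the three-part name the
    synapsesql connector expects. Parameter names mirror
    warehouse_write's parameters."""
    return f"{target}.{schema}.{table}"

# On a Fabric Spark runtime the write itself would look roughly like:
#   import com.microsoft.spark.fabric  # enables df.write.synapsesql
#   df.write.mode(mode).synapsesql(qualified_table(target, schema, table))

print(qualified_table("wh_Bronze", "dbo", "Customer"))
# → wh_Bronze.dbo.Customer
```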
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `df` | `DataFrame` | Spark DataFrame to write. | required |
| `env` | `str` | Environment name in the config mapping, for example `"EDLH"`. | required |
| `target` | `str` | Warehouse target name under the selected environment, for example `"wh_Bronze"`. | required |
| `schema` | `str` | Warehouse schema name, for example `"dbo"`. | required |
| `table` | `str` | Warehouse table name. | required |
| `mode` | `str` | Spark write mode, for example `"append"`. | `"append"` |
| `config` | `dict` | Config mapping from the config notebook. | `None` |
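The expected shape of `config` is defined in the config notebook and is not reproduced here. As a purely hypothetical illustration (the keys below are assumptions, not the framework's actual schema), an environment-to-target mapping and a lookup that fails with `ValueError` might look like:

```python
# Hypothetical CONFIG shape -- the real keys come from the config
# notebook and may differ.
CONFIG = {
    "EDLH": {
        "wh_Bronze": {"workspace": "...", "warehouse": "wh_Bronze"},
    },
}

def resolve_target(config: dict, env: str, target: str) -> dict:
    """Illustrative lookup: raise ValueError when the environment or
    target is missing, as warehouse_write's Raises section describes."""
    try:
        return config[env][target]
    except KeyError as exc:
        raise ValueError(f"Unknown environment/target: {env}/{target}") from exc
```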
Returns:

| Type | Description |
|---|---|
| `None` | The DataFrame is written to the target warehouse table. |
Notes:
Side effect: performs a write operation to the target warehouse object via Fabric runtime connector APIs.
Raises:

| Type | Description |
|---|---|
| `RuntimeError` | If the Microsoft Fabric Spark connector is unavailable. |
| `ValueError` | If the selected environment or target is missing from the config. |
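A guard of this kind could produce the `RuntimeError` above. This is a sketch under the assumption that connector availability is detected via an import check; the module name shown is the one Fabric PySpark notebooks import, but treat the whole guard as illustrative rather than the library's actual implementation:

```python
import importlib

def ensure_fabric_connector(module: str = "com.microsoft.spark.fabric") -> None:
    # Hypothetical guard: raise RuntimeError when the Fabric Spark
    # connector cannot be imported, e.g. when running outside a
    # Fabric Spark runtime.
    try:
        importlib.import_module(module)
    except ImportError as exc:
        raise RuntimeError(
            "Microsoft Fabric Spark connector is unavailable"
        ) from exc
```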
Examples:
>>> warehouse_write(
... df,
... env="EDLH",
... target="wh_Bronze",
... schema="dbo",
... table="Customer",
... mode="append",
... config=CONFIG,
... )
Source code in src/fabricops_kit/fabric_io.py