lakehouse_table_read

Public callable

Read a Delta table from a Fabric lakehouse.

This reads from the lakehouse Tables/ area using the ABFSS root stored in a Housepath. In the notebook lifecycle, call this near the start of the Source or Unified step when loading Delta-backed source datasets.
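For illustration, with a hypothetical ABFSS root (the exact value depends on your workspace and lakehouse names), the helper resolves the table path like this:

>>> lh_source.root  # illustrative value only
'abfss://Sandbox@onelake.dfs.fabric.microsoft.com/Source.Lakehouse'
>>> # lakehouse_table_read(lh_source, "RAW_ORDERS") then reads from:
>>> # abfss://Sandbox@onelake.dfs.fabric.microsoft.com/Source.Lakehouse/Tables/RAW_ORDERS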

Parameters:

Name           Type       Description                                             Default
lh             Housepath  Lakehouse path object returned by get_path.             required
tablename      str        Name of the table under the lakehouse Tables/ folder.   required
spark_session  object     Spark session to use. If omitted, the helper uses the   None
                          notebook global spark.

Returns:

Type       Description
DataFrame  Spark DataFrame loaded from the Delta table.

Raises:

Type          Description
ValueError    If lh.root or tablename is missing.
RuntimeError  If no Spark session is available.

Examples:

>>> lh_source = get_path("Sandbox", "Source", config=CONFIG)
>>> df = lakehouse_table_read(lh_source, "RAW_ORDERS")
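To pass an explicit session instead of relying on the notebook global, a minimal sketch (assuming a standard PySpark SparkSession object; any live session works):

>>> from pyspark.sql import SparkSession
>>> spark = SparkSession.builder.getOrCreate()
>>> df = lakehouse_table_read(lh_source, "RAW_ORDERS", spark_session=spark)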
Source code in src/fabricops_kit/fabric_io.py
def lakehouse_table_read(lh, tablename, spark_session=None):
    """Read a Delta table from a Fabric lakehouse.

    This reads from the lakehouse `Tables/` area using the ABFSS root stored in
    a `Housepath`. In the notebook lifecycle, call this near the start of the
    Source or Unified step when loading Delta-backed source datasets.

    Parameters
    ----------
    lh : Housepath
        Lakehouse path object returned by `get_path`.
    tablename : str
        Name of the table under the lakehouse `Tables/` folder.
    spark_session : object, optional
        Spark session to use. If omitted, the helper uses the notebook global
        `spark`.

    Returns
    -------
    pyspark.sql.DataFrame
        Spark DataFrame loaded from the Delta table.

    Raises
    ------
    ValueError
        If `lh.root` or `tablename` is missing.
    RuntimeError
        If no Spark session is available.

    Examples
    --------
    >>> lh_source = get_path("Sandbox", "Source", config=CONFIG)
    >>> df = lakehouse_table_read(lh_source, "RAW_ORDERS")
    """
    if not getattr(lh, "root", None):
        raise ValueError("lh.root is required.")
    if not tablename:
        raise ValueError("tablename is required.")

    spark_obj = _get_spark(spark_session)
    path = f"{lh.root.rstrip('/')}/Tables/{tablename}"
    return spark_obj.read.format("delta").load(path)
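The private helper _get_spark is not shown on this page. A minimal sketch of what it plausibly does, assuming it prefers an explicitly supplied session and otherwise falls back to the active notebook session; the name and exact behavior here are assumptions, not the library's confirmed implementation:

from pyspark.sql import SparkSession

def _get_spark(spark_session=None):
    # Hypothetical sketch; the real helper lives in fabricops_kit.fabric_io.
    if spark_session is not None:
        # An explicitly supplied session always wins.
        return spark_session
    # Fall back to the active session, which in a Fabric notebook is the
    # global `spark`. Raise the documented RuntimeError otherwise.
    active = SparkSession.getActiveSession()
    if active is None:
        raise RuntimeError("No Spark session is available.")
    return active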