
_get_spark

Internal helper
This page documents an internal implementation helper, not a primary public API.

Return an explicit Spark session or the active notebook global spark.

Most Fabric notebooks already expose a global spark object. Tests and local scripts can pass spark_session explicitly to avoid relying on the notebook runtime.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `spark_session` | `object` | Spark session to use instead of the notebook global `spark`. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `object` | Spark session object. |

Raises:

| Type | Description |
| --- | --- |
| `RuntimeError` | If no Spark session is passed and no global `spark` object exists. |

Source code in src/fabricops_kit/fabric_io.py
def _get_spark(spark_session=None):
    """Return an explicit Spark session or the active notebook global `spark`.

    Most Fabric notebooks already expose a global `spark` object. Tests and
    local scripts can pass `spark_session` explicitly to avoid relying on the
    notebook runtime.

    Parameters
    ----------
    spark_session : object, optional
        Spark session to use instead of the notebook global `spark`.

    Returns
    -------
    object
        Spark session object.

    Raises
    ------
    RuntimeError
        If no Spark session is passed and no global `spark` object exists.
    """
    if spark_session is not None:
        return spark_session
    try:
        return globals()["spark"]
    except KeyError as exc:
        raise RuntimeError(
            "Spark session was not provided and global 'spark' was not found. "
            "Run this inside Fabric/Spark or pass spark_session explicitly."
        ) from exc