# easyfabric.fabric.logging_utils
Module-level constants: `MAX_MESSAGE_LEN`, `LOG_PATTERN`, `TIMESTAMP_PATTERN`, `GUID_PATTERN`, `EXCLUDED_LOGGERS`, `LOG_SCHEMA`.
#### is_top_level_notebook

```python
def is_top_level_notebook() -> bool
```

Returns True if the current execution is the top-most notebook.
#### save_log_file_to_table

```python
def save_log_file_to_table(end_log: bool = False) -> None
```

Reads a log file from OneLake, parses structured logs (including multiline entries), and bulk-inserts them into `Meta.dbo.logging` via a Spark DataFrame.

Always logs an END entry with the duration of the current notebook. If the notebook is top-level or `end_log` is True, it also clears logging handlers, resets state, and persists log entries to the table.
#### save_historical_log_file_to_table

```python
def save_historical_log_file_to_table(abfs_path: str) -> None
```

Parses a specific log file by ABFS path and inserts missing logs into `Meta.dbo.logging`.
## FabricLoggerAdapter Objects

```python
class FabricLoggerAdapter(logging.LoggerAdapter)
```

Adapter that automatically includes `log_type` and `log_category` in all log records.

#### \_\_init\_\_

```python
def __init__(logger: logging.Logger,
             log_type: Optional[str] = "IN PROCESS",
             log_category: Optional[str] = "Technical")
```

#### process

```python
def process(msg, kwargs)
```
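A minimal sketch of how such an adapter can inject default fields, assuming it merges its defaults into each call's `extra` dict (the real module may differ in details such as the default values):

```python
import logging
from typing import Optional

class FabricLoggerAdapter(logging.LoggerAdapter):
    """Sketch: attach log_type / log_category to every record."""

    def __init__(self, logger: logging.Logger,
                 log_type: Optional[str] = "IN PROCESS",
                 log_category: Optional[str] = "Technical"):
        super().__init__(logger, {"log_type": log_type,
                                  "log_category": log_category})

    def process(self, msg, kwargs):
        # Merge adapter defaults into the call's extra; per-call values
        # override the adapter-level defaults.
        kwargs["extra"] = {**self.extra, **kwargs.get("extra", {})}
        return msg, kwargs
```

Because the fields arrive via `extra`, they become attributes on the `LogRecord` and can be referenced in format strings such as `%(log_type)s`.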
## OneLakeFileHandler Objects

```python
class OneLakeFileHandler(logging.Handler)
```

Class attributes: `MAX_RETRIES`, `BASE_DELAY`.

#### \_\_init\_\_

```python
def __init__(path)
```

#### emit

```python
def emit(record)
```
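Given the `MAX_RETRIES` and `BASE_DELAY` attributes, `emit` presumably retries transient write failures with backoff. A hedged sketch of that pattern, substituting a plain local file append for the actual OneLake write (the attribute values and backoff formula are assumptions):

```python
import logging
import random
import time

class RetryingFileHandler(logging.Handler):
    """Sketch of a handler that retries transient write failures
    with exponential backoff and jitter."""

    MAX_RETRIES = 3   # assumed value
    BASE_DELAY = 0.5  # assumed value, in seconds

    def __init__(self, path):
        super().__init__()
        self.path = path

    def emit(self, record):
        msg = self.format(record)
        for attempt in range(self.MAX_RETRIES):
            try:
                with open(self.path, "a", encoding="utf-8") as f:
                    f.write(msg + "\n")
                return
            except OSError:
                if attempt == self.MAX_RETRIES - 1:
                    # Out of retries: defer to logging's error handling
                    # so a broken sink never crashes the caller.
                    self.handleError(record)
                    return
                # Exponential backoff (BASE_DELAY * 2**attempt) plus jitter.
                time.sleep(self.BASE_DELAY * (2 ** attempt)
                           + random.random() * 0.1)
```

Routing failures through `handleError` (rather than raising) matches the stdlib convention that logging must never take down the application.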
## SafeFormatter Objects

```python
class SafeFormatter(logging.Formatter)
```

Formatter that ensures custom fields exist to prevent KeyErrors from third-party libraries.

#### format

```python
def format(record)
```

#### formatTime

```python
def formatTime(record, datefmt=None)
```

Includes milliseconds in the timestamp for better sorting.
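A sketch of the two behaviours described above: backfilling custom fields so records from third-party loggers never raise `KeyError`, and emitting millisecond-precision timestamps. The field names and defaults here are assumptions:

```python
import logging
from datetime import datetime, timezone

class SafeFormatter(logging.Formatter):
    """Sketch: backfill custom fields and use millisecond timestamps."""

    # Assumed defaults for records that lack the custom fields.
    DEFAULT_FIELDS = {"log_type": "IN PROCESS", "log_category": "Technical"}

    def format(self, record):
        # Third-party records won't carry the extra fields; add them
        # so format strings like "%(log_type)s" never fail.
        for field, default in self.DEFAULT_FIELDS.items():
            if not hasattr(record, field):
                setattr(record, field, default)
        return super().format(record)

    def formatTime(self, record, datefmt=None):
        # Millisecond precision keeps interleaved log lines sortable.
        dt = datetime.fromtimestamp(record.created, tz=timezone.utc)
        return dt.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]
```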
#### to_snake_case

```python
def to_snake_case(string: str) -> str
```

Convert a string from camel case to snake case.
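The exact regex is not shown; a common two-pass implementation that also handles acronym runs looks like this:

```python
import re

def to_snake_case(string: str) -> str:
    """Sketch: camelCase / PascalCase -> snake_case."""
    # Split an uppercase letter followed by lowercase off the
    # preceding character ("HTTPServer" -> "HTTP_Server").
    s = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", string)
    # Split remaining lower/digit-to-upper boundaries
    # ("logType" -> "log_Type").
    s = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s)
    return s.lower()
```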
#### set_verbose_mode

```python
def set_verbose_mode(enabled=True)
```

Enable or disable verbose logging mode globally.
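One plausible shape for this, assuming a module-level flag that flips the package logger between `INFO` and `DEBUG` (the logger name and levels here are assumptions):

```python
import logging

_VERBOSE = False  # hypothetical module-level state

def set_verbose_mode(enabled=True):
    """Sketch: toggle DEBUG-level logging globally."""
    global _VERBOSE
    _VERBOSE = enabled
    logging.getLogger("easyfabric").setLevel(
        logging.DEBUG if enabled else logging.INFO)
```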
#### log_segment

```python
@contextlib.contextmanager
def log_segment(type: str, name: str)
```

Context manager that logs the start and end of a logic segment.

Usage:

```python
with log_segment("Data Load", "Bronze Loading"):
    ...  # logic
```
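A minimal implementation sketch of such a segment logger, assuming START/END entries with elapsed time and an error entry on failure (the real module's message format may differ):

```python
import contextlib
import logging
import time

logger = logging.getLogger("easyfabric")  # hypothetical module logger

@contextlib.contextmanager
def log_segment(type: str, name: str):
    """Sketch: log START/END around a block, with elapsed time."""
    start = time.perf_counter()
    logger.info("START %s: %s", type, name)
    try:
        yield
    except Exception:
        # Record the failure with traceback, then re-raise so the
        # caller still sees the original exception.
        logger.exception("FAILED %s: %s", type, name)
        raise
    else:
        elapsed = time.perf_counter() - start
        logger.info("END %s: %s (%.2fs)", type, name, elapsed)
```

Re-raising inside the `except` keeps the context manager transparent to error handling while still leaving a FAILED entry in the log.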
#### init_logging

```python
def init_logging(log_source: str = "Sys", log_object: str = None) -> str
```

Call once at the very top of the entry-point notebook / wheel. Returns the absolute OneLake path of the log file.
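Since nested notebooks must reuse the entry point's log file, initialisation is presumably idempotent. A heavily simplified sketch of that pattern — the path layout, naming scheme, and stand-in `StreamHandler` are all assumptions (the real function targets OneLake):

```python
import logging
import uuid
from datetime import datetime, timezone

_LOG_FILE_PATH = None  # hypothetical singleton state

def init_logging(log_source: str = "Sys", log_object: str = None) -> str:
    """Sketch: build a unique log-file name once, attach handlers once,
    and return the same path on every subsequent call."""
    global _LOG_FILE_PATH
    if _LOG_FILE_PATH is not None:
        return _LOG_FILE_PATH  # already initialised (e.g. nested notebook)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
    name = f"{log_source}_{log_object or 'run'}_{stamp}_{uuid.uuid4().hex[:8]}.log"
    _LOG_FILE_PATH = f"Files/logs/{name}"  # hypothetical layout
    handler = logging.StreamHandler()  # stand-in for OneLakeFileHandler
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logging.getLogger("easyfabric").addHandler(handler)
    return _LOG_FILE_PATH
```

The early return is what makes "call once at the very top" safe even when child notebooks call it again.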
#### get_log_file_path

```python
def get_log_file_path() -> Optional[str]
```

Returns the path of the current log file. Checks singleton state, global config, and active handlers to ensure reliability even in nested notebooks or after module reloads.
#### extract_real_error

```python
def extract_real_error(log_text: str) -> str
```

Extracts the most relevant error message from a Spark stack trace.
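The actual heuristics are not documented here; one plausible approach for JVM/Spark traces is to prefer the innermost `Caused by:` line, fall back to any exception-looking line, and otherwise return the last line:

```python
import re

def extract_real_error(log_text: str) -> str:
    """Sketch: pull the root-cause message out of a Spark/JVM trace."""
    # The innermost "Caused by:" is usually the real root cause.
    caused = re.findall(r"Caused by: (.+)", log_text)
    if caused:
        return caused[-1].strip()
    # Otherwise take the first line that looks like an exception.
    for line in log_text.splitlines():
        if re.search(r"\w+(Error|Exception): ", line):
            return line.strip()
    # Last resort: the final non-empty line of the text.
    stripped = log_text.strip()
    return stripped.splitlines()[-1] if stripped else ""
```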