# easyfabric.fabric.logging_utils
Module-level attributes:

- `activity_id`
- `config`
- `get_spark`
- `spark`
- `MAX_MESSAGE_LEN`
- `LOG_PATTERN`
- `TIMESTAMP_PATTERN`
- `LOG_SCHEMA`
#### get_current_datetime

```python
def get_current_datetime()
```

#### get_current_date

```python
def get_current_date()
```

#### get_current_time

```python
def get_current_time()
```
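A minimal sketch of how such helpers are commonly implemented; the exact format strings used by easyfabric are an assumption:

```python
from datetime import datetime, timezone


def get_current_datetime() -> str:
    # UTC timestamp; the "%Y-%m-%d %H:%M:%S" format is an assumption
    return datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")


def get_current_date() -> str:
    return datetime.now(timezone.utc).strftime("%Y-%m-%d")


def get_current_time() -> str:
    return datetime.now(timezone.utc).strftime("%H:%M:%S")
```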
#### save_log_file_to_table

```python
def save_log_file_to_table() -> None
```

Reads a log file from OneLake, parses structured log entries (including multiline messages), and bulk-inserts them into `Meta.dbo.logging` via a Spark DataFrame.
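The parsing step can be sketched as follows. The line format and this `LOG_PATTERN` are assumptions, and the Spark write is replaced by returning plain dicts so the logic is self-contained:

```python
import re

# Assumed line format: "2024-01-01 12:00:00 | INFO | Source | Object | message"
LOG_PATTERN = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) \| "
    r"(?P<level>\w+) \| (?P<source>[^|]+) \| (?P<object>[^|]+) \| (?P<msg>.*)$"
)


def parse_log_text(text: str) -> list[dict]:
    rows = []
    for line in text.splitlines():
        m = LOG_PATTERN.match(line)
        if m:
            rows.append(m.groupdict())
        elif rows:
            # Line without a timestamp prefix (e.g. a stack trace):
            # fold it into the previous entry's message
            rows[-1]["msg"] += "\n" + line
    return rows
```

In the real function the resulting rows would be fed to `spark.createDataFrame` with `LOG_SCHEMA` before the bulk insert.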
## OneLakeFileHandler Objects

```python
class OneLakeFileHandler(logging.Handler)
```

#### MAX_RETRIES

#### BASE_DELAY

#### \_\_init\_\_

```python
def __init__(path)
```

#### emit

```python
def emit(record)
```
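A simplified sketch of such a handler: write to a path, retrying with exponential backoff on failure. The retry constants are assumptions, and a local file stands in for OneLake:

```python
import logging
import random
import time


class RetryingFileHandler(logging.Handler):
    MAX_RETRIES = 3     # assumed value
    BASE_DELAY = 0.1    # assumed value, seconds

    def __init__(self, path):
        super().__init__()
        self.path = path

    def emit(self, record):
        msg = self.format(record)
        for attempt in range(self.MAX_RETRIES):
            try:
                with open(self.path, "a", encoding="utf-8") as f:
                    f.write(msg + "\n")
                return
            except OSError:
                # Exponential backoff with a little jitter before retrying
                time.sleep(self.BASE_DELAY * (2 ** attempt) + random.random() * 0.01)
        self.handleError(record)
```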
## SafeFormatter Objects

```python
class SafeFormatter(logging.Formatter)
```

Formatter that ensures the custom record fields exist, preventing `KeyError`s when log records originate from third-party libraries.

#### format

```python
def format(record)
```
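The idea can be sketched like this; the defaulted field names (`custom_source`, `custom_objectname`) are assumptions inferred from the filters documented below:

```python
import logging


class SafeFormatter(logging.Formatter):
    # Assumed custom fields; records from third-party libraries won't carry them
    DEFAULTS = {"custom_source": "Sys", "custom_objectname": "Sys"}

    def format(self, record):
        # Fill in missing fields so the format string never raises KeyError
        for key, value in self.DEFAULTS.items():
            if not hasattr(record, key):
                setattr(record, key, value)
        return super().format(record)
```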
## CustomObjectFilter Objects

```python
class CustomObjectFilter(logging.Filter)
```

#### \_\_init\_\_

```python
def __init__(custom_objectname)
```

#### filter

```python
def filter(record)
```
## CustomSourceFilter Objects

```python
class CustomSourceFilter(logging.Filter)
```

#### \_\_init\_\_

```python
def __init__(custom_source)
```

#### filter

```python
def filter(record)
```
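Both filters likely follow the same pattern: stamp a fixed value onto every record so the formatter can rely on it. A sketch of `CustomObjectFilter` under that assumption (`CustomSourceFilter` would be analogous with `custom_source`):

```python
import logging


class CustomObjectFilter(logging.Filter):
    def __init__(self, custom_objectname):
        super().__init__()
        self.custom_objectname = custom_objectname

    def filter(self, record):
        # Attach the object name; returning True keeps the record
        record.custom_objectname = self.custom_objectname
        return True
```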
#### to_snake_case

```python
def to_snake_case(string: str) -> str
```

Convert a string from camelCase to snake_case.
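A common regex-based sketch of such a conversion (the library's exact rules, e.g. for acronyms, are an assumption):

```python
import re


def to_snake_case(string: str) -> str:
    # Split before a capital that starts a new word (handles acronym
    # boundaries like "HTTPServer"), then before any lower/digit-to-upper
    # transition, and lowercase the result
    s = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", string)
    s = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s)
    return s.lower()
```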
#### set_verbose_mode

```python
def set_verbose_mode(enabled=True)
```

Enable or disable verbose logging mode globally.
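One way such a global toggle is typically implemented; the logger name and the DEBUG/INFO level choice are assumptions:

```python
import logging

VERBOSE = False


def set_verbose_mode(enabled=True):
    # Flip the package logger between DEBUG and INFO for the whole process
    global VERBOSE
    VERBOSE = enabled
    logging.getLogger("easyfabric").setLevel(
        logging.DEBUG if enabled else logging.INFO
    )
```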
#### get_logging_defaults

```python
def get_logging_defaults(log_file: str)
```

Get the current logging defaults.
#### init_logging

```python
def init_logging(log_source: str = "Sys",
                 log_object: Optional[str] = None) -> str
```

Call once at the very top of the entry-point notebook / wheel. Returns the absolute OneLake path of the log file.
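The per-run path that `init_logging` returns plausibly combines a timestamp with a unique id so concurrent runs never collide. A sketch of that naming scheme; the directory layout and name format are assumptions:

```python
import uuid
from datetime import datetime, timezone


def build_log_file_path(base_dir: str, log_source: str) -> str:
    # One file per run: a timestamp plus a short random suffix avoids
    # collisions between runs started in the same second
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
    run_id = uuid.uuid4().hex[:8]
    return f"{base_dir}/{log_source}_{stamp}_{run_id}.log"
```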
#### get_log_file_path

```python
def get_log_file_path() -> Optional[str]
```

Return the log file path used by the current run, or `None` if logging has not been initialized.
#### set_logging_defaults

```python
def set_logging_defaults(log_source: str = "Sys",
                         log_object: str = "Sys",
                         log_file: Optional[str] = None) -> logging.Logger
```

Configure default logging settings.

**Arguments**:

- `log_source` - The source of the log.
- `log_object` - The name of the object to log.
- `log_file` - Path to the log file. If `None`, a temporary file will be used.
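Pieced together from the classes documented above, a configuration along these lines would satisfy that contract. The handler type, logger name, and format string are assumptions:

```python
import logging
import tempfile


def set_logging_defaults(log_source="Sys", log_object="Sys", log_file=None):
    if log_file is None:
        # Fall back to a temporary file when no path is given
        log_file = tempfile.NamedTemporaryFile(suffix=".log", delete=False).name
    logger = logging.getLogger("easyfabric")
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(log_file, encoding="utf-8")
    handler.setFormatter(logging.Formatter(
        "%(asctime)s | %(levelname)s | " + log_source + " | "
        + log_object + " | %(message)s"
    ))
    logger.handlers = [handler]  # replace rather than stack on repeated calls
    return logger
```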
#### extract_real_error

```python
def extract_real_error(log_text: str) -> str
```

Extracts the most relevant error message from a Spark stack trace.
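A sketch of one common heuristic for this: prefer the deepest `Caused by:` entry in a JVM/Spark trace (usually the root cause), fall back to a Python-style error line, then to the last non-empty line. The actual heuristics used by the library are an assumption:

```python
import re


def extract_real_error(log_text: str) -> str:
    # The deepest "Caused by:" in a JVM stack trace is usually the root cause
    caused = re.findall(r"Caused by:\s*(.+)", log_text)
    if caused:
        return caused[-1].strip()
    # Fall back to a Python-style "SomeError: message" line
    m = re.search(r"^\w+(?:Error|Exception):.*$", log_text, re.MULTILINE)
    if m:
        return m.group(0).strip()
    # Last resort: the final non-empty line of the log text
    lines = [l for l in log_text.splitlines() if l.strip()]
    return lines[-1].strip() if lines else ""
```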