I'm building an application with a single entry point. The user has 10 parameters to fill in, of which only 3 are optional. These parameters are collected in an object called ContextConfig. When the application starts, those parameters are distributed to 3 objects, and the different objects take overlapping subsets of the same parameters. Is there a pattern for this?
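For reference, my ContextConfig is called DeltaContextConfig. A rough sketch of it, with the field names taken from the code below (the types are assumptions for illustration; three of the fields are optional in my real code, and I've shown two of them with defaults just as an example):

from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class DeltaContextConfig:
    spark_df: Any                  # Spark DataFrame
    spark_context: Any             # SparkSession / SparkContext
    table_name: str
    database_name: str
    partition_column: str
    destination_path: str
    column_to_order: str
    primary_key: str
    files_count_first_batch: int
    operation_type: str
    ideal_file_size: int
    compaction_interval_time: int
    update_condition: Optional[str] = None   # assumed optional
    set_expression: Optional[str] = None     # assumed optional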
class DeltaContext:
    def __init__(self, delta_context: DeltaContextConfig):
        self.delta_context = delta_context
        # Hive registration needs 6 of the shared parameters.
        self.hive_operations = HiveOperations(
            hive_config=HiveConfig(
                spark_df=self.delta_context.spark_df,
                table_name=self.delta_context.table_name,
                database_name=self.delta_context.database_name,
                partition_column=self.delta_context.partition_column,
                destination_path=self.delta_context.destination_path,
                spark_context=self.delta_context.spark_context,
            )
        )
        # The writer repeats 4 of those parameters, plus 6 of its own.
        self.delta_writer = DeltaWriter(
            delta_write_config=DeltaWriteConfig(
                spark_df=self.delta_context.spark_df,
                column_to_order=self.delta_context.column_to_order,
                destination_path=self.delta_context.destination_path,
                partition_column=self.delta_context.partition_column,
                primary_key=self.delta_context.primary_key,
                files_count_first_batch=self.delta_context.files_count_first_batch,
                spark_context=self.delta_context.spark_context,
                operation_type=self.delta_context.operation_type,
                update_condition=self.delta_context.update_condition,
                set_expression=self.delta_context.set_expression,
            )
        )
        # Compaction repeats 3 of the shared parameters, plus 2 of its own.
        self.auto_compaction = AutoCompaction(
            auto_compaction_config=AutoCompactionConfig(
                spark_context=self.delta_context.spark_context,
                destination_path=self.delta_context.destination_path,
                partition_column=self.delta_context.partition_column,
                ideal_file_size=self.delta_context.ideal_file_size,
                compaction_interval_time=self.delta_context.compaction_interval_time,
            )
        )
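One direction I've been sketching, just to show the kind of thing I mean (this assumes every sub-config is a dataclass whose field names exactly match attributes on DeltaContextConfig, which is an assumption; my real classes may differ): derive each sub-config from the master config by reflection instead of wiring every field by hand.

import dataclasses

def build_sub_config(master_config, sub_config_cls):
    # Instantiate sub_config_cls by pulling the attributes it declares
    # straight off the master config (assumes matching field names).
    field_names = (f.name for f in dataclasses.fields(sub_config_cls))
    return sub_config_cls(**{name: getattr(master_config, name) for name in field_names})

# e.g. hive_config = build_sub_config(self.delta_context, HiveConfig)

But this feels fragile if the field names ever diverge, which is why I'm wondering whether there is an established pattern for it.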
How could I improve this?
Thank you.