LightningIRCLI
- class lightning_ir.main.LightningIRCLI(model_class: type[LightningModule] | Callable[..., LightningModule] | None = None, datamodule_class: type[LightningDataModule] | Callable[..., LightningDataModule] | None = None, save_config_callback: type[SaveConfigCallback] | None = SaveConfigCallback, save_config_kwargs: dict[str, Any] | None = None, trainer_class: type[Trainer] | Callable[..., Trainer] = Trainer, trainer_defaults: dict[str, Any] | None = None, seed_everything_default: bool | int = True, parser_kwargs: dict[str, Any] | dict[str, dict[str, Any]] | None = None, subclass_mode_model: bool = False, subclass_mode_data: bool = False, args: list[str] | dict[str, Any] | Namespace | None = None, run: bool = True, auto_configure_optimizers: bool = True)
Bases: LightningCLI
- __init__(model_class: type[LightningModule] | Callable[..., LightningModule] | None = None, datamodule_class: type[LightningDataModule] | Callable[..., LightningDataModule] | None = None, save_config_callback: type[SaveConfigCallback] | None = SaveConfigCallback, save_config_kwargs: dict[str, Any] | None = None, trainer_class: type[Trainer] | Callable[..., Trainer] = Trainer, trainer_defaults: dict[str, Any] | None = None, seed_everything_default: bool | int = True, parser_kwargs: dict[str, Any] | dict[str, dict[str, Any]] | None = None, subclass_mode_model: bool = False, subclass_mode_data: bool = False, args: list[str] | dict[str, Any] | Namespace | None = None, run: bool = True, auto_configure_optimizers: bool = True) → None
Receives as input pytorch-lightning classes (or callables which return pytorch-lightning classes), which are called / instantiated using a parsed configuration file and / or command line args.
Parsing of configuration from environment variables can be enabled by setting parser_kwargs={"default_env": True}. A full configuration YAML is parsed from PL_CONFIG if set. Individual settings are likewise parsed from variables named, for example, PL_TRAINER__MAX_EPOCHS. For more info, read the CLI docs.
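A hedged sketch of enabling this (standard LightningCLI behaviour; the import path follows the class path shown above)::

    from lightning_ir.main import LightningIRCLI

    # With default_env enabled, a full YAML config in the PL_CONFIG environment
    # variable, or individual settings such as PL_TRAINER__MAX_EPOCHS=10, are
    # read when the CLI parses its arguments.
    cli = LightningIRCLI(parser_kwargs={"default_env": True})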
- Parameters:
  - model_class – An optional LightningModule class to train on or a callable which returns a LightningModule instance when called. If None, you can pass a registered model with --model=MyModel.
  - datamodule_class – An optional LightningDataModule class or a callable which returns a LightningDataModule instance when called. If None, you can pass a registered datamodule with --data=MyDataModule.
  - save_config_callback – A callback class to save the config.
  - save_config_kwargs – Parameters that will be used to instantiate the save_config_callback.
  - trainer_class – An optional subclass of the Trainer class or a callable which returns a Trainer instance when called.
  - trainer_defaults – Set to override Trainer defaults or add persistent callbacks. The callbacks added through this argument will not be configurable from a configuration file and will always be present for this particular CLI. Alternatively, configurable callbacks can be added as explained in the CLI docs.
  - seed_everything_default – Number for the seed_everything() seed value. Set to True to automatically choose a seed value. Setting it to False will avoid calling seed_everything.
  - parser_kwargs – Additional arguments to instantiate each LightningArgumentParser.
  - subclass_mode_model – Whether model can be any subclass of the given class.
  - subclass_mode_data – Whether datamodule can be any subclass of the given class.
  - args – Arguments to parse. If None, the arguments are taken from sys.argv. Command line style arguments can be given in a list. Alternatively, structured config options can be given in a dict or jsonargparse.Namespace.
  - run – Whether subcommands should be added to run a Trainer method. If set to False, the trainer and model classes will be instantiated only (see the sketch after this list).
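With run=False, no subcommand is executed, so the parsed and instantiated objects can be driven manually. A minimal sketch, assuming hypothetical MyModel and MyDataModule placeholder classes::

    from lightning_ir.main import LightningIRCLI

    # MyModel / MyDataModule are hypothetical stand-ins for your own
    # LightningModule and LightningDataModule subclasses.
    cli = LightningIRCLI(MyModel, MyDataModule, run=False)

    # Everything is parsed and instantiated; invoke a Trainer method directly.
    cli.trainer.fit(cli.model, datamodule=cli.datamodule)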
Methods
- __init__([model_class, datamodule_class, ...]) – Receives as input pytorch-lightning classes (or callables which return pytorch-lightning classes), which are called / instantiated using a parsed configuration file and / or command line args.
- add_arguments_to_parser(parser)
- add_core_arguments_to_parser(parser) – Adds arguments from the core classes to the parser.
- add_default_arguments_to_parser(parser) – Adds default arguments to the parser.
- after_instantiate_classes() – Implement to run some code after instantiating the classes (see the sketch after this table).
- before_instantiate_classes() – Implement to run some code before instantiating the classes.
- configure_optimizers(lightning_module, optimizer)
- init_parser(**kwargs) – Method that instantiates the argument parser.
- instantiate_classes() – Instantiates the classes and sets their attributes.
- instantiate_trainer(**kwargs) – Instantiates the trainer.
- link_optimizers_and_lr_schedulers(parser) – Creates argument links for optimizers and learning rate schedulers that specified a link_to.
- parse_arguments(parser, args) – Parses command line arguments and stores them in self.config.
- setup_parser(add_subcommands, main_kwargs, ...) – Initialize and setup the parser, subcommands, and arguments.
- subcommands()
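The two instantiation hooks in the table can be overridden in a subclass to inject custom logic around instantiate_classes. A hedged sketch; MyCLI and its print statements are illustrative only::

    from lightning_ir.main import LightningIRCLI

    class MyCLI(LightningIRCLI):
        def before_instantiate_classes(self) -> None:
            # Runs after argument parsing, before the model, datamodule, and
            # trainer are created.
            print("parsed config:", self.config)

        def after_instantiate_classes(self) -> None:
            # Runs once instantiate_classes has set self.model and friends.
            print("instantiated model:", type(self.model).__name__)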
- add_core_arguments_to_parser(parser: LightningArgumentParser) → None
Adds arguments from the core classes to the parser.
- add_default_arguments_to_parser(parser: LightningArgumentParser) → None
Adds default arguments to the parser.
- instantiate_trainer(**kwargs: Any) → Trainer
Instantiates the trainer.
- Parameters:
kwargs – Any custom trainer arguments.
- static link_optimizers_and_lr_schedulers(parser: LightningArgumentParser) → None
Creates argument links for optimizers and learning rate schedulers that specified a link_to.
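This static method consumes links declared on the parser, e.g. via add_optimizer_args. A hedged sketch of declaring such a link in add_arguments_to_parser, assuming a model whose __init__ accepts a hypothetical optimizer_init argument::

    import torch
    from lightning.pytorch.cli import LightningArgumentParser

    from lightning_ir.main import LightningIRCLI

    class MyCLI(LightningIRCLI):
        def add_arguments_to_parser(self, parser: LightningArgumentParser) -> None:
            # Register Adam's arguments under optimizer.* and link the parsed
            # values to the model's (hypothetical) optimizer_init parameter;
            # link_optimizers_and_lr_schedulers then creates the actual links.
            parser.add_optimizer_args(torch.optim.Adam, link_to="model.optimizer_init")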