
Training utilities

Utilities


get_lr_scheduler

 get_lr_scheduler (cfg, lr)

Returns a LearningRateSchedule.

Supports cosine_decay, exponential_decay and cosine_decay_restarts.

          Type                                      Details
cfg       omegaconf.DictConfig                      Hydra configuration
lr        str                                       Learning rate
Returns   keras.api._v2.keras.optimizers.schedules  A LearningRateSchedule
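A minimal sketch of how such a scheduler factory could dispatch on the Hydra config. The config keys used here (lr_scheduler, decay_steps, decay_rate) are assumptions for illustration, not the project's actual schema:

```python
# Sketch only: dispatches on an assumed cfg["lr_scheduler"] key; the real
# project's Hydra config schema may differ.
import tensorflow as tf

def get_lr_scheduler(cfg, lr):
    lr = float(lr)  # lr is documented as a str
    name = cfg["lr_scheduler"]
    if name == "cosine_decay":
        return tf.keras.optimizers.schedules.CosineDecay(
            initial_learning_rate=lr, decay_steps=cfg["decay_steps"])
    if name == "exponential_decay":
        return tf.keras.optimizers.schedules.ExponentialDecay(
            initial_learning_rate=lr,
            decay_steps=cfg["decay_steps"],
            decay_rate=cfg["decay_rate"])
    if name == "cosine_decay_restarts":
        return tf.keras.optimizers.schedules.CosineDecayRestarts(
            initial_learning_rate=lr, first_decay_steps=cfg["decay_steps"])
    raise ValueError(f"Unknown scheduler: {name}")
```

The returned schedule can be passed directly as the learning_rate argument of any Keras optimizer.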


get_model_weights

 get_model_weights (train_ds:tensorflow.python.data.ops.dataset_ops.DatasetV2)

Returns a dict of class weights computed from the training dataset, useful for counteracting class imbalance.

          Type       Details
train_ds  DatasetV2  TensorFlow Dataset
Returns   dict       Dictionary of class weights
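Class weights like these are commonly inverse-frequency weights. A small TensorFlow-free illustration of the arithmetic (the helper name is hypothetical):

```python
from collections import Counter

def class_weights_from_labels(labels):
    # weight_c = total / (n_classes * count_c): rare classes get larger
    # weights, so the loss penalizes mistakes on them more heavily.
    counts = Counter(labels)
    total = sum(counts.values())
    n_classes = len(counts)
    return {c: total / (n_classes * k) for c, k in counts.items()}
```

In the real function the labels would be gathered by iterating over train_ds; the resulting dict can then be passed to model.fit via its class_weight argument.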


get_optimizer

 get_optimizer (cfg, lr:str)

Get an optimizer configured with the given learning rate.

          Type                            Details
cfg       omegaconf.DictConfig            Hydra configuration
lr        str                             Learning rate
Returns   keras.api._v2.keras.optimizers  TensorFlow optimizer
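A sketch of what the optimizer lookup could look like; the config key ("optimizer") and the supported set of optimizers are assumptions for illustration:

```python
# Sketch only: maps an assumed cfg["optimizer"] name to a Keras optimizer
# class; the project's actual config schema and supported set may differ.
import tensorflow as tf

def get_optimizer(cfg, lr: str):
    optimizers = {
        "adam": tf.keras.optimizers.Adam,
        "sgd": tf.keras.optimizers.SGD,
        "rmsprop": tf.keras.optimizers.RMSprop,
    }
    name = cfg["optimizer"].lower()
    if name not in optimizers:
        raise ValueError(f"Unknown optimizer: {name}")
    # lr is documented as a str, so convert before handing it to Keras
    return optimizers[name](learning_rate=float(lr))
```

The learning_rate argument also accepts a LearningRateSchedule, so the result of get_lr_scheduler could be passed here in place of a fixed float.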