
Conversation


@Ayush10 Ayush10 commented Jan 30, 2026

Summary

  • Add a `pretrain` parameter (default `True`) to the HIST model's `__init__` so users can skip the pretrained weight-loading step
  • When `pretrain=False`, the weight-loading block in `fit()` is skipped entirely, allowing training from scratch
  • Fully backward compatible: existing configs work without changes

Problem

The HIST model always executes the pretrained weight-loading logic during `fit()`, even when `model_path` is `None`. This creates unnecessary overhead (instantiating a base `LSTMModel`/`GRUModel` and transferring its weights) and makes it impossible for users to train the architecture from scratch without the pretrain initialization step.

Changes

  • `qlib/contrib/model/pytorch_hist.py`: added a `pretrain=True` parameter to `__init__`, stored it as `self.pretrain`, and wrapped the weight-loading block in `fit()` with `if self.pretrain:` (a simplified sketch follows)
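
For reference, here is a minimal sketch of the guarded logic. This is not the actual qlib code: the class name `HISTSketch`, the toy LSTM dimensions, and the parameter-copy loop are illustrative assumptions that only mirror the shape of the change.

```python
import torch
import torch.nn as nn


class HISTSketch:
    """Hypothetical stand-in for the HIST model in pytorch_hist.py."""

    def __init__(self, model_path=None, pretrain=True):
        self.model_path = model_path
        self.pretrain = pretrain  # new flag; True preserves the old behavior
        self.hist_encoder = nn.LSTM(input_size=6, hidden_size=64, num_layers=2)

    def fit(self):
        # The entire weight-loading block is now guarded by self.pretrain,
        # so pretrain=False trains from scratch with no base-model overhead.
        if self.pretrain:
            base = nn.LSTM(input_size=6, hidden_size=64, num_layers=2)
            if self.model_path is not None:
                base.load_state_dict(torch.load(self.model_path))
            # Transfer matching parameters into the HIST encoder.
            state = self.hist_encoder.state_dict()
            for name, param in base.state_dict().items():
                if name in state:
                    state[name] = param
            self.hist_encoder.load_state_dict(state)
        # ... the rest of the training loop is unchanged ...
```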

Usage

```yaml
# Existing behavior (unchanged): pretrained weights are loaded
model:
  class: HIST
  kwargs:
    pretrain: True  # default, can be omitted
    model_path: "path/to/pretrained_model.pkl"

# New: skip pretraining entirely and train from scratch
model:
  class: HIST
  kwargs:
    pretrain: False
```
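
For programmatic use, the same toggle can be passed through a model config dict. The sketch below assumes qlib's `init_instance_by_config` helper, with `module_path` taken from the file changed in this PR.

```python
from qlib.utils import init_instance_by_config

# Build a HIST model that trains from scratch (no pretrained-weight loading).
model = init_instance_by_config({
    "class": "HIST",
    "module_path": "qlib.contrib.model.pytorch_hist",
    "kwargs": {"pretrain": False},
})
```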

Closes #2074

@Ayush10 Ayush10 force-pushed the fix/issue-2074-hist-pretrain-toggle branch from db16085 to df9e3a5 on January 30, 2026 at 13:22


Development

Successfully merging this pull request may close these issues.

HIST model missing explicit pretrain toggle in initialization
