PatchTST-Hourly-Electricity-Demand-Brazil

This model is a fine-tuned version of the PatchTST model on the Brazilian Hourly Electricity Demand dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1926

Model description

PatchTST is a Transformer-based architecture optimized for univariate and multivariate time series forecasting. It introduces a patching mechanism (inspired by Vision Transformers) to capture local temporal patterns, enhancing both performance and efficiency for long input sequences.

This model was fine-tuned for the task of predicting hourly electricity demand in Brazil, specifically for the Northeast region, and demonstrates robust performance on the test set.
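
As a brief, non-authoritative sketch, the snippet below shows how a checkpoint like this one could be loaded and queried for forecasts through the Hugging Face transformers PatchTST API. The repository id matches this card's name; the dummy input series and the use of the config's context length are illustrative assumptions, not values documented here.

```python
import torch
from transformers import PatchTSTForPrediction

# Repository id taken from this model card's name.
checkpoint = "SamuelM0422/PatchTST-Hourly-Electricity-Demand-Brazil"
model = PatchTSTForPrediction.from_pretrained(checkpoint)
model.eval()

# PatchTST expects past_values of shape (batch, context_length, num_channels);
# both sizes are read from the checkpoint's config.
context_length = model.config.context_length
num_channels = model.config.num_input_channels

# Dummy tensor standing in for the most recent hours of demand (assumption).
past_values = torch.randn(1, context_length, num_channels)

with torch.no_grad():
    outputs = model(past_values=past_values)

# Point forecasts for the next prediction_length hours.
forecast = outputs.prediction_outputs  # (batch, prediction_length, num_channels)
print(forecast.shape)
```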

Intended uses & limitations

This model is best suited for:

  • Forecasting electricity demand at an hourly resolution.

Limitations:

  • The model was trained on historical demand data and may not generalize well to future patterns with unseen anomalies (e.g., pandemics, blackouts).
  • Exogenous variables (such as temperature, holidays, or economic activity) were not included in this training version.

Training and evaluation data

The model was trained using the Hourly-Electricity-Demand-Brazil dataset, which contains hourly energy demand data from 2015 to 2024.

  • Input: Historical demand time series in hourly resolution.
  • Target: Future electricity demand, predicted for a defined forecasting window.
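
As an illustrative sketch of the input/target setup described above (not the card author's preprocessing code), one way to slice an hourly demand series into (past, future) windows is shown below. The column name, the 168-hour context length, and the 24-hour forecast horizon are assumptions chosen for the example.

```python
import numpy as np
import pandas as pd

def make_windows(series: pd.Series, context_length: int = 168, prediction_length: int = 24):
    """Slide over an hourly series and emit (past, future) pairs.

    context_length and prediction_length are illustrative choices
    (one week of history, one day ahead), not values from this card.
    """
    values = series.to_numpy(dtype=np.float32)
    past, future = [], []
    for start in range(len(values) - context_length - prediction_length + 1):
        past.append(values[start : start + context_length])
        future.append(values[start + context_length : start + context_length + prediction_length])
    return np.stack(past), np.stack(future)

# Hypothetical usage with a demand column named "demand_mwh".
# df = pd.read_csv("hourly_demand_northeast.csv", parse_dates=["timestamp"])
# X, y = make_windows(df["demand_mwh"])
```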

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
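
A minimal sketch of how the hyperparameters listed above map onto a transformers TrainingArguments / Trainer setup follows. The output directory, the dataset objects, and the per-epoch evaluation strategy are placeholders and assumptions, not details taken from this card.

```python
from transformers import Trainer, TrainingArguments

# Hyperparameters copied from the list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="patchtst-hourly-electricity-demand-brazil",
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    optim="adamw_torch",          # AdamW with default betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="epoch",        # assumption: validate once per epoch, as in the results table
)

# trainer = Trainer(
#     model=model,                # a PatchTSTForPrediction instance
#     args=training_args,
#     train_dataset=train_dataset,
#     eval_dataset=eval_dataset,
# )
# trainer.train()
```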

Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.4183        | 1.0   | 249   | 0.4237          |
| 0.2361        | 2.0   | 498   | 0.3208          |
| 0.1963        | 3.0   | 747   | 0.2892          |
| 0.1809        | 4.0   | 996   | 0.2761          |
| 0.1714        | 5.0   | 1245  | 0.2682          |
| 0.1637        | 6.0   | 1494  | 0.2611          |
| 0.1564        | 7.0   | 1743  | 0.2533          |
| 0.1508        | 8.0   | 1992  | 0.2464          |
| 0.1461        | 9.0   | 2241  | 0.2444          |
| 0.1421        | 10.0  | 2490  | 0.2392          |
| 0.1387        | 11.0  | 2739  | 0.2367          |
| 0.1361        | 12.0  | 2988  | 0.2364          |
| 0.1335        | 13.0  | 3237  | 0.2312          |
| 0.1315        | 14.0  | 3486  | 0.2310          |
| 0.1296        | 15.0  | 3735  | 0.2310          |
| 0.1282        | 16.0  | 3984  | 0.2295          |
| 0.1266        | 17.0  | 4233  | 0.2277          |
| 0.1256        | 18.0  | 4482  | 0.2255          |
| 0.1247        | 19.0  | 4731  | 0.2268          |
| 0.1239        | 20.0  | 4980  | 0.2285          |
| 0.1232        | 21.0  | 5229  | 0.2252          |
| 0.1223        | 22.0  | 5478  | 0.2260          |
| 0.1215        | 23.0  | 5727  | 0.2225          |
| 0.121         | 24.0  | 5976  | 0.2232          |
| 0.1204        | 25.0  | 6225  | 0.2253          |
| 0.1202        | 26.0  | 6474  | 0.2268          |
| 0.1195        | 27.0  | 6723  | 0.2243          |
| 0.119         | 28.0  | 6972  | 0.2205          |
| 0.1186        | 29.0  | 7221  | 0.2203          |
| 0.118         | 30.0  | 7470  | 0.2228          |
| 0.1174        | 31.0  | 7719  | 0.2235          |
| 0.1171        | 32.0  | 7968  | 0.2217          |
| 0.1167        | 33.0  | 8217  | 0.2193          |
| 0.1162        | 34.0  | 8466  | 0.2221          |
| 0.1157        | 35.0  | 8715  | 0.2223          |
| 0.1155        | 36.0  | 8964  | 0.2195          |
| 0.115         | 37.0  | 9213  | 0.2183          |
| 0.1146        | 38.0  | 9462  | 0.2230          |
| 0.1142        | 39.0  | 9711  | 0.2242          |
| 0.1141        | 40.0  | 9960  | 0.2214          |
| 0.1138        | 41.0  | 10209 | 0.2235          |
| 0.1134        | 42.0  | 10458 | 0.2227          |
| 0.113         | 43.0  | 10707 | 0.2201          |
| 0.1128        | 44.0  | 10956 | 0.2256          |
| 0.1126        | 45.0  | 11205 | 0.2197          |
| 0.112         | 47.0  | 11703 | 0.2218          |
| 0.1123        | 46.0  | 11454 | 0.2250          |

Framework versions

  • Transformers 4.51.1
  • Pytorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1