
README - long-t5-tglobal-base-16384-booksum-V11-big_patent-V2

  • this README was added because there wasn't one
  • created 2022-07-31_12-14-50

about

An experiment in transfer learning: fine-tuning pszemraj/long-t5-tglobal-base-16384-book-summary on the big_patent dataset (hosted on modeldatabase) to evaluate how well the model can learn to summarize technical documentation.

This checkpoint was trained on subsection y of big_patent for approximately 400 steps with an effective batch size of 128.
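As a rough illustration of that setup (not the actual training script; the base checkpoint and the big_patent subsection come from this card, everything else is an assumption), loading the pieces with datasets and transformers would look roughly like this:

```python
# illustrative sketch only - hyperparameters beyond those stated in the card are assumptions
from datasets import load_dataset
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# base checkpoint used as the starting point for transfer learning
base = "pszemraj/long-t5-tglobal-base-16384-book-summary"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSeq2SeqLM.from_pretrained(base)

# subsection "y" of big_patent, as referenced above
dataset = load_dataset("big_patent", "y")

# an effective batch size of 128 could be reached with gradient accumulation,
# e.g. per-device batch size 4 x 32 accumulation steps (assumed split, not confirmed)
```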

model details

  • task: summarization
  • format: Safetensors
  • model size: 248M params
  • tensor type: F32
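
A minimal inference sketch (not part of the original card; generation parameters are illustrative assumptions):

```python
# load the fine-tuned checkpoint with the transformers summarization pipeline
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2",
)

text = "..."  # a long technical document, e.g. a patent description
# max_length / no_repeat_ngram_size are example values, not tuned settings
print(summarizer(text, max_length=256, no_repeat_ngram_size=3)[0]["summary_text"])
```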
