Hi, this is more like a question than an issue.
How could one do a pretraining run without a downstream task?
What I want to achieve is to pretrain a model (perhaps using both contrastive losses) and then reuse that model for various downstream tasks later. A rough sketch of the workflow I have in mind is below.
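To make the question concrete, here is a minimal sketch of the kind of loop I'm picturing, written in plain PyTorch with a placeholder encoder, toy unlabeled data, and an NT-Xent-style contrastive loss; none of the names come from this repo, they are just assumptions for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Toy encoder standing in for whatever backbone the library provides."""
    def __init__(self, in_dim=32, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim)
        )

    def forward(self, x):
        return self.net(x)

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent (SimCLR-style) contrastive loss over two augmented views."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                 # (2N, D)
    sim = z @ z.t() / temperature                  # pairwise cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))          # exclude self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

encoder = Encoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

# Pretraining over unlabeled data only; no downstream head is attached.
for epoch in range(10):
    for x in torch.randn(50, 16, 32).unbind(0):    # placeholder unlabeled batches
        view1 = x + 0.1 * torch.randn_like(x)      # two cheap "augmentations"
        view2 = x + 0.1 * torch.randn_like(x)
        loss = nt_xent(encoder(view1), encoder(view2))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Save only the pretrained encoder; each downstream task would later load
# these weights and attach its own task-specific head.
torch.save(encoder.state_dict(), "pretrained_encoder.pt")
```

Is something along these lines supported, or is a downstream task always required?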
Another question: when pretraining, should we pretrain on the combined train, validation, and test sets, or should we limit it to the training set only?