012shin/qlora-koalpaca-polyglot-12.8b-50step

Source: Hugging Face Hub
Version: 594b26a
Monthly downloads: 0
Contributors: 1

qlora-koalpaca-polyglot-12.8b-50step

This model is a fine-tuned version of beomi/KoAlpaca-Polyglot-12.8B; the fine-tuning dataset is not specified in the model card.

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0002
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.03
  • training_steps: 50
  • mixed_precision_training: Native AMP
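The effective batch size and learning-rate curve implied by these hyperparameters can be sketched in plain Python. This is a minimal, dependency-free approximation of the `cosine` scheduler with linear warmup as configured above (50 steps, warmup ratio 0.03, peak LR 2e-4); the library's exact warmup-step rounding may differ slightly.

```python
import math

# Hyperparameters taken from the model card
learning_rate = 2e-4
train_batch_size = 4
gradient_accumulation_steps = 2
training_steps = 50
warmup_ratio = 0.03

# Effective (total) train batch size: per-device batch size times
# gradient accumulation steps (single device assumed here).
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 8

def lr_at_step(step: int) -> float:
    """Approximate cosine schedule with linear warmup for these settings."""
    warmup_steps = int(training_steps * warmup_ratio)  # 1 step here
    if step < warmup_steps:
        # Linear warmup from 0 up to the peak learning rate
        return learning_rate * step / max(1, warmup_steps)
    # Cosine decay from the peak down to ~0 at the final step
    progress = (step - warmup_steps) / max(1, training_steps - warmup_steps)
    return learning_rate * 0.5 * (1.0 + math.cos(math.pi * progress))
```

The learning rate peaks at 2e-4 right after the one warmup step and decays to roughly zero by step 50, which is why such short QLoRA runs rely almost entirely on the early steps for adaptation.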

Training results

No training or evaluation results were reported.
Framework versions

  • PEFT 0.11.2.dev0
  • Transformers 4.42.0.dev0
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.2
  • Tokenizers 0.19.1

Package last updated on 03 Jun 2024
