BERT-STEM
A BERT model fine-tuned on Spanish Science, Technology, Engineering and Mathematics (STEM) lessons.
Install
To install from PyPI with pip:
pip install bertstem
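
After installation, the package should be importable. A quick sanity check (the module path matches the imports used in the examples below):

# Should import without errors after installation
from BERT_STEM.BertSTEM import BertSTEM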
Quickstart
To encode sentences:
from BERT_STEM.BertSTEM import *
import pandas as pd

# Load the pretrained BERT-STEM encoder
bert = BertSTEM()

# Example dataframe with Spanish sentences to encode
data = {'col_1': [3, 2, 1],
        'col_2': ['hola como estan', 'alumnos queridos', 'vamos a hablar de matematicas']}
df = pd.DataFrame.from_dict(data)

# Encode the sentences in 'col_2' using sum pooling
bert._encode_df(df, column='col_2', encoding='sum')
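
The encoded dataframe can then be inspected or saved with the usual pandas tools (a hypothetical follow-up; the exact output columns depend on the library version):

# Inspect the dataframe after encoding
print(df.head())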
To classify sentences with COPUS (Classroom Observation Protocol for Undergraduate STEM) models:
from BERT_STEM.BertSTEM import *
import pandas as pd

# Load the COPUS 'guiding' classification model
bert_classification = BertSTEMForTextClassification(2, model_name='pablouribe/bertstem-copus-guiding')

data = {'col_1': [3, 2, 1],
        'col_2': ['hola como estan', 'alumnos queridos', 'vamos a hablar de matematicas']}
df = pd.DataFrame.from_dict(data)

# Predict a COPUS label for each sentence in 'col_2'
bert_classification.predict(df, 'col_2')
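
The COPUS checkpoints are also hosted on the HuggingFace Hub, so, assuming they expose a standard sequence-classification head, they can be loaded directly with transformers. A minimal sketch (an alternative illustration, not the documented API of this package):

import transformers

# Hypothetical direct use of the COPUS checkpoint through the transformers pipeline,
# pairing it with the Spanish tokenizer used elsewhere in this README
classifier = transformers.pipeline(
    "text-classification",
    model="pablouribe/bertstem-copus-guiding",
    tokenizer="dccuchile/bert-base-spanish-wwm-uncased")

print(classifier("vamos a hablar de matematicas"))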
To use it directly from HuggingFace Transformers:
from BERT_STEM.Encode import *
import pandas as pd
import transformers

# Load the BERT-STEM model from the HuggingFace Hub
model = transformers.BertModel.from_pretrained("pablouribe/bertstem")

# Spanish uncased tokenizer used with BERT-STEM
tokenizer = transformers.BertTokenizerFast.from_pretrained("dccuchile/bert-base-spanish-wwm-uncased",
                                                           do_lower_case=True,
                                                           add_special_tokens=False)

data = {'col_1': [3, 2, 1],
        'col_2': ['hola como estan', 'alumnos queridos', 'vamos a hablar de matematicas']}
df = pd.DataFrame.from_dict(data)

# Encode the sentences in 'col_2' with sum pooling
sentence_encoder(df, model, tokenizer, column='col_2', encoding='sum')
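
For reference, the 'sum' encoding conceptually corresponds to pooling a sentence's token embeddings by summation. A minimal sketch of that idea with raw transformers calls, reusing the model and tokenizer loaded above (an illustration of the pooling, not necessarily the exact implementation of sentence_encoder):

import torch

# Tokenize one sentence and run it through the model
inputs = tokenizer("vamos a hablar de matematicas", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Sum the token embeddings of the last hidden layer -> one vector per sentence
sentence_vector = outputs.last_hidden_state.sum(dim=1)
print(sentence_vector.shape)  # (1, hidden_size), e.g. (1, 768)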