Library/bert (Forked)

google-research/bert

bert

TensorFlow code and pre-trained models for BERT

Builder

google-research • organization

Stars

39,950

Using upstream star count

Forks

9,700

Using upstream fork count

Open Issues

0

Activity Score

0/100

0 commits in 30d

Created

Oct 25, 2018

Project creation date

README Summary

BERT (Bidirectional Encoder Representations from Transformers) is a method of pre-training language representations that obtains state-of-the-art results on a wide array of Natural Language Processing tasks. This repository contains the TensorFlow implementation and pre-trained models, which can be fine-tuned for various downstream tasks. The models are trained on large amounts of text and incorporate context from both the left and the right of each token in a sentence.
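The fine-tuning workflow starts with the repo's WordPiece tokenizer. A minimal sketch, assuming the repository is on PYTHONPATH and the uncased_L-12_H-768_A-12 checkpoint (which bundles vocab.txt) has been unzipped into the working directory; `tokenization.FullTokenizer` is the class this repo ships:

# Minimal sketch: WordPiece tokenization with this repo's `tokenization` module.
# Path to vocab.txt is an assumption about where the checkpoint was unzipped.
import tokenization

tokenizer = tokenization.FullTokenizer(
    vocab_file="uncased_L-12_H-768_A-12/vocab.txt", do_lower_case=True)

tokens = tokenizer.tokenize("BERT splits rare words into subword units.")
# Out-of-vocabulary words become subword pieces, e.g. "subword" -> "sub", "##word".
ids = tokenizer.convert_tokens_to_ids(tokens)
print(tokens, ids)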

AI Dev Skills

Unmapped

Transformer Architecture · Bidirectional Language Modeling · Pre-training and Fine-tuning · Natural Language Understanding · Transfer Learning · Attention Mechanisms · Masked Language Modeling · Next Sentence Prediction

Tags

Transformer Architecture · Bidirectional Language Modeling · Pre-training and Fine-tuning · Natural Language Understanding · Transfer Learning · Attention Mechanisms · Masked Language Modeling · Next Sentence Prediction · Named Entity Recognition · Self-hosted · Language Understanding · Cloud API · Sentiment Analysis · Text Classification · Text Similarity · Token Classification · Question Answering · Foundation Models · Text · On-premise · Document Classification · Pre-trained Language Models · Python
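Two of these tags name BERT's pre-training objectives. As a toy illustration of Masked Language Modeling, here is a sketch of the 80/10/10 corruption rule from the BERT paper (not code from this repository):

# Toy sketch of BERT's masked LM input corruption: ~15% of tokens are chosen;
# of those, 80% become [MASK], 10% a random token, 10% stay unchanged.
# The model is trained to predict the original token at each chosen position.
import random

def mask_tokens(tokens, vocab, rng=random.Random(0), mask_prob=0.15):
    masked = list(tokens)
    targets = {}  # position -> original token the model must predict
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue
        targets[i] = tok
        r = rng.random()
        if r < 0.8:
            masked[i] = "[MASK]"           # 80%: replace with the mask token
        elif r < 0.9:
            masked[i] = rng.choice(vocab)  # 10%: replace with a random token
        # remaining 10%: keep the original token unchanged
    return masked, targets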

Recent Activity

Updated 1 year ago

7 Days

0

30 Days

0

90 Days

0

Quality

Quality: high
Maturity: production

Categories

NLP & Text (Primary) · Search & Knowledge · Other AI / ML · Foundation Models · Model Training

PM Skills

Scale & Reliability

Languages

Python 100.0%

Timeline

Project created
Oct 25, 2018
Forked
Mar 22, 2026
Your last push
1 year ago
Upstream last push
1 year ago
Tracked since
Jul 23, 2024
