
Commit 0fa1b0b

danbev authored and mglambda committed
server : use common_token_to_piece instead of common_detokenize (ggml-org#11740)
* server : use common_token_to_piece instead of common_detokenize

  This commit replaces the call to common_detokenize with
  common_token_to_piece in the populate_token_probs function.

  The motivation for this change is to avoid an issue where
  common_detokenize would remove the word boundary character for tokens,
  which caused a regression in the server generated token probabilities.

  Resolves: ggml-org#11728

* squash! server : use common_token_to_piece instead of common_detokenize

  Use common_token_to_piece for post_sampling_probs as well.
1 parent 52c8287 commit 0fa1b0b

File tree: 1 file changed (+2 -2 lines)


examples/server/server.cpp

+2 -2

@@ -2279,7 +2279,7 @@ struct server_context {
         for (size_t i = 0; i < std::min(max_probs, n_probs); i++) {
             result.probs.push_back({
                 cur_p->data[i].id,
-                common_detokenize(ctx, {cur_p->data[i].id}, special),
+                common_token_to_piece(ctx, cur_p->data[i].id, special),
                 cur_p->data[i].p
             });
         }
@@ -2301,7 +2301,7 @@ struct server_context {
         for (size_t i = 0; i < std::min(n_vocab, n_probs); i++) {
             result.probs.push_back({
                 cur[i].id,
-                common_detokenize(ctx, {cur[i].id}, special),
+                common_token_to_piece(ctx, cur[i].id, special),
                 cur[i].p
             });
         }
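For context, the sketch below (not part of the commit) illustrates the behavioral difference the commit message describes: detokenizing a single-token sequence can drop the token's word-boundary prefix, while converting the token id directly to its piece preserves it. The helper name show_single_token_piece, the printf logging, and the example token text are illustrative assumptions; common_detokenize and common_token_to_piece are called with the same argument forms as in the diff above.

// Illustrative sketch only, not part of the commit. Assumes llama.cpp's common
// library ("common.h") and an already-initialized llama_context from a model
// whose tokenizer marks word boundaries (e.g. an SPM-style "▁" prefix).
#include "common.h"

#include <cstdio>
#include <string>
#include <vector>

static void show_single_token_piece(llama_context * ctx, llama_token tok, bool special) {
    // Detokenizing a one-element sequence treats it as a complete text
    // sequence, so the leading word-boundary character can be stripped,
    // e.g. "▁world" may come back as "world".
    const std::string via_detokenize = common_detokenize(ctx, {tok}, special);

    // Converting the token id to its piece keeps the boundary character,
    // e.g. "▁world" becomes " world", which is what the per-token
    // probability entries in server.cpp are expected to contain.
    const std::string via_piece = common_token_to_piece(ctx, tok, special);

    printf("detokenize: '%s' | token_to_piece: '%s'\n",
           via_detokenize.c_str(), via_piece.c_str());
}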
