server: fix warning message [easy] #11839


Merged (1 commit, Feb 13, 2025)

Conversation

okuvshynov (Contributor)

There was a typo-like error that would print the same number twice when a request was received with n_predict greater than the server-side configuration.

Before the fix:
```
slot launch_slot_: id  0 | task 0 | n_predict = 4096 exceeds server configuration, setting to 4096
```

After the fix:
```
slot launch_slot_: id  0 | task 0 | n_predict = 8192 exceeds server configuration, setting to 4096
```

@okuvshynov (Contributor, Author)

CI:
```
tmp/results % cat * | grep failed
100% tests passed, 0 tests failed out of 29
100% tests passed, 0 tests failed out of 29
100% tests passed, 0 tests failed out of 2
100% tests passed, 0 tests failed out of 2
100% tests passed, 0 tests failed out of 29
100% tests passed, 0 tests failed out of 29
100% tests passed, 0 tests failed out of 29
100% tests passed, 0 tests failed out of 29
100% tests passed, 0 tests failed out of 2
100% tests passed, 0 tests failed out of 2
100% tests passed, 0 tests failed out of 2
100% tests passed, 0 tests failed out of 2
```

@ggerganov ggerganov merged commit e437627 into ggml-org:master Feb 13, 2025
1 check passed
tinglou pushed a commit to tinglou/llama.cpp that referenced this pull request Feb 13, 2025
orca-zhang pushed a commit to orca-zhang/llama.cpp that referenced this pull request Feb 26, 2025
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Feb 26, 2025
mglambda pushed a commit to mglambda/llama.cpp that referenced this pull request Mar 8, 2025