Pull Request Overview
This PR simplifies LlamaDiskCache by removing custom LRU management and relying on the built-in LRU behavior of the underlying diskcache library.
- Replaced `pop`/`push` in `__getitem__` with a simple `get`
- Removed the manual eviction loop in `__setitem__` and added a `close()` call
- Eliminated redundant deletion logic in `__setitem__`
Comments suppressed due to low confidence (2)
llama_cpp/llama_cache.py:142
- [nitpick] This debug `print` to stderr may clutter logs in production. Consider removing it or replacing it with a proper logger call at an appropriate log level.
print("LlamaDiskCache.__setitem__: called", file=sys.stderr)
llama_cpp/llama_cache.py:144
- With the removal of manual eviction logic, add or update tests to verify that the diskcache's default LRU eviction behaves as expected under capacity pressure.
self.cache[key] = value
key_to_remove = next(iter(self.cache))
del self.cache[key_to_remove]
print("LlamaDiskCache.__setitem__: trim", file=sys.stderr)
self.cache.close()
Calling close() on the cache on every __setitem__ invocation will close the underlying DB and prevent further operations (and degrade performance). Move close() to a teardown or destructor method instead of inside the setter.
Suggested change (remove this line):
self.cache.close()
The LlamaDiskCache tried to implement LRU logic, but did not succeed: `__getitem__` did a `pop()` from the cache but never pushed the entry back (that code was commented out). As a result, repeatedly executing the same prompt produced an alternating miss-hit-miss-hit pattern.
`__setitem__` tried to reorder entries, which makes no sense when the cache only ever holds one element or none.
The solution is as simple as it gets: the underlying diskcache library already implements LRU behavior by default, so LlamaDiskCache does not need to do anything beyond a plain `get` and `set`.