
Add support for properly interpreting context.selectedCompletionInfo #66

Closed
spew opened this issue Jan 26, 2024 · 3 comments


spew commented Jan 26, 2024

When vscode shows a popup completion item (i.e. what used to be called IntelliSense: a language symbol or function that vscode knows about), any inline completion is supposed to begin with that completion item's text. That is to say, the selected completion item's text should be appended to the end of the prefix. Take the following Python example:

file_path = '/tmp/my-file'
with open(file_path, "r") as handle:
    # imagine the developer is in the middle of typing the period below
    obj = json.
   if obj.myField:
       print('my field is present')

So imagine the developer is typing the . in the line obj = json.; vscode will pop up possible completions for json, and the method loads will likely be the top suggestion. The prefix sent to the LLM should therefore use a value of obj = json.loads for that line. The suffix that comes after the cursor should also be included as normal.

The range returned on the vscode.InlineCompletionItem should be adjusted accordingly as well, although that part is probably out of scope for this project.
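A minimal sketch of the prefix adjustment described above. Note the SelectedCompletionInfo interface here is a simplified stand-in (character offsets instead of vscode.Position objects, and hypothetical field names rangeStart/rangeEnd); the real vscode API exposes this via context.selectedCompletionInfo with a Range and text:

```typescript
// Simplified stand-in for vscode's InlineCompletionContext.selectedCompletionInfo.
interface SelectedCompletionInfo {
  rangeStart: number; // offset where the popup item's replace range begins
  rangeEnd: number;   // offset where it ends (usually the cursor position)
  text: string;       // full text of the highlighted popup item, e.g. "loads"
}

// Build the prefix to send to the LLM: take the document up to the start of
// the popup item's replace range, then append the selected item's text, as
// if the user had already accepted the popup suggestion.
function buildPrefix(
  doc: string,
  cursor: number,
  info?: SelectedCompletionInfo
): string {
  if (!info) {
    // No popup showing: the prefix is simply everything before the cursor.
    return doc.slice(0, cursor);
  }
  return doc.slice(0, info.rangeStart) + info.text;
}

// With the cursor right after the "." in "obj = json." and "loads"
// highlighted in the popup, the prefix becomes "obj = json.loads".
const prefix = buildPrefix("obj = json.", 11, {
  rangeStart: 11,
  rangeEnd: 11,
  text: "loads",
});
```

Any inline completion the extension then returns would need to start with "loads" (or have its range cover the popup item's range) for vscode to display it.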

@McPatate
Member

Hi @spew, you should open an issue in https://github.com/huggingface/llm-vscode; we'll deal with it there.

Thanks for reporting!


spew commented Jan 29, 2024

I don't think llm-vscode can properly handle it without some help from this project.


spew commented Jan 29, 2024

Created huggingface/llm-vscode#127
