[Model] support VoRA model #616

Merged
merged 1 commit on Apr 6, 2025

Conversation

@sty-yyj (Contributor) commented Apr 4, 2025

This PR introduces support for VoRA, a novel paradigm for transforming an LLM into an MLLM.
Below is a screenshot of the evaluation result on the realworldqa task:
[screenshot: realworldqa evaluation result]

@sty-yyj sty-yyj marked this pull request as draft April 4, 2025 15:40
@sty-yyj sty-yyj marked this pull request as ready for review April 4, 2025 15:41
@Luodian Luodian requested a review from Copilot April 6, 2025 12:01
Copilot AI left a comment

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

Comments suppressed due to low confidence (1)

lmms_eval/models/vora.py:155

  • The attribute 'self.task_dict' is used but not defined in the class; please ensure it is properly initialized or passed.
visuals = [doc_to_visual[0](self.task_dict[task][split][ids]) for ids in doc_id]
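A minimal sketch of the fix this comment asks for: initialize `task_dict` in the constructor (or accept it via a setter) so lookups like `self.task_dict[task][split][doc_id]` cannot raise `AttributeError`. The class and method names below are illustrative, not taken from the PR:

```python
from collections import defaultdict


class VoRAModelSketch:
    """Illustrative sketch; the real lmms-eval model class does much more setup."""

    def __init__(self):
        # Define task_dict up front so it always exists when generate_until runs.
        self.task_dict = defaultdict(dict)

    def set_task_dict(self, task_dict):
        # Alternatively, the evaluator can inject the mapping after construction.
        self.task_dict = task_dict
```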


@property
def max_length(self):
    return self._max_length
Copilot AI Apr 6, 2025

The property 'max_length' returns 'self._max_length', which is not defined anywhere in the class; either initialize it or remove the property.
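The suggestion could be addressed by giving the property a backing attribute set in `__init__`. The sketch below is an assumption about how the fix might look; the class name and the 2048 default are illustrative, not values from the PR:

```python
class VoRAMaxLengthSketch:
    """Sketch: back the max_length property with an attribute set in __init__."""

    DEFAULT_MAX_LENGTH = 2048  # assumed default, not taken from the PR

    def __init__(self, max_length=None):
        # Initialize the backing attribute so the property never dangles.
        self._max_length = max_length if max_length is not None else self.DEFAULT_MAX_LENGTH

    @property
    def max_length(self):
        return self._max_length
```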

for text_outputs, context in zip(answers, contexts):
    res.append(text_outputs)
    self.cache_hook.add_partial("generate_until", (context, gen_kwargs), text_outputs)
Copilot AI Apr 6, 2025

The attribute 'self.cache_hook' is used without being initialized; please define 'cache_hook' before using it.
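One way to satisfy this comment is to default `cache_hook` to a no-op object and let the harness replace it later via a setter, so `add_partial` is always safe to call. All names below are illustrative assumptions, not the PR's actual implementation:

```python
class _NoOpCacheHook:
    """Fallback hook: accepts add_partial calls and caches nothing."""

    def add_partial(self, attr, request, result):
        pass  # intentionally a no-op


class VoRACacheHookSketch:
    def __init__(self):
        # Default to a no-op hook so generate_until never hits AttributeError.
        self.cache_hook = _NoOpCacheHook()

    def set_cache_hook(self, cache_hook):
        # The evaluation harness can swap in a real caching hook here.
        self.cache_hook = cache_hook
```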

@Luodian Luodian merged commit 8b171bb into EvolvingLMMs-Lab:main Apr 6, 2025
1 check failed
dadwadw233 pushed a commit to dadwadw233/lmms-eval that referenced this pull request Apr 28, 2025