I noticed that intel-extension-for-transformers exists, yet intel-extension-for-pytorch also supports deploying LLMs. What is the difference between the two, and which one is better suited for deploying Llama 2 70B?