Update to bblfsh 2.9, use -drivers image #318


Merged — 1 commit, Oct 11, 2018

Conversation

carlosms (Contributor)

Use the bblfsh -drivers image in docker-compose, and update the Travis bblfsh version to 2.9.

Signed-off-by: Carlos Martín <carlos.martin.sanchez@gmail.com>
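The change described above can be sketched roughly as follows. This is a hypothetical docker-compose fragment, not the PR's actual diff: the service name, port, and exact image tag are assumptions.

```yaml
# Hypothetical docker-compose fragment illustrating the switch to the
# -drivers image variant. The tag v2.9.1-drivers is an assumption.
version: '3'
services:
  bblfshd:
    # The -drivers variant bundles the language drivers, so no separate
    # driver-install step is needed at startup.
    image: bblfsh/bblfshd:v2.9.1-drivers
    # bblfshd needs extended privileges to run driver containers.
    privileged: true
    ports:
      - "9432:9432"
```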
@carlosms carlosms requested review from smola and smacker October 10, 2018 16:31
smacker (Contributor)

smacker commented Oct 10, 2018

Why don't we use the same -drivers image in Travis?

carlosms (Contributor, Author)

To avoid making the build slower; if I'm not mistaken, Travis won't cache the Docker image.
The -drivers image is 480 MB vs. 76 MB for the normal release. The go-driver image is another 14 MB, but that one should be cached.

smacker (Contributor)

smacker commented Oct 10, 2018

Travis downloads the cache too :-D Though 480 MB is a lot. Thanks for the clarification!

carlosms (Contributor, Author)

🤔 Can you point me to where I can read about Travis caching Docker images? Everything I could find requires some workaround (like the ones shared in travis-ci/travis-ci#5358).

smacker (Contributor)

smacker commented Oct 10, 2018

I mean that when you use the Travis cache, it gets uploaded to cloud storage and downloaded again on every build. So there is not much difference between downloading from Docker Hub and using the Travis cache: either way it's downloaded from Amazon S3 / Google Cloud Storage.
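The alternative kept by this PR, pulling the smaller release image in CI and installing only the needed driver, could look roughly like the sketch below. This is a hypothetical .travis.yml fragment: the image tag and the exact `bblfshctl driver install` arguments are assumptions, not taken from the repository.

```yaml
# Hypothetical .travis.yml fragment. Pulls the plain release (76 MB)
# instead of the -drivers bundle (480 MB); tag names are assumptions.
services:
  - docker
before_install:
  # Start bblfshd; it needs --privileged to manage driver containers.
  - docker run -d --name bblfshd --privileged -p 9432:9432 bblfsh/bblfshd:v2.9.1
  # Install only the Go driver (~14 MB) instead of shipping all drivers.
  - docker exec bblfshd bblfshctl driver install go docker://bblfsh/go-driver:latest
```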

carlosms (Contributor, Author)

I see what you mean now 👍

@carlosms carlosms merged commit df986d3 into src-d:master Oct 11, 2018
3 participants