feat: fetch use concurrency #22
Conversation
Don't merge this PR until I have finished the second task.
Codecov Report
@@           Coverage Diff           @@
##           master      #22   +/-  ##
=======================================
  Coverage   75.00%   75.00%
=======================================
  Files           2        2
  Lines          20       20
=======================================
  Hits           15       15
  Misses          4        4
  Partials        1        1
=======================================
Flags with carried forward coverage won't be shown.
LGTM
@zaunist cc.
I decided to use an unbuffered channel, because I think sending and receiving should be separated; that is why I don't use a buffered channel here. You can view the code: we limit the number of goroutines in the queue.
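As a minimal sketch of what this comment describes (not the exact code in this PR), the Go snippet below sends results over an unbuffered channel while a separate semaphore channel limits how many goroutines run at once. The fetchRegistry function and the registry names are hypothetical placeholders.

package main

import (
	"fmt"
	"sync"
	"time"
)

// fetchRegistry is a hypothetical stand-in for the real fetch logic.
func fetchRegistry(name string) string {
	time.Sleep(100 * time.Millisecond) // simulate network latency
	return name + ": ok"
}

func main() {
	registries := []string{"npm", "yarn", "taobao", "tencent"}

	results := make(chan string)    // unbuffered: each send blocks until a receive is ready
	limit := make(chan struct{}, 2) // semaphore: at most 2 fetches in flight

	var wg sync.WaitGroup
	for _, r := range registries {
		wg.Add(1)
		go func(name string) {
			defer wg.Done()
			limit <- struct{}{}        // acquire a slot
			defer func() { <-limit }() // release the slot when done
			results <- fetchRegistry(name)
		}(r)
	}

	// Close the results channel once every worker has finished.
	go func() {
		wg.Wait()
		close(results)
	}()

	for res := range results {
		fmt.Println(res)
	}
}

Because the results channel is unbuffered, each send blocks until the main goroutine receives it, which keeps sending and receiving separated as described in the comment above.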
/LGTM
Background
Description
Others
In the past, we only used one goroutine in the main process. But when we run the grm test command, we find the speed is not ideal, so we should speed up the program with goroutines. In my opinion, I should write a function that works like JavaScript's Promise.all to do this work. Why Promise.all? Because we should guarantee the order of the output.
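Since the implementation is not shown in this description, the following is a rough Go analogue of a Promise.all-style helper; the all function and its task signature are assumptions for illustration. Each task runs in its own goroutine, but every result is written back by index, so the output keeps the input order even though tasks finish at different times.

package main

import (
	"fmt"
	"sync"
)

// all runs every task concurrently and returns their results in input order.
func all(tasks []func() string) []string {
	results := make([]string, len(tasks))
	var wg sync.WaitGroup
	for i, task := range tasks {
		wg.Add(1)
		go func(i int, task func() string) {
			defer wg.Done()
			results[i] = task() // write by index, so the order is preserved
		}(i, task)
	}
	wg.Wait()
	return results
}

func main() {
	tasks := []func() string{
		func() string { return "npm: 120ms" },
		func() string { return "taobao: 80ms" },
	}
	for _, r := range all(tasks) {
		fmt.Println(r)
	}
}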