
Feature request: Turbo mode #22

Open
deepumukundan opened this issue Mar 1, 2016 · 4 comments

Comments

@deepumukundan

Currently gitup pulls the repos inside the given folder sequentially, one by one. What if there were a turbo mode that executed git pull on all the repos concurrently? The tty output could be suppressed in that case and a report generated instead (since there could theoretically be hundreds of repos being updated at the same time). Just a thought. Python is not my forte, so I can't create a PR with the suggested change :(.
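The idea above could be sketched roughly as follows. This is not gitup's actual implementation, just a minimal illustration of concurrent pulls with a bounded worker pool; the function name `pull_all` and its parameters are hypothetical. Output is captured rather than printed, so the results dict can drive a summary report afterwards.

```python
# Hypothetical sketch (not gitup's code): pull many repos concurrently
# with a bounded thread pool, collecting results for a report.
import subprocess
from concurrent.futures import ThreadPoolExecutor, as_completed

def pull_all(repo_dirs, max_workers=8, git_args=("pull", "--quiet")):
    """Run `git <git_args>` in every directory, at most max_workers at a time.

    Returns {directory: (returncode, combined stdout+stderr)}.
    """
    def pull_one(path):
        # git -C <path> runs the command inside that repo without chdir'ing.
        proc = subprocess.run(
            ["git", "-C", path, *git_args],
            capture_output=True, text=True)
        return path, proc.returncode, proc.stdout + proc.stderr

    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(pull_one, d) for d in repo_dirs]
        for fut in as_completed(futures):
            path, code, output = fut.result()
            results[path] = (code, output)
    return results
```

Threads (rather than processes) are enough here because the work is done by subprocesses; the `max_workers` cap also limits how many simultaneous connections hit the remote host.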

@earwig
Owner

earwig commented Mar 2, 2016

Good idea, though we want to be careful not to flood GitHub with 20 concurrent pulls; I don't know if that would bother them.

@deepumukundan
Author

I have seen this behaviour in GitHub Desktop, their own Git client, when pulling the Swift repos. An activity indicator spins next to each repo name in the left sidebar and stops once that repo's update is complete.

But I agree with you that the pulls might bother them. Maybe we should not pull everything at once, but instead run batches of pulls concurrently. Anyway, gitup works well for me as it is, so I can't really complain :)
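The batching idea could look something like this. Again a hypothetical sketch, not gitup's code: each batch of pulls runs concurrently, and the next batch does not start until the previous one finishes, so at most `batch_size` connections are ever open.

```python
# Hypothetical batching sketch: run fixed-size batches of concurrent
# pulls, one batch at a time, to bound load on the remote host.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def pull_in_batches(repo_dirs, batch_size=20, git_args=("pull", "--quiet")):
    """Pull repos in batches of batch_size; returns {directory: returncode}."""
    results = {}
    for start in range(0, len(repo_dirs), batch_size):
        batch = repo_dirs[start:start + batch_size]
        # The `with` block waits for every pull in this batch to finish
        # before the loop moves on to the next batch.
        with ThreadPoolExecutor(max_workers=len(batch)) as pool:
            codes = pool.map(
                lambda path: subprocess.run(
                    ["git", "-C", path, *git_args],
                    capture_output=True).returncode,
                batch)
            results.update(zip(batch, codes))
    return results
```

A plain worker pool with `max_workers=20` achieves a similar cap with better utilisation (a slow repo doesn't hold up its whole batch); explicit batches are mainly useful if you want a pause or a progress report between groups.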

@mcameron

I would prefer it to the following if it performed comparably:

for dir in */; do (cd "$dir" && git pull); done

Thank you for making it available.

On a clean tree of repos, the for loop:

    2.36 real  0.53 user  0.34 sys

The same tree with gitup:

    4.34 real  0.43 user  0.36 sys

On an out-of-date tree, gitup can take significantly longer for me (minutes).

@JedMeister

FWIW, prior to using your awesome git-repo-updater, I had a bash script which, via xargs, updated repos 20 at a time (from GitHub), and I never had any issues.

It made it incredibly fast compared to doing it sequentially. I tested with more than 20 concurrent connections, and while that did still increase the speed, the returns were diminishing, so 20 seemed like the sweet spot. YMMV though...

Regardless, I never had any complaints from GitHub. IIRC, at some point when I was testing larger numbers of concurrent connections, I started to get what appeared to be random failures. If memory serves, that was somewhere above 100 concurrent connections to GitHub.
