Fixed: limit on number of packages + add rworkflows
#67
base: master
Conversation
Site is back up now.
@gaborcsardi have you (or any other maintainers of this package) […] Thanks,
Thanks for the PR, but I am not going to merge this, sorry. I prefer using our workflows instead of rworkflows. As for the parallelization, I prefer not to use parallel for HTTP.
Ok, I suppose that's up to you, but would you mind providing some justification for the benefit of the users who requested some of these features? e.g. #56
It seems your workflow has been failing for several years now, unless I'm missing something. If you were to fix this, I think it would be a decent alternative.
Is there a technical reason for this? I'd be curious to know myself! Similarly, is there an alternative implementation that would suit this purpose better? If you do not wish to support parallelization at all, you have some other options available to you (in order of preference): […]
Happy to adjust the PR as needed; just let me know what you'd like to accomplish with this package.
I prefer using […]
Yes, several. Using multiple processes for concurrent HTTP is an anti-pattern. However, in general, I am not sure if I want to provide a quick way for people to "rank" all CRAN packages according to downloads. I don't really want people to think that this is an indication of package quality.
Haven't come across this one! I'll check it out though. As you may already be aware, there's also […]. Even so, if you're no longer using Travis, perhaps it's time to retire it from here?
This is great information to have! Thanks for the explanation (I've added a quick single-process sketch at the end of this comment, for anyone curious).
Hm, I hadn't thought of it that way. I personally don't agree with this reasoning, as I think it's best to give users the complete data and let them decide how to use it responsibly (perhaps with a disclaimer in the documentation). For example, in my case I'm using the downloads to assess package usage, but I wouldn't think to use this as a metric of quality (I'm not sure how many other users would?). Would you mind updating this thread to let the other users know of your decision either way? Thanks! #56
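For future readers, here is a minimal sketch of what single-process concurrency could look like with the curl package's multi interface. The URL pattern is my assumption about the cranlogs API, and `collect_downloads` is just an illustrative name, not anything in this PR or in cranlogs itself:

```r
## Single-process concurrent HTTP via curl's multi interface (a sketch, not
## what cranlogs currently does): queue one request per package and let
## libcurl multiplex them, instead of spawning extra R processes.
library(curl)
library(jsonlite)

collect_downloads <- function(packages, period = "last-week") {
  # Assumed cranlogs API URL pattern; the real endpoint may differ.
  urls <- sprintf("https://cranlogs.r-pkg.org/downloads/total/%s/%s",
                  period, packages)
  results <- vector("list", length(urls))
  pool <- new_pool()
  for (i in seq_along(urls)) {
    local({
      idx <- i  # freeze the loop index for the callbacks below
      multi_add(
        new_handle(url = urls[idx]),
        done = function(resp) {
          results[[idx]] <<- fromJSON(rawToChar(resp$content))
        },
        fail = function(msg) {
          warning("Request failed for ", packages[idx], ": ", msg)
        },
        pool = pool
      )
    })
  }
  multi_run(pool = pool)  # blocks until every queued request has finished
  names(results) <- packages
  results
}

## Example usage (hypothetical):
# collect_downloads(c("ggplot2", "data.table", "dplyr"), period = "last-month")
```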
This PR:

- Adds a limit on the number of packages passed to `cran_downloads` and prevents it from failing with many packages: limit on number of packages as argument to cran_downloads #56
- Speeds up `cran_downloads` with parallelisation.
- […] `cran_downloads` with lots of packages.
- Removes `dontrun{}` from `cran_downloads` and `cran_top_downloads` examples. Unclear why this was here?
- New helper functions: `message_parallel`, `split_batches` (see the batching sketch after this message).
- Adds `rworkflows`. All you need to do on your end @gaborcsardi is add a GH token as a GH secret named "PAT_GITHUB" to the `cranlogs` repo.

Best,
Brian
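To illustrate the batching idea above, here is a rough sketch; the function name `cran_downloads_batched` and the `batch_size` argument are hypothetical and are not the helpers added in this PR:

```r
## A sketch of batching (not the PR's actual split_batches() implementation):
## break a long vector of package names into chunks so each request to the
## cranlogs server stays small, then row-bind the per-chunk results.
library(cranlogs)

cran_downloads_batched <- function(packages, batch_size = 100, ...) {
  # Assign each package to a chunk of at most batch_size names
  batches <- split(packages, ceiling(seq_along(packages) / batch_size))
  res <- lapply(batches, function(pkgs) cran_downloads(packages = pkgs, ...))
  do.call(rbind, res)
}

## Example usage (hypothetical): weekly downloads for a few hundred packages
# pkgs <- rownames(available.packages())[1:300]
# dl   <- cran_downloads_batched(pkgs, when = "last-week")
```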