tradeoff between economical and meaningful content (e.g., using "real life" datasets)
some methods (e.g., deconvolution, workflows) take very long, even with subsampling
we could stash precomputed results on OSF, but this defeats the purpose
of dynamic content (risk of getting out of sync and becoming non-reproducible)
it'd be nice if we could track the timing of everything
to help identify problematic code chunks, chapters, etc.
how did OSCA handle this? do we need to "just deal with it"?
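To make the timing idea concrete, here is a minimal Python sketch (not the project's actual tooling) of a wrapper that runs each chapter's build command and reports durations slowest-first; the chapter names and render commands below are placeholders:

```python
import subprocess
import time

def timed_build(targets):
    """Run each build command and record its wall-clock duration,
    so slow chapters stand out. `targets` maps a label (e.g. a
    chapter name) to the command that builds it."""
    timings = {}
    for name, cmd in targets.items():
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        timings[name] = time.perf_counter() - start
    # print slowest first to surface problematic chapters
    for name, secs in sorted(timings.items(), key=lambda kv: -kv[1]):
        print(f"{secs:8.2f}s  {name}")
    return timings

# hypothetical usage: the command stands in for e.g. an Rmd render
timings = timed_build({
    "intro": ["python", "-c", "pass"],
})
```

The same per-label timings could be dumped to a file on each build to watch for chapters whose runtime creeps up over time.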
can we parallelize the build? at least seq- and img- are fully independent,
and so are the workflows... the BBS won't put up with it, but it'd help during development
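For development builds, the independent groups could be run concurrently; a rough Python sketch (group names taken from the notes above, commands are placeholders) keeps chapters within a group sequential while groups run in parallel:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def parallel_build(groups):
    """Build mutually independent chapter groups concurrently;
    chapters within a group still run sequentially, preserving
    any within-group ordering. `groups` maps a group label to a
    list of build commands."""
    def run_group(cmds):
        for cmd in cmds:
            subprocess.run(cmd, check=True)

    with ThreadPoolExecutor(max_workers=len(groups)) as pool:
        futures = {name: pool.submit(run_group, cmds)
                   for name, cmds in groups.items()}
        # wait for all groups; re-raise the first failure, if any
        return {name: fut.result() for name, fut in futures.items()}

# hypothetical usage: seq-, img-, and workflows as independent groups
results = parallel_build({
    "seq": [["python", "-c", "pass"]],
    "img": [["python", "-c", "pass"]],
    "workflows": [["python", "-c", "pass"]],
})
```

Threads suffice here because the work happens in child processes; this would only be a local convenience, since the BBS build would remain sequential.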