2018.10.30 Hangout


We will be having our weekly hangout today, at 10:00 AM EST.

On the agenda:

Anyone is welcome to join to ask questions at https://bit.ly/slicer-googlemeet-hosted-by-kitware

Thanks!

Hi Sam, Jc & all -

I’ll be at the Qt conference today so I’ll miss today’s hangout. Great work on the 4.10 release!


Ditto on the good work finalizing the release!

In addition to talking about future documentation methods, if you have time on the agenda, maybe you and others can revisit the discussion about a potential transition to Github issue tracking as well. This period after a release is probably a good time to revisit this topic. I posted a comment yesterday based on my recent experience using Mantis.


Meeting notes:

  • @smrolfe joined us and asked questions related to Markups module enhancements

    • from @lassoan:
      • For each handle, an actor is created. This works well up to hundreds of points, but for thousands of points the handle-widget approach doesn’t scale well.
      • We are working on a revamped infrastructure, but it is not ready yet. In the meantime, implementing small improvements is the way to go; we can still integrate the approach into the updated infrastructure later.
  • documentation

    • “restrict” LFS upload for documentation, probably to the /docs/ folder only. See https://github.com/Slicer/Slicer/pull/686
    • @lassoan will try Git LFS and help answer questions like “Can images be uploaded through the web interface and automatically added to Git LFS?”
  • Transition to GitHub

    • (1) we plan on creating another repository (e.g. Slicer-Archive) that would keep the unmodified history for future reference.
    • (2) we will move current data files to data.kitware.com and other mirrors to allow content-addressable downloads.
    • (3) we will trim the current Slicer/Slicer repo to remove large data. See the list of the 350 largest files (generated using this script)
  • Transition Issue tracker:
    • after transitioning to GitHub (see above), new issues will be added to GitHub, and users will be asked to do so.
    • existing issues would be manually “transferred” to GitHub on a case-by-case basis.
    • we would not spend time migrating existing issues.
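
The “restrict LFS to the docs folder” idea above could be expressed in a `.gitattributes` file along these lines (hypothetical patterns; the exact layout would depend on what is decided in the PR linked above):

```
# .gitattributes — track only documentation images with Git LFS (sketch)
docs/**/*.png filter=lfs diff=lfs merge=lfs -text
docs/**/*.jpg filter=lfs diff=lfs merge=lfs -text
docs/**/*.gif filter=lfs diff=lfs merge=lfs -text
```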

Regarding using git-lfs for documentation:

I’ve done some tests - see details here: Should we use Git LFS to manage data?

In summary: git-lfs is not fully supported by the GitHub web interface (for example, git-lfs files cannot be uploaded through the web interface) and is still not very robust (it may break due to user errors, symlinks, merges, changes to git attributes, etc.).

Short term: I think we should not start using git-lfs now. Instead, we can store large documentation files (mainly screenshot files) as regular files. If we keep image sizes small then the repository size will remain manageable.

Long term: If we find that the repository has become too large (not very likely to happen within a couple of years), then we can decide to move existing files to git-lfs or to whatever solution is state of the art at that time. There is already a git-lfs command (`git lfs migrate import`) that can convert existing files to git-lfs files, so we could easily migrate whenever we decide to do so.

Thanks for the detailed report, very insightful!

If we keep image sizes small

What should the threshold be?

This script could be helpful to answer: https://gist.github.com/jcfr/4348af13d2c8931daeab4ff9ab73e14b

And here is the output from that same script from few months ago: https://gist.github.com/jcfr/93fe51974d9db8ef55a6d3172c1de68d
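
For reference, ranking blobs by size can be sketched like this. This is not necessarily what the linked script does, just one way to post-process the output of `git rev-list --objects --all | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)'` (the sample data below is made up):

```python
import heapq

def largest_blobs(batch_check_lines, n=10):
    """Return the n largest blobs as (size, path, sha) tuples, given lines
    in the `git cat-file --batch-check` format shown above."""
    blobs = []
    for line in batch_check_lines:
        parts = line.split(maxsplit=3)
        # Only blob objects carry file content; skip commits, trees, tags.
        if len(parts) >= 3 and parts[0] == "blob":
            sha, size = parts[1], int(parts[2])
            path = parts[3] if len(parts) == 4 else ""
            blobs.append((size, path, sha))
    return heapq.nlargest(n, blobs)

sample = [
    "blob aaa 1048576 docs/big-screenshot.png",
    "blob bbb 2048 docs/small-icon.png",
    "commit ccc 300",
]
print(largest_blobs(sample, n=1))  # → [(1048576, 'docs/big-screenshot.png', 'aaa')]
```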

Instead of setting a size threshold, we should probably specify a recommended image size, file format, and compression settings that produce optimal images for online documentation. I guess images would end up being a few hundred KB each.

That is a great idea.

That said, I still think having a test that fails if uncompressible data files are above X KB (e.g. 250 KB) would still be complementary.
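
Such a test could be as simple as walking the documentation tree and collecting files over the threshold. A minimal sketch, assuming a top-level `docs` directory and the 250 KB example limit mentioned above:

```python
import os

SIZE_LIMIT = 250 * 1024  # example 250 KB threshold from the discussion

def oversized_files(root, limit=SIZE_LIMIT):
    """Return (path, size) pairs for every file under `root` larger than `limit` bytes."""
    offenders = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size > limit:
                offenders.append((path, size))
    # Largest first, so the worst offenders show up at the top of the report.
    return sorted(offenders, key=lambda item: -item[1])

# A CI test could then simply fail when the list is non-empty, e.g.:
#   assert not oversized_files("docs"), "documentation files exceed the size limit"
```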

It looks like there are online services to compress images (https://duckduckgo.com/?q=online+compress+image); we could recommend one that supports copy/paste of images and allows setting its parameters from a URL.

Or host a javascript based one on a webpage we control.


A pre-commit hook with a file size limit would help reduce the chance of accidentally committing large files. Ultimately, quality would still be checked manually when the pull request is merged.
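
Such a hook could be a small script checking the size of staged files. A sketch, assuming the same hypothetical 250 KB limit (the file names and limit are assumptions, not decided values):

```python
#!/usr/bin/env python
# Sketch of a pre-commit hook (saved as .git/hooks/pre-commit, made executable)
# that rejects commits containing staged files above a size limit.
import os
import subprocess
import sys

LIMIT = 250 * 1024  # assumed 250 KB threshold

def staged_files():
    """List files added or modified in the index."""
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=AM"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line for line in out.splitlines() if line]

def too_large(paths, limit=LIMIT):
    """Return (path, size) pairs for existing files above the limit."""
    return [(p, os.path.getsize(p))
            for p in paths
            if os.path.exists(p) and os.path.getsize(p) > limit]

def main():
    offenders = too_large(staged_files())
    for path, size in offenders:
        print(f"{path}: {size} bytes exceeds the {LIMIT} byte limit")
    return 1 if offenders else 0

# When installed as a hook, the script would end with: sys.exit(main())
```

The hook only guards local commits; a server-side pull-request check would still be needed for files added through the GitHub web interface.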

Since I anticipate the documentation will be updated directly on GitHub in some cases, I think that in addition to the pre-commit hook, a pull-request check is still relevant.

The good news is that with GitHub apps these are now quite easy to set up. See https://probot.github.io/apps/ , https://probot.github.io/docs/ and https://github.com/gr2m/github-app-example

(no more need to host our own app on Heroku like we do for the doxygen hooks, https://github.com/Slicer/github-circleci-trigger)
