A user asks, "For scanning 35mm film, it's better to use 48-bit instead of 24-bit, to use CinePaint rather than GIMP, right?"
Speaking generally, more bits are better when you're working toward high-quality pro output. With 35mm film as the input, a good scanner will extract about 12 bits of real information per channel from the image at 2K. A high-end scanner (very expensive) may go deeper and justify scanning at 4K or higher.
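To put numbers on that: tonal levels double with each bit, so a 12-bit scan holds 16 times the tonal resolution an 8-bit file can. A quick illustration:

```python
# Tonal levels per channel at each bit depth:
for bits in (8, 12, 16):
    print(bits, "bits ->", 2 ** bits, "levels")
# 8 bits -> 256 levels
# 12 bits -> 4096 levels
# 16 bits -> 65536 levels
```

Saving that 12-bit scan into an 8-bit file throws away everything past the first 256 levels.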
If you're outputting to film or gallery-quality prints, you'll want 16-bit. If you're outputting a JPEG for your website, you might still prefer 16-bit just for the extra editing headroom, but 8-bit will do fine except in special cases (e.g., B&W photography).
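Here's what that headroom buys you. A rough numpy sketch (just an illustration, not what any particular editor does internally): apply the same midtone boost to an underexposed scan once at 8-bit precision and once at 16-bit, then reduce both to 8-bit for the web. The 8-bit workflow has fewer surviving tonal levels, which shows up as banding.

```python
import numpy as np

# Underexposed tones occupying the bottom 25% of the range.
dark = np.linspace(0, 0.25, 4096)

# 8-bit workflow: quantize first, then apply a gamma 2.2 midtone boost.
v8 = np.round(dark * 255).astype(np.uint8) / 255.0
out_8bit = np.round((v8 ** (1 / 2.2)) * 255).astype(np.uint8)

# 16-bit workflow: edit at high precision, quantize only at the end.
v16 = np.round(dark * 65535).astype(np.uint16) / 65535.0
out_16bit = np.round((v16 ** (1 / 2.2)) * 255).astype(np.uint8)

print(np.unique(out_8bit).size)   # few distinct levels -> visible banding
print(np.unique(out_16bit).size)  # roughly twice as many levels survive
```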
If you scan at 8 bits per channel, you can use whatever editing tool you like; everything supports 8-bit. If you scan at 16 bits per channel, you need a tool that actually works at that depth. Some 8-bit apps will open a 16-bit TIFF, only to crush it down to 8-bit on import. If you need a true 16-bit workflow, you have to use CinePaint or some other 16-bit editing app.
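That "crush on import" is just a truncation, and it's irreversible. A minimal numpy sketch of what happens to a smooth 16-bit gradient when an 8-bit app opens it:

```python
import numpy as np

# A smooth 16-bit gradient (think: a scanned sky) with 1000 sample points.
grad16 = np.linspace(0, 65535, 1000).astype(np.uint16)

# What an 8-bit app effectively does on open: drop the low byte.
grad8 = (grad16 >> 8).astype(np.uint8)

print(np.unique(grad16).size)  # 1000 distinct 16-bit values going in
print(np.unique(grad8).size)   # 256 — everything past 8 bits is gone
```

Converting back to 16-bit afterward doesn't recover anything; the extra levels are gone for good.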
Is a 16-bit pro workflow necessary for your purposes, or is 8-bit good enough? Is your scanner good enough to support a 16-bit workflow? Can you see any difference when you output back to film? Those are questions you'll have to answer by testing for yourself.
Let us know what you find...
Love you guys!