Hello,

I'm looking at including SPM12 [1] as part of NeuroFedora [2]. It is primarily written for Matlab, but it also works on Octave, and SPM upstream works with Octave upstream to improve Matlab compatibility and so on. Unfortunately, being a Matlab-centric package, SPM is not set up to be installed as an Octave package like the ones hosted on Octave Forge [3]. Currently, one simply runs `make && make install` in the src directory anywhere on one's system, and then either uses the "spm12-octave" script to run the "standalone" version of SPM, or runs it from Octave using `addpath ("/path/to/spm12/build")`. The source does not contain the files required to use `pkg build`, `pkg install`, etc., as the template spec and Octave macros [4] expect.

So, how should one approach this?

- Should I stick to `make && make install` and then install the files manually (i.e. not use `pkg ...` at all)?
  * If so, where should the files be installed: in the standard Octave directories, or in the general FHS locations?

- Should I add the bits needed to make it a proper Octave package (locally, in the spec), and then use the rpm macros, treating it like a standard Octave package? (A minimal sketch of what this would involve is appended at the end of this mail.) This will need a little bit of work downstream, and upstream will most likely not accept these Octave-specific bits.

[1] https://www.fil.ion.ucl.ac.uk/spm/software/spm12/
[2] https://neuro.fedoraproject.org
[3] https://github.com/spm/spm12
[4] https://docs.fedoraproject.org/en-US/packaging-guidelines/Octave

--
Thanks,
Regards,
Ankur Sinha "FranciscoD"
https://fedoraproject.org/wiki/User:Ankursinha
Time zone: Europe/London
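As promised above, here is a minimal sketch of the "bits" that Octave's `pkg install` actually needs: a DESCRIPTION file at the top of the package tree, next to an inst/ or src/ directory. All of the field values below, including the package name "spm12", the version, the dependency floor, and the licence tag, are placeholders for illustration and not anything shipped by SPM upstream:

  Name: spm12
  Version: 12.0.0
  Date: 2019-04-01
  Author: The SPM developers
  Maintainer: NeuroFedora SIG
  Title: Statistical Parametric Mapping
  Description: Analysis of brain imaging data sequences.
  License: GPL-2.0-or-later
  Depends: octave (>= 4.0.0)

With that file in place, the usual Octave package workflow, which is roughly what the macros in [4] wrap as I understand it, becomes available (tarball name is again a placeholder):

  # install the package tarball into Octave's package directory, then load it
  octave --eval 'pkg install spm12-12.0.0.tar.gz'
  octave --eval 'pkg load spm12'

Since upstream is unlikely to accept the DESCRIPTION file, it would have to be carried downstream, e.g. added from the spec during %prep.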