git clone git://git.codemadness.org/sfeed
---
commit 89dbf565de7b980d2b336c58633b7a27681adaf5
parent cdb8f7feb135adf6f18e389b4bbf47886089474a
Author: Hiltjo Posthuma <[email protected]>
Date:   Tue, 26 Dec 2023 16:37:21 +0100

README: update dependency information for xargs for sfeed_update

Remove the parallel xargs example because it is now the default.
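
For context, a rough sketch of the xargs pattern the new dependency entry
refers to: -0 reads NUL-separated arguments and -P runs jobs in parallel. The
URLs and the curl invocation below are illustrative placeholders, not code
taken from sfeed_update:

	#!/bin/sh
	# Illustration only: print NUL-separated arguments (-0) and let xargs
	# run up to 4 jobs in parallel (-P), one argument per invocation (-n 1).
	printf '%s\0' \
		"https://example.org/a.xml" \
		"https://example.org/b.xml" |
		xargs -0 -P 4 -n 1 curl -s -O
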
Diffstat:
 M README | 58 ++---------------------------…

1 file changed, 2 insertions(+), 56 deletions(-)
---
diff --git a/README b/README
@@ -117,6 +117,8 @@ Optional dependencies
   used by sfeed_update(1). If the text in your RSS/Atom feeds are already UTF-8
   encoded then you don't need this. For a minimal iconv implementation:
   https://git.etalabs.net/cgit/noxcuse/tree/src/iconv.c
+- xargs with support for the -P and -0 option,
+  used by sfeed_update(1).
 - mandoc for documentation: https://mdocml.bsd.lv/
 - curses (typically ncurses), otherwise see minicurses.h,
   used by sfeed_curses(1).
@@ -706,62 +708,6 @@ sfeedrc file and change the curl options "-L --max-redirs …
 - - -
-Shellscript to update feeds in parallel more efficiently using xargs -P.
-
-It creates a queue of the feeds with its settings, then uses xargs to process
-them in parallel using the common, but non-POSIX -P option. This is more
-efficient than the more portable solution in sfeed_update which can stall a
-batch of $maxjobs in the queue if one item is slow.
-
-sfeed_update_xargs shellscript:
-
-	#!/bin/sh
-	# update feeds, merge with old feeds using xargs in parallel mode (non…
-
-	# include script and reuse its functions, but do not start main().
-	SFEED_UPDATE_INCLUDE="1" . sfeed_update
-	# load config file, sets $config.
-	loadconfig "$1"
-
-	# process a single feed.
-	# args are: config, tmpdir, name, feedurl, basesiteurl, encoding
-	if [ "${SFEED_UPDATE_CHILD}" = "1" ]; then
-		sfeedtmpdir="$2"
-		_feed "$3" "$4" "$5" "$6"
-		exit $?
-	fi
-
-	# ...else parent mode:
-
-	# feed(name, feedurl, basesiteurl, encoding)
-	feed() {
-		# workaround: *BSD xargs doesn't handle empty fields in the mi…
-		name="${1:-$$}"
-		feedurl="${2:-http://}"
-		basesiteurl="${3:-${feedurl}}"
-		encoding="$4"
-
-		printf '%s\0%s\0%s\0%s\0%s\0%s\0' "${config}" "${sfeedtmpdir}"…
-			"${name}" "${feedurl}" "${basesiteurl}" "${encoding}"
-	}
-
-	# fetch feeds and store in temporary directory.
-	sfeedtmpdir="$(mktemp -d '/tmp/sfeed_XXXXXX')" || exit 1
-	mkdir -p "${sfeedtmpdir}/feeds"
-	touch "${sfeedtmpdir}/ok" || exit 1
-	# make sure path exists.
-	mkdir -p "${sfeedpath}"
-	# print feeds for parallel processing with xargs.
-	feeds | SFEED_UPDATE_CHILD="1" xargs -r -0 -P "${maxjobs}" -L 6 "$(rea…
-	statuscode=$?
-	# check error exit status indicator for parallel jobs.
-	test -f "${sfeedtmpdir}/ok" || statuscode=1
-	# cleanup temporary files etc.
-	cleanup
-	exit ${statuscode}
-
-- - -
-
 Shellscript to handle URLs and enclosures in parallel using xargs -P.
 This can be used to download and process URLs for downloading podcasts,