12.5 Website build
Note: This information applies only to the standard make website from the normal build directory. The process is different for dev/website-build.
The rule for make website is found in GNUmakefile.in:
website:
    $(MAKE) config_make=$(config_make) \
        top-src-dir=$(top-src-dir) \
        -f $(top-src-dir)/make/website.make \
        website
This translates as:
make --no-builtin-rules config_make=./config.make \
    top-src-dir=/home/phil/lilypond-git \
    -f /home/phil/lilypond-git/make/website.make \
    website
which has the effect of setting the variables config_make and top-src-dir and then processing the file git/make/website.make with the target of website.
website.make starts with the following:
ifeq ($(WEBSITE_ONLY_BUILD),1)
which checks to see whether the variable WEBSITE_ONLY_BUILD was set to one on the command line. This is only done for standalone website builds, not in the normal case. The result of the test determines the values of some variables, and a number of further variables are then set in order to establish the locations of various files. An example is:
CREATE_VERSION=python $(script-dir)/create-version-itexi.py
The rule for website is:
website: website-texinfo website-css website-pictures website-examples web-post
    cp $(SERVER_FILES)/favicon.ico $(OUT)/website
    cp $(SERVER_FILES)/robots.txt $(OUT)/website
    cp $(top-htaccess) $(OUT)/.htaccess
    cp $(dir-htaccess) $(OUT)/website/.htaccess
so we see that this starts by running the rules for 5 other targets, then finishes by copying some files. We'll cover the copying later; first, website-texinfo. That rule is:
website-texinfo: website-version website-xrefs website-bibs
    for l in '' $(WEB_LANGS); do \
        if test -n "$$l"; then \
            langopt=--lang="$$l"; \
            langsuf=.$$l; \
        fi; \
        $(TEXI2HTML) --prefix=index \
            --split=section \
            --I=$(top-src-dir)/Documentation/"$$l" \
            --I=$(top-src-dir)/Documentation \
            --I=$(OUT) \
            $$langopt \
            --init-file=$(texi2html-init-file) \
            -D web_version \
            --output=$(OUT)/"$$l" \
            $(top-src-dir)/Documentation/"$$l"/web.texi ; \
        ls $(OUT)/$$l/*.html | xargs grep -L \
            'UNTRANSLATED NODE: IGNORE ME' | \
            sed 's!$(OUT)/'$$l'/!!g' | xargs \
            $(MASS_LINK) --prepend-suffix="$$langsuf" \
            hard $(OUT)/$$l/ $(OUT)/website/ ; \
    done
which therefore depends on website-version, website-xrefs and website-bibs. The rule for website-version is:
website-version:
    mkdir -p $(OUT)
    $(CREATE_VERSION) $(top-src-dir) > $(OUT)/version.itexi
    $(CREATE_WEBLINKS) $(top-src-dir) > $(OUT)/weblinks.itexi
which translates as:
mkdir -p out-website
python /home/phil/lilypond-git/scripts/build/create-version-itexi.py /home/phil/lilypond-git > out-website/version.itexi
python /home/phil/lilypond-git/scripts/build/create-weblinks-itexi.py /home/phil/lilypond-git > out-website/weblinks.itexi
So, we make out-website, then send the output of create-version-itexi.py to out-website/version.itexi and the output of create-weblinks-itexi.py to out-website/weblinks.itexi.
create-version-itexi.py parses the file VERSION in the top source dir. It contains:
PACKAGE_NAME=LilyPond
MAJOR_VERSION=2
MINOR_VERSION=15
PATCH_LEVEL=13
MY_PATCH_LEVEL=
VERSION_STABLE=2.14.2
VERSION_DEVEL=2.15.12
currently. c-v-i.py parses this to:
@c ************************ Version numbers ************
@macro version
2.15.13
@end macro
@macro versionStable
2.14.2
@end macro
@macro versionDevel
2.15.12
@end macro
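To make the transformation concrete, here is a minimal, hypothetical sketch of that parsing step. It is not the real create-version-itexi.py (which handles more cases), but it produces output of the shape shown above when given the top source directory as its argument:

# Minimal sketch only; the real logic lives in
# scripts/build/create-version-itexi.py.
import sys

def parse_version_file(path):
    """Read KEY=VALUE lines from the top-level VERSION file."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                values[key] = value
    return values

def emit_macro(name, value):
    print("@macro " + name)
    print(value)
    print("@end macro")

if __name__ == "__main__":
    v = parse_version_file(sys.argv[1] + "/VERSION")
    print("@c ************************ Version numbers ************")
    emit_macro("version", "%s.%s.%s" % (v["MAJOR_VERSION"],
                                        v["MINOR_VERSION"],
                                        v["PATCH_LEVEL"]))
    emit_macro("versionStable", v["VERSION_STABLE"])
    emit_macro("versionDevel", v["VERSION_DEVEL"])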
create-weblinks-itexi.py creates a load of texi macros (of the order of 1000) similar to:
@macro manualStableGlossaryPdf
@uref{../doc/v2.14/Documentation/music-glossary.pdf,Music glossary.pdf}
@end macro
It loads its languages from langdefs.py, and therefore outputs the following unhelpful warning:
langdefs.py: warning: lilypond-doc gettext domain not found.
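The sheer number of macros comes from nesting a few loops over manuals, versions, formats and languages. The names in the sketch below are made up purely for illustration (only the @macro/@uref shape is taken from the example above), but they show why the count quickly reaches the hundreds once ten or so languages are added:

# Toy illustration of combinatorial macro generation; the manual,
# version and format names here are hypothetical, not the real ones
# used by create-weblinks-itexi.py.
MANUALS  = {"Glossary": "music-glossary", "Notation": "notation"}
VERSIONS = {"Stable": "2.14", "Devel": "2.15"}
FORMATS  = {"Pdf": ".pdf", "Html": ".html"}

def weblink_macros():
    for manual, basename in MANUALS.items():
        for vname, vnum in VERSIONS.items():
            for fname, ext in FORMATS.items():
                macro = "manual%s%s%s" % (vname, manual, fname)
                url = "../doc/v%s/Documentation/%s%s" % (vnum, basename, ext)
                yield "@macro %s\n@uref{%s,%s%s}\n@end macro\n" % (
                    macro, url, manual, ext)

if __name__ == "__main__":
    print("\n".join(weblink_macros()))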
Next:
website-xrefs: website-version
    for l in '' $(WEB_LANGS); do \
is the start of the rule, truncated for brevity. This loops through the languages to be used on the website, processing some variables which I don’t fully understand, to run this command:
python /home/phil/lilypond-git/scripts/build/extract_texi_filenames.py \
    -I /home/phil/lilypond-git/Documentation \
    -I /home/phil/lilypond-git/Documentation/"$l" \
    -I out-website -o out-website --split=node \
    --known-missing-files= \
        /home/phil/lilypond-git/scripts/build/website-known-missing-files.txt \
    -q \
    /home/phil/lilypond-git/Documentation/"$l"/web.texi ;\
There’s a good description of what extract_texi_filenames.py does at the top of the script, but a shortened version is:
If this script is run on a file texifile.texi, it produces
a file texifile[.LANG].xref-map with tab-separated entries
of the form NODE\tFILENAME\tANCHOR.
An example from web.nl.xref-map is:
Inleiding Introduction Introduction
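The consumer of these maps is the Perl code in our custom texi2html init file, but just to illustrate the tab-separated format, a hypothetical reader (not part of the build) could look like this:

# Hypothetical reader for the NODE<TAB>FILENAME<TAB>ANCHOR format
# described above; the real consumer is the texi2html init file.
def read_xref_map(path):
    xrefs = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:
                continue
            node, filename, anchor = line.split("\t")
            xrefs[node] = (filename, anchor)
    return xrefs

# e.g. read_xref_map("out-website/web.nl.xref-map")["Inleiding"]
# should give ("Introduction", "Introduction").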
e-t-f.py follows the includes from document to document. We know some have not been created yet, and the known-missing-files option tells e-t-f.py which these are.
It then does this:
for m in $(MANUALS); do \
to run e-t-f.py against all of the manuals, in each language. Next:
website-bibs: website-version
    BSTINPUTS=$(top-src-dir)/Documentation/web \
        $(WEB_BIBS) -s web \
        -s $(top-src-dir)/Documentation/lily-bib \
        -o $(OUT)/others-did.itexi \
        $(quiet-flag) \
        $(top-src-dir)/Documentation/web/others-did.bib
This is half the command. It runs bib2texi.py on two .bib files, others-did.bib and we-wrote.bib. This converts the bibliography files into texi files using bibtex.
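The general approach for driving bibtex from a script (sketched below with hypothetical file names; the real code is scripts/build/bib2texi.py) is to write a throwaway .aux file that cites every entry and names a .bst style which emits Texinfo rather than LaTeX, run bibtex on it, and keep the resulting .bbl output:

# Rough sketch of the bibtex-driving idea; not the real bib2texi.py.
# bibtex must be able to find the .bst style, which is why the make
# rule above sets BSTINPUTS.
import shutil
import subprocess

def bib_to_itexi(bib_base, style, output):
    """bib_base is the .bib file name without extension, e.g. 'others-did'."""
    aux_base = "tmp-" + bib_base
    with open(aux_base + ".aux", "w") as f:
        f.write("\\relax\n")
        f.write("\\citation{*}\n")           # cite every entry in the .bib
        f.write("\\bibstyle{%s}\n" % style)  # e.g. the lily-bib style
        f.write("\\bibdata{%s}\n" % bib_base)
    subprocess.check_call(["bibtex", aux_base])   # writes tmp-<base>.bbl
    shutil.move(aux_base + ".bbl", output)

# e.g. bib_to_itexi("others-did", "lily-bib", "out-website/others-did.itexi")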
Next, the commands in the website-texinfo rule are run:
for l in '' $(WEB_LANGS); do \
which runs texi2html. This is the program that outputs the progress message (found in Documentation/lilypond-texi2html.init):
Processing web site: []
It also outputs warning messages like:
WARNING: Unable to find node 'Řešení potíží' in book usage.
website-css:
    cp $(top-src-dir)/Documentation/css/*.css $(OUT)/website
Copies 3 css files to out-website/website. Then:
website-pictures:
    mkdir -p $(OUT)/website/pictures
    if [ -d $(PICTURES) ]; \
    then \
        cp $(PICTURES)/* $(OUT)/website/pictures ; \
        ln -sf website/pictures $(OUT)/pictures ;\
    fi
which translates as:
if [ -d Documentation/pictures/out-www ]; \
then \
    cp Documentation/pictures/out-www/* out-website/website/pictures ; \
    ln -sf website/pictures out-website/pictures ;\
fi
i.e. it copies the contents of build/Documentation/pictures/out-www/* to out-website/website/pictures. Unfortunately, the pictures are only created once make doc has been run, so an initial run of make website copies nothing, and the pictures on the website (e.g. the logo) do not exist. Next:
website-examples:
    mkdir -p $(OUT)/website/ly-examples
    if [ -d $(EXAMPLES) ]; \
    then \
        cp $(EXAMPLES)/* $(OUT)/website/ly-examples ; \
    fi
translates to:
mkdir -p out-website/website/ly-examples
if [ -d Documentation/web/ly-examples/out-www ]; \
then \
    cp Documentation/web/ly-examples/out-www/* out-website/website/ly-examples ; \
fi
This does the same with the LilyPond examples (found at http://lilypond.org/examples.html). Again, these are actually only created by make doc (and since they are generated from LilyPond source files, require a working LilyPond exe made with make). So this does nothing initially. Then:
web-post:
    $(WEB_POST) $(OUT)/website
which is:
python /home/phil/lilypond-git/scripts/build/website_post.py out-website/website
which describes itself as:
This is web_post.py. This script deals with translations
in the "make website" target.
It also does a number of other things, including adding the Google tracker code and the language selection footer. We’re now at the end of our story. The final 4 lines of the recipe for website are:
cp $(SERVER_FILES)/favicon.ico $(OUT)/website
cp $(SERVER_FILES)/robots.txt $(OUT)/website
cp $(top-htaccess) $(OUT)/.htaccess
cp $(dir-htaccess) $(OUT)/website/.htaccess
The first translates as:
cp /home/phil/lilypond-git/Documentation/web/server/favicon.ico out-website/website
so we see these are just copying the support files for the web server.
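Returning to web-post for a moment: the core of that kind of post-processing pass is just rewriting each generated HTML page to inject extra markup before the closing </body> tag. The sketch below is hypothetical (placeholder footer and tracker strings, none of the real translation handling), purely to show the shape of the operation:

# Hypothetical post-processing sketch in the spirit of website_post.py;
# the real script also handles translations and language footers.
import glob
import os

FOOTER  = '<div id="footer">(language selection footer goes here)</div>'
TRACKER = '<script>/* analytics snippet goes here */</script>'

def post_process(site_dir):
    for page in glob.glob(os.path.join(site_dir, "*.html")):
        with open(page, encoding="utf-8", errors="replace") as f:
            html = f.read()
        # Inject the extra markup just before </body>.
        html = html.replace("</body>", FOOTER + TRACKER + "</body>")
        with open(page, "w", encoding="utf-8") as f:
            f.write(html)

# e.g. post_process("out-website/website")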
website.make summary
Recipes in ‘website.make’:
- website: this is the "master" rule. It calls the other rules in order, then copies some extra files around; see below for more detail on that copying.
- website-version: this calls the python scripts below:
  - scripts/build/create-version-itexi.py: this writes @version, @versionStable, and @versionDevel macros, based on the top-level VERSION file, to out-website/version.itexi.
  - scripts/build/create-weblinks-itexi.py: this creates a ton of macros in out-website/weblinks.itexi. Stuff like @downloadStableLinuxNormal, @downloadStableWindows, @stableDocsNotationPdf{}, @downloadDevelSource-zh. It’s quite monstrous because it deals with combinations of stable/devel, source/docs, lang/lang/lang*10, etc.
- website-xrefs: creates out-website/*.xref-map files, used for complicated "out-of-build" references.
If you just write @ref{}, then all’s groovy and we wouldn’t need this. But if you write @rlearning{}, then our custom texi2html init file needs to know about our custom xref file format, which tells our custom texi2html init file how to create the link.
GP: we should have a separate @node to discuss xrefs. Also, take a quick look at a generated xref file – it’s basically just a list of @node’s [sic teenager pluralization rule] from the file.
- website-bibs: generates the bibliography texinfo files from the .bib files; in the case of the website build these are ‘others-did.bib’ and ‘we-wrote.bib’.
- website-texinfo: this is the main part; it calls texi2html to generate the actual html. It also passes a ton of options to texi2html to get info to our custom init file. The file actually built is called ‘web.texi’, and is either in the ‘Documentation’ directory, or a sub-directory specific to the language.
The options file is ‘Documentation/lilypond-texi2html.init’. This contains *lots* of option and configuration stuff, and also includes the line:
print STDERR "Initializing settings for web site: [$Texi2HTML::THISDOC{current_lang}]\n";
This is where one of the console messages is generated.
We have somewhere between 2 and 4 different ways "to pass info to our custom init file". This is highly Not Good (tm), but that’s how things work at the moment.
After texi2html, it does some black magick to deal with untranslated nodes in the translations. Despite writing that part, I can’t remember how it works. But in theory, you could figure it out by copy&pasting each part of the command (by "part", I mean "stuff before each | pipe"), substituting the variables, then looking at the text that’s output. For example,
ls $(OUT)/$$l/*.html
is going to print a list of all html files, in all languages, in the build directory. Then more stuff happens to each of those files (that’s what xargs does); there is a rough sketch of the net effect just after this list.
- website-css: just copies files to the build dir.
- website-pictures, website-examples: more file copies, with an if statement to handle the case where you don’t have any generated pictures/examples.
- web-post: runs scripts/build/website_post.py, which adds the "this page is translated in klingon" blurb to the bottom of the html pages, and adds the google analytics javascript. It also has hard-coded lilypond version numbers, which is Bad (tm).
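As promised in the website-texinfo item above, here is a rough Python rendering of what that ls/grep/sed/xargs pipeline appears to do for each language: skip any page containing the 'UNTRANSLATED NODE: IGNORE ME' marker, and hard-link the rest into out-website/website with the language suffix added. The suffix placement (before the .html extension) is an assumption based on --prepend-suffix; the real work is done by the shell tools and the MASS_LINK script:

# Hypothetical re-statement of the untranslated-node/hard-link pipeline;
# the build itself uses ls | xargs grep -L | sed | xargs mass-link.
import os

MARKER = "UNTRANSLATED NODE: IGNORE ME"

def link_translated_pages(lang_dir, website_dir, suffix):
    for name in os.listdir(lang_dir):
        if not name.endswith(".html"):
            continue
        src = os.path.join(lang_dir, name)
        with open(src, encoding="utf-8", errors="replace") as f:
            if MARKER in f.read():
                continue   # grep -L keeps only files *without* the marker
        # Assumed suffix placement: index.html -> index.nl.html
        dest = os.path.join(website_dir, name[:-5] + suffix + ".html")
        if os.path.exists(dest):
            os.remove(dest)
        os.link(src, dest)  # the 'hard' link mode of mass-link

# e.g. link_translated_pages("out-website/nl", "out-website/website", ".nl")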
Here’s a summary of what gets called, in what order, when we run make website:
website:
    website-texinfo:
        website-version:  creates version.itexi and weblinks.itexi
        website-xrefs:    runs extract_texi_filenames.py
        website-bibs:     creates bibliography files, described above
    website-css:          copies css files
    website-pictures:     copies pictures
    website-examples:     copies examples
    web-post:             runs website_post.py
Then some file copying