

formulae-capitains-nemo


This is the class extension “NemoFormulae” for flask_nemo. A working instance of this extension for the Formulae - Litterae - Chartae Project can be found at https://werkstatt.formulae.uni-hamburg.de.

Getting Started

Further information:

  • https://github.com/capitains/tutorial-nemo

  • The app is configured via formulae/app.py

  • Complete documentation is built with Sphinx

Running the app locally :computer::

1. Preliminary setup steps:

These steps only need to be executed before running the app for the first time:

  1. Clone the repositories:

    1. git clone formulae-capitains-nemo (code-base)

    2. git clone formulae-corpora (texts), ideally into the same parent folder as the code base (e.g., git/)

  2. Create a Python virtualenv (e.g., virtualenv --python=python3 .venv)

  3. Only if needed: set the environment variable CORPUS_FOLDERS and restart the app.

2. Start the app:

  1. activate the virtualenv (e.g., source .venv/bin/activate)

  2. install the requirements via pip install -r requirements.txt within the venv and from the formulae-capitains-nemo folder

  3. Optional: set up Elasticsearch via a .env file

  4. For local development, set ELASTICSEARCH_URL = "http://localhost:9200" in the .env file (requires a local Elasticsearch instance)

  5. Once the requirements are installed, you can launch python3 app.py within the venv and from the formulae-capitains-nemo folder

  6. Open the site at http://127.0.0.1:5000
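The environment variables mentioned in the steps above can be resolved with safe local defaults. A minimal sketch, assuming the app falls back to localhost and a sibling corpus checkout when the variables are unset (the exact logic in formulae/app.py may differ):

```python
import os

# Resolve settings from the environment, with local-development
# fallbacks. The variable names come from the setup notes above;
# the default values are assumptions for a local checkout.
ELASTICSEARCH_URL = os.environ.get("ELASTICSEARCH_URL", "http://localhost:9200")
CORPUS_FOLDERS = os.environ.get("CORPUS_FOLDERS", "../formulae-corpora")

print(f"Search backend: {ELASTICSEARCH_URL}")
print(f"Corpus folder:  {CORPUS_FOLDERS}")
```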

Running Elasticsearch locally

  1. Make sure that you have a few Gigabytes of RAM free

  2. cd into the formulae-capitains-nemo folder

  3. docker-compose up

  4. If the container exits with es8 exited with code 137, there is not enough free memory
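Before starting the app, it can help to verify that the local instance actually came up. A small probe like the following works against the local default URL from the steps above (this is a sketch, not part of the code base):

```python
import json
import urllib.error
import urllib.request

def es_health(url="http://localhost:9200/_cluster/health", timeout=3):
    """Return the Elasticsearch cluster status ('green'/'yellow'/'red'),
    or None if the instance is unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return json.load(resp).get("status")
    except (urllib.error.URLError, OSError, ValueError):
        return None

if __name__ == "__main__":
    status = es_health()
    if status is None:
        print("Elasticsearch is not reachable -- did docker-compose come up?")
    else:
        print(f"Cluster status: {status}")
```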

Running with Docker Compose :whale2:

The application can be started locally using Docker Compose. The setup includes:

  • Elasticsearch – search index backend

  • Redis – temporary storage for search workflows

  • formulae_corpora – helper container that clones/updates the XML corpus and can rebuild the search index

  • nemo – the Flask web application

Requirements

  • Docker

  • Docker Compose

Environment variables

Create a .env file in the project root:

  ELASTICSEARCH_URL=http://elasticsearch:9200
  FORMULAE_CORPORA_REPO_URL=<repository>
  GITHUB_TOKEN=<github-token>
  FORMULAE_CORPORA_REF=
  REBUILD_ELASTICSEARCH=false
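For illustration, this is how such a .env file can be consumed: a hand-rolled sketch of a KEY=VALUE parser (the app may well use a library such as python-dotenv instead; this function is not part of the code base):

```python
import os

def load_dotenv(path=".env"):
    """Minimal .env parser: KEY=VALUE lines, '#' comments, no quoting."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    # Make the values visible to the running process.
    os.environ.update(values)
    return values
```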

Startup sequence

  docker compose up -d elasticsearch redis
  docker compose run --rm formulae_corpora
  docker compose up -d nemo

:computer: Application URL: http://localhost:5000

Stop

  docker compose down

How are static files handled?

  1. https://flask.palletsprojects.com/en/2.3.x/quickstart/#static-files

  2. I recommend adding /static and /robots.txt to your nginx configuration so that they are served directly without passing through the application.
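Such an nginx block could look like the following sketch; the filesystem paths and the proxy port 5000 are assumptions and must be adapted to your deployment:

```nginx
# Serve /static and /robots.txt directly from disk; everything else
# goes to the Flask app. Paths and port are illustrative assumptions.
location /static/ {
    alias /opt/formulae-capitains-nemo/static/;
}

location = /robots.txt {
    root /opt/formulae-capitains-nemo/static;
}

location / {
    proxy_pass http://127.0.0.1:5000;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```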

How to build the Sphinx documentation locally:

  1. Install sphinx: https://www.sphinx-doc.org/en/master/usage/installation.html

  2. activate the virtualenv (e.g., source .venv/bin/activate)

  3. install the requirements via pip install -r requirements_sphinx.txt within the venv and from the formulae-capitains-nemo folder

  4. Build the project: sphinx-build -M html docs/source/ docs/build/ or python -m sphinx -M html docs/source/ docs/build/

  5. Open docs/build/html/index.html with your preferred browser: firefox docs/build/html/index.html

Contribution guide

  • Currently, we do not follow any specific design pattern. In the future I would like to “reduce the weight” of our fat controller formulae/app.py. I have not fully decided whether I want fat models or fat services instead; in the end, “services vs. models” is more a naming question than a real design decision. Alternatively, I could adopt the MVC pattern.

  • Each new collection should
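The “reduce the weight of the fat controller” idea above could look like the following minimal sketch. All names here (find_collections, collections_view, the sample collection list) are hypothetical illustrations, not actual functions or data from formulae/app.py:

```python
# Sketch: move logic out of a fat controller into a service function.
# Everything named here is a hypothetical illustration.

def find_collections(query: str) -> list[str]:
    """Service-layer function: pure logic, easy to unit-test in isolation."""
    known = ["andecavensis", "marculf", "salicae"]  # illustrative sample data
    return [c for c in known if query.lower() in c]

def collections_view(query: str) -> dict:
    """The view stays thin: it only adapts request input to the service
    and shapes the response (shown as a plain function to stay runnable)."""
    return {"results": find_collections(query)}

print(collections_view("marc"))  # -> {'results': ['marculf']}
```

Keeping the service free of Flask imports makes it testable without an application context, which is the main practical gain regardless of whether the layer is called “services” or “models”.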

Running GitHub Actions locally:

  1. Install GitHub CLI

  2. Install act: gh extension install https://github.com/nektos/gh-act

  3. cd git/formulae-capitains-nemo

  4. gh act -W '.github/workflows/python-app.yml' or gh act -W '.github/workflows/documentation.yml'

  5. Comment out the Redis port (gh act seems to bring its own Redis instance)
