Component that one-way syncs data from LDAP to a PostgreSQL DB in the deewee space. This allows for reporting further downstream.
The app is designed to sync changed (new/updated) LDAP entries to the PostgreSQL DB based on the `modifyTimestamp` attribute from LDAP. The identifying key between the two systems is the LDAP attribute `EntryUUID`.
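To illustrate the idea, an equivalent incremental query expressed with the OpenLDAP command-line client could look as follows (host, bind DN, search base and timestamp are placeholders; the app itself does this through ldap3, not through the CLI):

```shell
# Fetch entries modified since a given timestamp, together with the
# entryUUID that serves as the identifying key (placeholder connection values).
ldapsearch -H ldap://ldap.example.org -D "cn=admin,dc=example,dc=org" -W \
  -b "dc=example,dc=org" \
  "(modifyTimestamp>=20240101000000Z)" entryUUID modifyTimestamp
```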
Do note that the service is not able to handle deletes. However, a full load from LDAP can be achieved by emptying the target database table. So, in case known deleted LDAP entries should be reflected in the database, this mechanism can be used to sync up.
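As a sketch, such a forced full load could look like this (the table name `entities` is a placeholder; check the actual schema for the real table name):

```shell
# Empty the target table (placeholder table name) ...
psql -h localhost -U {postgresql_user} -d {postgresql_database} \
  -c "TRUNCATE TABLE entities;"
# ... and run the sync again; with an empty table it performs a full load.
python3 -m app.app
```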
- Python >= 3.7 (when working locally)
- The package `python3-venv` (when working locally)
- The package PostgreSQL (when running integration tests locally)
- LDAP Server with appropriate structure
- PostgreSQL DB with appropriate schema
- Docker (optional)
- Access to the meemoo PyPi
- `viaa-chassis`
- `ldap3` - communicates with the LDAP server
- `psycopg2` - communicates with the PostgreSQL server
For testing:

- `pytest` - runs automated tests
- `pytest-cov` - produces coverage reports
- `testing.postgresql` - creates temporary PostgreSQL databases and cleans up afterwards
First clone this repository with `git clone` and change into the new directory.
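For example (the repository URL is an assumption; substitute your actual remote):

```shell
git clone https://github.com/viaacode/ldap2deewee.git
cd ldap2deewee
```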
If you have no access to an LDAP and/or PostgreSQL server, they can easily be set up locally using Docker containers, as explained in the section on external servers via containers.
Running the app locally is recommended when developing/debugging and running tests.
First create the virtual environment:
```shell
python3 -m venv ./venv
```

Be sure to activate said environment:

```shell
source ./venv/bin/activate
```

Now we can install the required packages:

```shell
pip3 install -r requirements.txt
```

Copy the config.yml.example file:

```shell
cp ./config.yml.example ./config.yml
```

Be sure to fill in the correct configuration parameters in config.yml in order to communicate with the LDAP server and PostgreSQL database.
Run the app:
```shell
python3 -m app.app
```

To run the tests, first install the testing dependencies:

```shell
pip3 install -r requirements-test.txt
```

Run the tests:

```shell
python3 -m pytest
```

The deewee integration tests make use of the PostgreSQL command initdb to create a temporary database. Thus, in order to run those integration tests you need to have PostgreSQL installed.
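For example (package names differ per platform; these are common defaults):

```shell
# Debian/Ubuntu
sudo apt-get install postgresql
# macOS (Homebrew)
brew install postgresql
```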
If desired, you can also run coverage reports:
```shell
python3 -m pytest "--cov=app.app" "--cov=app.comm"
```

If you have no access to an LDAP and/or PostgreSQL server, they can easily be set up locally using Docker containers.
A container which runs an LDAP server with an empty structure can be found here. The custom objectClasses and attributeTypes can be found here. The latter repository also contains a sample.ldif file with some orgs and persons.
A Dockerfile can be found in ./additional_containers/postgresql. The schema is defined in the init.sql file of said folder.
Create an env file and fill in the parameters:
```shell
cp ./additional_containers/postgresql/env.example ./additional_containers/postgresql/env
```

Afterwards, build and run the container:
```shell
docker build -f ./additional_containers/postgresql/Dockerfile -t deewee_postgresql .
docker run --name deewee_postgresql -p 5432:5432 --env-file=./additional_containers/postgresql/env -d deewee_postgresql
```

If desired, test the connection with the following statement in a terminal:
```shell
psql -h localhost -U {postgresql_user} -d {postgresql_database}
```

If you just want to execute the app, it might be easier to run the container instead of installing Python and the dependencies.
Copy the config file (`cp ./config.yml.example ./config.yml`) and fill in the correct parameters.
If the app connects to external servers, just build the image:

```shell
docker build -t ldap2deewee .
```

If the build succeeded, you should be able to find the image via:

```shell
docker image ls
```

To run the app in the container, just execute:

```shell
docker run --name ldap2deewee -d ldap2deewee
```

If you want to run the tests in the container, you can run the container in interactive mode:

```shell
docker run -it --rm --entrypoint=/bin/sh ldap2deewee
```

You can then run the tests as described in the section on testing.
However, if you make use of the other (LDAP and PostgreSQL) containers as described above, the app container needs to be able to communicate with them. This can be done manually by creating a Docker Network and connecting the containers to it, or more automatically via Docker Compose.
We'll need to create a network and connect the containers together:
```shell
docker network create ldap2deewee_network
docker network connect ldap2deewee_network deewee_postgresql
docker network connect ldap2deewee_network archief_ldap
```

Now we just need to build the ldap2deewee container and run it in the Docker Network. Make sure that the host parameters in the config.yml file actually point to within the Docker Network.
```shell
docker build -t ldap2deewee .
docker run --name ldap2deewee --network=ldap2deewee_network -d ldap2deewee
```

Check the status via `docker container ls -a` and/or check the logs via `docker logs ldap2deewee`.
Instead of manually creating a network and coupling the containers together, Docker Compose can be used as a more automatic alternative.
Do note that, the way the docker-compose file is set up, the process expects the PostgreSQL and LDAP images to have been built locally. However, as the Docker Compose process will also run the containers, make sure that they do not exist yet.
If you haven't already, be sure to create env files for the OpenLDAP and PostgreSQL containers, as the Docker Compose process depends on them. See the respective env.example files for the desired structure.
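For example (the PostgreSQL path matches this repository; the OpenLDAP path is hypothetical and depends on where you checked out that container's repository):

```shell
# PostgreSQL env file, as described above
cp ./additional_containers/postgresql/env.example ./additional_containers/postgresql/env
# OpenLDAP env file (hypothetical path)
cp ../archief_ldap/env.example ../archief_ldap/env
```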
```shell
docker-compose up -d
```

Check the status via `docker container ls -a` and/or check the logs via `docker logs ldap2deewee`.
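When you are done, the whole stack can be stopped and cleaned up in one go:

```shell
docker-compose down
```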