# Data Validation Engine (DVE)

The Data Validation Engine (DVE) is a configuration-driven data validation library built and utilised by NHS England.

As mentioned above, the DVE is "configuration-driven", which means that the majority of development for you as a user will be building a JSON document that describes how your data will be validated. This JSON document is known as a `dischema` file, and example files can be found [here](./tests/testdata/). If you'd like to learn more about dischema files and how to build one from scratch, please read the documentation [here](./docs/).
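
Purely for illustration, a dischema file might look something like the sketch below. Every key here is hypothetical rather than taken from the real schema, so treat the example files linked above (and the documentation) as the actual reference.

```json
{
  "tables": [
    {
      "name": "patients",
      "columns": [
        {"name": "nhs_number", "type": "string", "nullable": false},
        {"name": "date_of_birth", "type": "date", "nullable": true}
      ]
    }
  ]
}
```
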
Once a dischema file has been defined, you are ready to use the DVE. The DVE is typically orchestrated around four key "services":

|| Service | Purpose |
| -- | ------- | ------- |
| 1. | File Transformation | This service will take submitted files and turn them into stringified parquet file(s) to ensure that a consistent data structure can be passed through the other services. |
| 2. | Data Contract | This service will validate and perform type casting against a stringified parquet file using [pydantic models](https://docs.pydantic.dev/1.10/). |
| 3. | Business Rules | The business rules service will perform more complex validations, such as comparisons between fields and tables, aggregations, filters, etc., and can generate new entities. |
| 4. | Error Reports | This service will take all the errors raised by the previous services and surface them in a readable format for downstream users/services. Currently this is implemented as an Excel spreadsheet, but it could be reconfigured to meet other requirements/use cases. |
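
To make the flow concrete, here is a minimal, purely hypothetical sketch of how the four services fit together. None of the names below come from the DVE's actual API; they are stand-ins for the real entry points described in the documentation.

```python
# Purely illustrative: every name below is a hypothetical stand-in,
# not the DVE's actual API.
from typing import List


def transform_files(submission: str) -> List[str]:
    """1. File Transformation: turn the submission into stringified
    parquet file(s) so later stages see a consistent structure (stubbed)."""
    return [submission + ".parquet"]


def check_data_contract(files: List[str], dischema: str) -> List[str]:
    """2. Data Contract: validate and type-cast each file against
    pydantic models (stubbed)."""
    return []


def apply_business_rules(files: List[str], dischema: str) -> List[str]:
    """3. Business Rules: cross-field/cross-table checks, aggregations,
    filters and derived entities (stubbed)."""
    return []


def write_error_report(errors: List[str], path: str) -> None:
    """4. Error Reports: surface every error raised above in a readable
    format, e.g. a spreadsheet (stubbed as plain text here)."""
    with open(path, "w") as report:
        report.writelines(error + "\n" for error in errors)


def run_pipeline(submission: str, dischema: str) -> None:
    files = transform_files(submission)
    errors = check_data_contract(files, dischema)
    errors += apply_business_rules(files, dischema)
    write_error_report(errors, "error_report.txt")


run_pipeline("submission.csv", "dischema.json")
```
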
If you'd like more detail on these services, please read the extended documentation [here](./docs/).

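
As a small taste of what the Data Contract service does, the snippet below uses pydantic 1.x to validate and type-cast stringified values. The `PatientRecord` model and its fields are invented for this illustration rather than taken from the DVE.

```python
from datetime import date

from pydantic import BaseModel, ValidationError


class PatientRecord(BaseModel):
    # Hypothetical example model: every value arrives as a string (from the
    # stringified parquet) and pydantic casts it to the declared type.
    nhs_number: str
    age: int             # "42" is cast to 42
    date_of_birth: date  # "1980-02-29" is cast to a date object


row = {"nhs_number": "9434765919", "age": "42", "date_of_birth": "1980-02-29"}
print(PatientRecord(**row))

bad_row = {"nhs_number": "9434765919", "age": "forty-two", "date_of_birth": "1980-02-29"}
try:
    PatientRecord(**bad_row)
except ValidationError as err:
    # e.g. [{'loc': ('age',), 'msg': 'value is not a valid integer', ...}]
    print(err.errors())
```
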
The DVE has been designed to be modular and can support users who only want to utilise specific "services" (e.g. just File Transformation plus Data Contract). Additionally, the DVE is designed to support different backend implementations. The base installation of the DVE includes backend support for `Spark` and `DuckDB`, so if you need, say, a `MySQL` backend, you can implement it yourself. Given our organisation's requirements, it is unlikely that we will add any more backend implementations to the base package beyond Spark and DuckDB, so if you are unable to implement one yourself, we would recommend reading the guidance on [requesting new features and raising bug reports](#requesting-new-features-and-raising-bug-reports).
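
For illustration, a user-supplied backend might look something like the sketch below. The `Backend` base class and its methods are assumptions invented for this example; check the extended documentation for the DVE's actual extension points before writing a real backend.

```python
# Hypothetical sketch of a backend extension point. The DVE's real
# interface may look quite different; treat this as illustration only.
from abc import ABC, abstractmethod
from typing import Any, List


class Backend(ABC):
    """Minimal imagined contract that each backend would satisfy."""

    @abstractmethod
    def read_parquet(self, path: str) -> Any:
        """Load a stringified parquet file into the backend's native table type."""

    @abstractmethod
    def run_query(self, sql: str) -> List[dict]:
        """Execute a validation query and return the offending rows."""


class MySQLBackend(Backend):
    """Skeleton of the kind of class a user-supplied MySQL backend might define."""

    def __init__(self, connection: Any) -> None:
        self.connection = connection

    def read_parquet(self, path: str) -> Any:
        raise NotImplementedError("Load the parquet file into a MySQL table here.")

    def run_query(self, sql: str) -> List[dict]:
        raise NotImplementedError("Run the query over the MySQL connection here.")
```
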
Additionally, if you'd like to contribute a new backend implementation to the base DVE package, please look at the [Contributing](#contributing) section.

## Installation and usage

The DVE is a Python package and can be installed using `pip`. As of release v1.0.0, the DVE only supports Python 3.7, with Spark 3.2.1 and DuckDB 1.1.0. We are currently working on upgrading the DVE to Python 3.11+, and this will be made available as soon as possible with the version 2.0.0 release.

In addition to a working Python 3.7+ installation, you will need OpenJDK 11 installed if you're planning to use the Spark backend implementation.

Python dependencies are listed in `pyproject.toml`.

To install the DVE package, you can use a package manager such as [pip](https://pypi.org/project/pip/).
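
Until the package is published to PyPI (see the note below), one option is to install it straight from the GitHub repository, for example:

```bash
# Install directly from the GitHub repository (requires git):
pip install git+https://github.com/nhsengland/Data-Validation-Engine.git

# ...or clone it and install from a local checkout:
git clone https://github.com/nhsengland/Data-Validation-Engine.git
pip install ./Data-Validation-Engine
```
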
Once you have installed the DVE, you are ready to use it. For guidance on how to create your dischema JSON document (configuration), please read the [documentation](./docs/).

Please note: the long-term aim is to make the DVE available via PyPI and Conda, but we are not quite there yet. Once available, this documentation will be updated to contain the new installation options.

## Requesting new features and raising bug reports

**Before creating a new issue, please check whether the same bug or feature has already been reported. Where a duplicate is created, the ticket will be closed with a reference to the existing issue.**

If you have spotted a bug in the DVE, please raise an issue [here](https://github.com/nhsengland/Data-Validation-Engine/issues) using the "bug template".

If you have a feature request, please follow the same process using the "Feature request template".

## Upcoming features

Below is a list of features that we would like to implement or that have been requested.

| Feature | Expected availability | In progress |
| ------- | --------------------- | ----------- |
| Upgrade to Pydantic 2.0 | Not yet confirmed | No |
| Create a more user-friendly interface for building and modifying dischema files | Not yet confirmed | No |

Beyond the Python upgrade, we cannot confirm that the other features will be made available anytime soon. Therefore, if you have the interest and desire to make these features available, please read the [Contributing](#contributing) section and get involved.