# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/), and this project adheres to [Semantic Versioning](https://semver.org/).
- Definition `spec` field to specify a class representing key-value pairs for definitions whose primitives are dictionaries
- Auto-generation of documentation for operation implementations, models, and sources. Generated docs include information on configuration options and the inputs and outputs of operation implementations.
- Async helpers got an `aenter_stack` method which creates and returns a `contextlib.AsyncExitStack` after entering all the contexts passed to it.
- Example of how to use Data Flow Facilitator / Orchestrator / Operations by writing a Python meta static analysis tool, `shouldi`
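A helper like the `aenter_stack` described above might look like the following minimal sketch (the signature and attribute-binding behavior shown here are assumptions, not DFFML's exact API):

```python
import contextlib

async def aenter_stack(obj, context_managers):
    # Sketch only: enter every async context manager passed in, bind
    # each entered value to obj as an attribute, and return the
    # AsyncExitStack that keeps them open until it is closed.
    stack = contextlib.AsyncExitStack()
    for name, ctx in context_managers.items():
        # enter_async_context ties the context's lifetime to the stack
        setattr(obj, name, await stack.enter_async_context(ctx))
    return stack
```

The caller is then responsible for `await stack.aclose()` (or entering the returned stack in its own `async with`) so all contexts are exited together.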
- OperationImplementation `add_label` and `add_orig_label` methods now use `op.name` instead of `ENTRY_POINT_ORIG_LABEL` and `ENTRY_POINT_NAME`.
- Make output specs and remap arguments optional for Operations CLI commands.
- Feature skeleton project is now operations skeleton project
- MemoryOperationImplementationNetwork instantiates OperationImplementations using their `withconfig()` method.
- MemorySource now decorated with `entry_point`
- MemorySource takes arguments correctly via `config_set` and `config_get`
- skel modules have `long_description_content_type` set to "text/markdown"
- Base Orchestrator `__aenter__` and `__aexit__` methods were moved to the Memory Orchestrator because they are specific to that config.
- Async helper `aenter_stack` uses `inspect.isfunction` so it will bind lambdas
- Support for zip file source
- Async helper for running tasks concurrently
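A helper of this kind can be sketched as an async generator that yields results as tasks finish (illustrative only; the name `concurrently` and the exact behavior of DFFML's helper are assumptions):

```python
import asyncio

async def concurrently(awaitables):
    # Sketch: run all awaitables at once and yield each result as
    # soon as it is ready, rather than in submission order.
    pending = {asyncio.ensure_future(a) for a in awaitables}
    while pending:
        done, pending = await asyncio.wait(
            pending, return_when=asyncio.FIRST_COMPLETED
        )
        for task in done:
            # task.result() re-raises the exception if the task failed
            yield task.result()
```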
- Gitter badge to README
- Documentation on the Data Flow Facilitator subsystem
- codesec plugin containing operations which gather security-related metrics on code and binaries.
- auth plugin containing a scrypt operation as an example of thread pool usage.
- Standardized the API for most classes in DFFML via inheritance from `dffml.base`
- Configuration of classes is now done via the args() and config() methods
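The pattern can be illustrated with a toy example (the class and option names here are hypothetical, not DFFML's real API): `args()` declares the options a class accepts, and `config()` instantiates the class from parsed values.

```python
class ConfigurableBase:
    # Toy version of the args()/config() pattern; the shapes of the
    # real DFFML methods may differ.
    @classmethod
    def args(cls):
        # Declare configurable options as {name: (type, default)}
        return {}

    @classmethod
    def config(cls, parsed):
        # Build an instance from a dict of parsed option values
        return cls(**parsed)


class ExampleCSVSource(ConfigurableBase):
    def __init__(self, filename=""):
        self.filename = filename

    @classmethod
    def args(cls):
        return {"filename": (str, "")}
```

Centralizing configuration this way lets the CLI and plugin loaders discover every class's options without hard-coding them.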
- Documentation is now generated using Sphinx
- Corrected maxsplit in util.cli.parser
- Check that `dtype` is a class in Tensorflow DNN
- CI script no longer always exits 0 for plugin tests
- Corrected render type in setup.py to markdown
- Contribution guidelines
- Logging documentation
- Example usage of Git features
- New Model and Feature creation script
- New Feature skeleton directory
- New Model skeleton directory
- New Feature creation tutorial
- New Model creation tutorial
- Added update functionality to the CSV source
- Added support for Gzip file source
- Added support for bz2 file source
- Travis checks for additions to CHANGELOG.md
- Travis checks for trailing whitespace
- Added support for lzma file source
- Added support for xz file source
- Added Data Flow Facilitator
- Restructured documentation to docs folder and moved from rST to markdown
- Git feature cloc logs if no binaries are in path
- Enable source.file to read from /dev/fd/XX
- Corrected formatting in README for PyPi
- Feature class to collect a feature in a dataset
- Git features to collect feature data from Git repos
- Model class to wrap implementations of machine learning models
- Tensorflow DNN model for generic usage of the DNN estimator
- CLI interface and framework
- Source class to manage dataset storage