All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
Changes are grouped as follows:
- `Added` for new features.
- `Changed` for changes in existing functionality.
- `Deprecated` for soon-to-be removed features.
- `Removed` for now removed features.
- `Fixed` for any bug fixes.
- `Security` in case of vulnerabilities.
- Support for replicating file data
- Obsolete datapoints insert API usage fixed.
- Obsolete datapoints API usage fixed.
- Python upgraded to ^3.11
- pyyaml upgraded to ^6.0.1
- cognite-sdk upgraded to ^7.13.8
- Naming convention fixed
- Typo fix
- Adding replication for relationships
- Bug fix on dataset id
- Fixed replication for sequences
- Bug fix on deletion of objects not in source or destination
- Bug fix on parameters
- Updated python sdk version
- Bug fix
- Datapoints copy update
- Updated python sdk version
- README update
- Move to major version and bug fixes
- Notification and error handling for identical timeseries entries in the config file #126
- Possibility to ignore replication of some fields, like metadata #143
- Support for replicating Sequences
- CI/CD moved from Jenkins to GitHub Actions
- Fix broken behavior: add missing argument to the copy functions of the events and files replicators (the code broke when a 7th argument was added to the invoking replication.thread function)
- Add the filtering by external id that already existed for time series to the events and files replicators
- Add the exclude pattern by regex that already existed for time series to the events and files replicators
- Typo fix: value_manipluation_lambda_fnc parameter (optional) renamed as value_manipulation_lambda_fnc
- value_manipluation_lambda_fnc parameter (optional) added to datapoints.replicate function. A lambda function string can be provided; it is applied to each datapoint.value of a given timeseries (see the sketch at the end of this file). Example: "lambda x: x*15"
- Does not fail if timeseries does not exist
- Support copy to existing timeseries that were not copied by the replicator
- Support configurable base url
- Default value for fetching datapoints changed from 0 to 31536000000, due to API restrictions
- Option for batch size specification with respect to datapoint replication
- Cleanup
- Config parameter "timeseries_exclude_pattern" now also has effect on datapoints, as opposed to only time series.
- The replicator will raise an exception if both timeseries_exclude_pattern and timeseries_external_ids are given.
- Provide config file path with env var COGNITE_CONFIG_FILE
- Use the version as the tag for images published to Docker Hub
- Replicate datapoints with a consistent time series order
- Switch time series regex filter to external id instead of name field
- Add ability to replicate time series by list of external ids
- Add ability to replicate unlinkable (asset-less) time series
- Add ability to replicate datapoints in a specific [start, end) range
- Added missing yaml dependency
- Running the package as a script now uses a yaml file for configuration, rather than command-line args.
- Datapoint replication now provides `src_datapoint_transform` parameter to allow for transformations of source datapoints (e.g. adjust value, adjust timestamp)
- Datapoint replication now provides `timerange_transform` parameter to allow replication of arbitrary time ranges
- Replication method `clear_replication_metadata` to remove the metadata added during replication
- Amount of data points pulled into memory now limited by default
- Support for file metadata replication
- Push to Docker Hub within Jenkins
- Logging for datapoint replication simplified
- batch_size parameter for datapoints now consistent with other resource types
- Determines number of batches/jobs to do based on num_batches, rather than only on num_threads
- Handle exceptions in datapoint replication
- Ability to fetch datapoints for a list of timeseries specified by external ids
- Time series overlap checks between destination and source much faster
- Boundary cases of datapoint replication are handled properly - no duplicates at start time, and no exclusion of final datapoint
- Support for asset replication by subtree
- Support for restricting event and time series replication to events/time series with replicated assets
- Events without assetIds can now be replicated as expected
- Dockerfile to build docker image of the replicator
- Command line arguments for running the replicators. Try `poetry run replicator`
- Time series replication no longer attempts to create security category-protected time series
- Use pre-commit hooks to run black and unit tests
- Send logs to Google Cloud Stackdriver if configured (requires the right dependencies)
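
For illustration, here is a minimal sketch of how a value-manipulation lambda string (see the `datapoints.replicate` entries above) transforms datapoint values. This is not the replicator's actual implementation; the `eval` mechanics and the sample values below are assumptions for demonstration only.

```python
# Illustrative sketch only: a lambda string such as "lambda x: x*15" is turned
# into a callable and applied to each datapoint value of a replicated time
# series. The exact mechanics inside the replicator may differ.
lambda_fnc_string = "lambda x: x*15"
transform = eval(lambda_fnc_string)  # build the callable from the string

source_values = [1.0, 2.5, 4.0]  # example values read from the source project
replicated_values = [transform(v) for v in source_values]

print(replicated_values)  # [15.0, 37.5, 60.0]
```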