Defaults not generated for object whose properties have defaults defined #206
Default values are not recursively evaluated, it seems. What do you think, @martin-koch?
@ChristianReese I tried your code and see the same thing. One solution could be to automatically create a tree of default values and thereby produce nested default values, but then the patch might become huge for big schemas with many non-required keywords. Building the whole default-value tree with all literal leaves and then copying the branches into the patch would be a solution, but it would consume a lot of memory. This is maybe why we didn't do it and stuck to one-level default values, i.e. not recursing to extract the leaves.
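To make the trade-off concrete, here is a hypothetical illustration (not actual library output) of what the two strategies could emit as a JSON Patch for a schema where "height" and "width" have defaults inside a "dimensions" object that has no default of its own:

#include <nlohmann/json.hpp>

using nlohmann::json;

// One-level strategy (current behaviour): "dimensions" is absent from the
// instance and has no default itself, so nothing is patched.
const json one_level_patch = json::array();

// Fully-nested strategy: the whole default subtree would end up in the
// patch, which can grow large for big schemas with many optional keywords.
const json fully_nested_patch = R"([
    { "op": "add", "path": "/dimensions", "value": { "height": 10, "width": 20 } }
])"_json;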
The problem is that "dimensions" itself has no default definition. If it does not occur in the provided JSON document, it will not be implicitly created. @ChristianReese if you validate a document that already contains a "dimensions" object, I think it is reasonable to expect the output in the provided example. On the other hand, it may also be reasonable that the "dimensions" object is not created in this case. So neither the one nor the other behavior is "wrong".
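A minimal sketch of that second point (the schema below is an assumption, not taken from the thread): when the instance already contains the "dimensions" object, the property-level defaults can be patched in.

#include <iostream>
#include <nlohmann/json-schema.hpp>

using nlohmann::json;
using nlohmann::json_schema::json_validator;

// Assumed schema: defaults only on the nested properties, none on
// "dimensions" itself.
static const json nested_schema = R"(
{
    "properties": {
        "dimensions": {
            "type": "object",
            "properties": {
                "height": { "type": "integer", "default": 10 },
                "width":  { "type": "integer", "default": 20 }
            }
        }
    }
})"_json;

int main()
{
    json_validator validator{nested_schema};

    // "dimensions" already exists here, so its property defaults apply.
    json instance = R"({ "dimensions": {} })"_json;
    instance = instance.patch(validator.validate(instance));

    std::cout << instance.dump() << std::endl;
    // expected, per the comment above: {"dimensions":{"height":10,"width":20}}
}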
I didn't remember this question; it dates back to when I wrote the first version of the validator. Back to the question: if the OP found this behavior to be a bug or unexpected, then there is a problem, at least in the documentation, or worse, in the code. I think @ChristianReese wants nested instances to be filled in all the way down to their leaves even if they don't have a default value themselves. Are these the only two strategies we could use for default-value patches: one-level, or fully-nested?
Thinking about what @martin-koch said, it makes sense that if no default is specified for the "dimensions" object itself, then the default is assumed to be no "dimensions" object at all. Otherwise the validator would be assuming that it should exist by default even though that was never specified in the schema. The defaults of the inner properties assume that a "dimensions" object is created to begin with. This leads me to the question of how to define the default for the "dimensions" object. As suggested, I can specify a default on the object itself. EDIT: Forgot to mention that, as @martin-koch said, I could also validate a document that already contains the "dimensions" object.
After a bit of research I found this in the specification: https://datatracker.ietf.org/doc/html/draft-handrews-json-schema-validation-01#section-10.2 So apparently there aren't any rules for what "default" can be set to; it's implementation-specific. I haven't found any settled-upon convention for a value to set it to in order to populate defaults for all properties, other than spelling them out as already described. I'm beginning to think that the best way to bring this knowledge into the schema without redundancy is to have the schema define the defaults at the object level instead of the property level. Or perhaps an even better path with this library specifically is to pass the objects you want to exist into the document being validated, and then have the defaults fill in. The latter approach is likely the one I'll take for now. So maybe this will just take the form of some new/clarified documentation.
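For what it's worth, a sketch of the object-level-default idea (hypothetical schema fragment, not from this thread): the complete default value is spelled out once on the "dimensions" object and the properties carry no "default" keyword of their own. Whether a given validator emits this default in a single pass is a separate question and isn't claimed here.

#include <nlohmann/json.hpp>

using nlohmann::json;

// Hypothetical schema fragment: the default lives on the object, avoiding
// per-property redundancy.
static const json dimensions_schema = R"(
{
    "properties": {
        "dimensions": {
            "type": "object",
            "default": { "height": 10, "width": 20 },
            "properties": {
                "height": { "type": "integer" },
                "width":  { "type": "integer" }
            }
        }
    }
})"_json;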
@ChristianReese

json_validator validator{rectangle_schema};
/* validate empty json -> will be expanded by the default values defined in the schema */
json rectangle = "{}"_json;
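/* re-validate until the returned default patch is empty: each pass can
   create objects whose own property defaults are then picked up on the
   next pass */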
while (true) {
const auto default_patch = validator.validate(rectangle);
if (default_patch.empty()) {
break;
}
rectangle = rectangle.patch(default_patch);
}
std::cout << rectangle.dump() << std::endl; // {"dimensions":{"height":10,"width":20}}

I know this is not ideal in terms of runtime, but at least it should do the task.
@martin-koch That's a really good idea. The whole design pattern here is to take an empty object and populate it with defaults, so this just makes that recursive: empty sub-objects are generated as defaults and fed back in to populate their sub-property defaults. It sounds like the resolution for this issue, then, is probably just documenting that approach as a way of getting deep defaults, or potentially offering an option that runs that iterative routine inside the library. I don't feel strongly; up to you guys. EDIT: Thinking about it more, I might actually be inclined not to put this into the library, so that the library stays focused on the more atomic operation of generating defaults and doesn't need to sit at any higher abstraction level than that. As long as this use case is documented it should be good. Again, up to you guys.
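If the documentation route is taken, a helper along these lines could be shown there; this is only a sketch, and the function name and signature are made up rather than library API:

#include <nlohmann/json-schema.hpp>

using nlohmann::json;
using nlohmann::json_schema::json_validator;

// Hypothetical convenience wrapper around the loop shown above: keep
// applying the default patch returned by validate() until it comes back
// empty, so objects created in one pass get their own property defaults
// filled in on the next pass.
json with_deep_defaults(json_validator &validator, json instance)
{
    for (;;) {
        const json default_patch = validator.validate(instance);
        if (default_patch.empty())
            return instance;
        instance = instance.patch(default_patch);
    }
}

Usage would then be a one-liner such as json filled = with_deep_defaults(validator, "{}"_json);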
I was on that topic as well, especially when I added #244. That said, it may need a different kind of library / software component which is more of a JSON-preparer. The reason is that, when traveling through the JSON schema and the JSON data, the validator needs to decide between different paths. @pboettch did nice work modelling the schema as a hierarchy of single schema instances that do their local testing, but they are not fully aware of hypothetical values that might be present if the whole thing took default values into account as well. This is always an issue when there are conditional paths with … The next issue is that, if a path is …
The "Default value processing" example snippet in the README gives me the expected output of {"height":10,"width":20}. However, when I modify the schema to put the properties in a "dimensions" object, all of a sudden I get the following output:
Validation of schema failed: [json.exception.parse_error.104] parse error: JSON patch must be an array of objects
Whereas I would expect the output to be:
{"dimensions":{"height":10,"width":20}}
Here's the code that produces the error:
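(Sketch of what the code plausibly looked like: the README's default-value-processing example with the schema changed so that "width" and "height" live under a "dimensions" object; the exact original code may have differed.)

#include <cstdlib>
#include <iostream>
#include <nlohmann/json-schema.hpp>

using nlohmann::json;
using nlohmann::json_schema::json_validator;

// Assumed schema for this sketch: the README rectangle schema with the
// defaulted properties moved into a nested "dimensions" object.
static const json rectangle_schema = R"(
{
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "A rectangle",
    "properties": {
        "dimensions": {
            "type": "object",
            "properties": {
                "width":  { "type": "integer", "default": 20 },
                "height": { "type": "integer", "default": 10 }
            }
        }
    }
})"_json;

int main()
{
    try {
        json_validator validator{rectangle_schema};

        /* validate an empty document and apply the returned default patch */
        json rectangle = "{}"_json;
        const auto default_patch = validator.validate(rectangle);
        rectangle = rectangle.patch(default_patch);

        std::cout << rectangle.dump() << std::endl;
    } catch (const std::exception &e) {
        std::cerr << "Validation of schema failed: " << e.what() << "\n";
        return EXIT_FAILURE;
    }
    return EXIT_SUCCESS;
}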
I would expect that the defaults could still be generated even if I nested the properties in several more layers of objects.
JSON Validator commit used: c6cb3d4
nlohmann/json version used: 3.10.5