Description
Hi, we encountered a problem (introduced by the 3.1.5 -> 3.2 update) which occurs when a property (of some DBO entity) of type org.bson.Document is handled within MappingMongoConverter.ConversionContext#convert. The MongoDB driver correctly returns a document whose inner object properties (fields) are all of type org.bson.Document.
Then MappingMongoConverter comes into play (while populating fields, #populateProperties -> #readProperties). During that process, the property value of type org.bson.Document is incorrectly handled as a Map, and all properties (fields) with an object value are converted to a LinkedHashMap. Later, when we try (in a concrete context) to convert that org.bson.Document object to a certain class (using MongoTemplate#getConverter()#read(getEntityClass(), sourceObject), though that detail is not important), it leads to:
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.util.LinkedHashMap<?, ?>] to type [...]
That error is only a consequence of how the property value of type org.bson.Document is treated. Yes, org.bson.Document implements Map, but it should not be processed as a Map.
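The failure mode can be illustrated in isolation with plain Java. The Document class below is only an illustrative stand-in for org.bson.Document (which likewise implements Map); it is not the real driver class. Any generic map-handling branch that copies such a value into a LinkedHashMap erases the concrete type, which is exactly what the ConverterNotFoundException later complains about:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MapBranchDemo {

    /** Illustrative stand-in for org.bson.Document, which also implements Map. */
    static class Document extends LinkedHashMap<String, Object> {}

    public static void main(String[] args) {
        Document source = new Document();
        source.put("inner", new Document());

        // A generic "isMap" branch sees only a Map and copies its entries...
        Map<String, Object> converted = new LinkedHashMap<>(source);

        // ...so the concrete type is gone: any later consumer expecting a
        // Document (e.g. a converter registered for Document) will not match.
        System.out.println(converted instanceof Document);     // false
        System.out.println(converted.getClass().getName());    // java.util.LinkedHashMap
    }
}
```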
If we compare the code of version 3.1.5 with that of version 3.2.x:
First, let's look at version 3.1.5:
MappingMongoConverter#readValue
@Nullable
@SuppressWarnings("unchecked")
<T> T readValue(Object value, TypeInformation<?> type, ObjectPath path) {

    Class<?> rawType = type.getType();

    if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
        return (T) conversionService.convert(value, rawType);
    } else if (value instanceof DBRef) {
        return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType);
    } else if (value instanceof List) {
        return (T) readCollectionOrArray(type, (List<Object>) value, path);
    } else if (value instanceof Document) {
        return (T) read(type, (Document) value, path); // this "if branch" comes into play if a property of type org.bson.Document is read
    } else if (value instanceof DBObject) {
        return (T) read(type, (BasicDBObject) value, path);
    } else {
        return (T) getPotentiallyConvertedSimpleRead(value, rawType);
    }
}
Then MappingMongoConverter#read is called, and the "if (Document.class.isAssignableFrom(rawType))" branch is chosen for processing.
@Nullable
@SuppressWarnings("unchecked")
private <S extends Object> S read(TypeInformation<S> type, Bson bson, ObjectPath path) {

    Assert.notNull(bson, "Bson must not be null!");

    TypeInformation<? extends S> typeToUse = typeMapper.readType(bson, type);
    Class<? extends S> rawType = typeToUse.getType();

    if (conversions.hasCustomReadTarget(bson.getClass(), rawType)) {
        return conversionService.convert(bson, rawType);
    }

    if (Document.class.isAssignableFrom(rawType)) { // this ensures that a property of type org.bson.Document is not handled at all, which is correct
        return (S) bson;
    }

    if (DBObject.class.isAssignableFrom(rawType)) { // I think that this "if branch" should also be applied within MappingMongoConverter.ConversionContext#convert
        if (bson instanceof DBObject) {
            return (S) bson;
        }
        if (bson instanceof Document) {
            return (S) new BasicDBObject((Document) bson);
        }
        return (S) bson;
    }

    if (typeToUse.isCollectionLike() && bson instanceof List) {
        return (S) readCollectionOrArray(typeToUse, (List<?>) bson, path);
    }

    if (typeToUse.isMap()) {
        return (S) readMap(typeToUse, bson, path);
    }

    if (bson instanceof Collection) {
        throw new MappingException(String.format(INCOMPATIBLE_TYPES, bson, BasicDBList.class, typeToUse.getType(), path));
    }

    if (typeToUse.equals(ClassTypeInformation.OBJECT)) {
        return (S) bson;
    }

    // Retrieve persistent entity info
    Document target = bson instanceof BasicDBObject ? new Document((BasicDBObject) bson) : (Document) bson;
    MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(typeToUse);

    if (entity == null) {

        if (codecRegistryProvider != null) {

            Optional<? extends Codec<? extends S>> codec = codecRegistryProvider.getCodecFor(rawType);
            if (codec.isPresent()) {
                return codec.get().decode(new JsonReader(target.toJson()), DecoderContext.builder().build());
            }
        }

        throw new MappingException(String.format(INVALID_TYPE_TO_READ, target, typeToUse.getType()));
    }

    return read((MongoPersistentEntity<S>) entity, target, path);
}
Now, let's look at MappingMongoConverter.ConversionContext#convert in version 3.2.x, more precisely 3.2.2:
public <S extends Object> S convert(Object source, TypeInformation<? extends S> typeHint) {

    Assert.notNull(typeHint, "TypeInformation must not be null");

    if (conversions.hasCustomReadTarget(source.getClass(), typeHint.getType())) {
        return (S) elementConverter.convert(source, typeHint);
    }

    // here there is no check whether a document should be handled at all

    if (source instanceof Collection) {

        Class<?> rawType = typeHint.getType();
        if (!Object.class.equals(rawType)) {
            if (!rawType.isArray() && !ClassUtils.isAssignable(Iterable.class, rawType)) {
                throw new MappingException(
                        String.format(INCOMPATIBLE_TYPES, source, source.getClass(), rawType, getPath()));
            }
        }

        if (typeHint.isCollectionLike() || typeHint.getType().isAssignableFrom(Collection.class)) {
            return (S) collectionConverter.convert(this, (Collection<?>) source, typeHint);
        }
    }

    if (typeHint.isMap()) { // this "if branch" is applied
        return (S) mapConverter.convert(this, (Bson) source, typeHint);
    }

    if (source instanceof DBRef) {
        return (S) dbRefConverter.convert(this, (DBRef) source, typeHint);
    }

    if (source instanceof Collection) {
        throw new MappingException(
                String.format(INCOMPATIBLE_TYPES, source, BasicDBList.class, typeHint.getType(), getPath()));
    }

    if (source instanceof Bson) {
        return (S) documentConverter.convert(this, (Bson) source, typeHint);
    }

    return (S) elementConverter.convert(source, typeHint);
}
I think the right way is to leave untouched those inputs which should be treated as (converted into) org.bson.Document (as suggested by the typeHint param).
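A minimal sketch of that pass-through, modeled on the 3.1.5 "Document.class.isAssignableFrom(rawType)" branch. The class names and the simplified convert method below are illustrative stand-ins, not the actual Spring Data MongoDB internals:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PassThroughSketch {

    /** Illustrative stand-in for org.bson.Document (which also implements Map). */
    static class Document extends LinkedHashMap<String, Object> {}

    @SuppressWarnings("unchecked")
    static <S> S convert(Object source, Class<S> typeHint) {

        // Suggested early return, mirroring the 3.1.5 read(...) behavior:
        // if the requested type is the raw document type, leave the source untouched.
        if (Document.class.isAssignableFrom(typeHint)) {
            return (S) source;
        }

        // Simplified stand-in for the mapConverter branch, which copies the
        // entries into a LinkedHashMap and thereby loses the Document type.
        if (Map.class.isAssignableFrom(typeHint)) {
            return (S) new LinkedHashMap<>((Map<?, ?>) source);
        }

        throw new IllegalStateException("unsupported conversion in this sketch");
    }

    public static void main(String[] args) {
        Document doc = new Document();
        doc.put("field", "value");

        System.out.println(convert(doc, Document.class) == doc);          // true: passed through untouched
        System.out.println(convert(doc, Map.class) instanceof Document);  // false: copied into a LinkedHashMap
    }
}
```

The point of the sketch is only the ordering: checking the document type hint before the generic map branch preserves the org.bson.Document value for any downstream converter registered against that type.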
Can you please look into the problem? I believe I have provided enough info, but if you need further details, do not hesitate to contact me.
Thank you for your cooperation.
Regards.