I have a structure similar to this:

class Cat {
  int id;
  List<Kitten> kittens;
}

class Kitten {
  int id;
}

I'd like to prevent users from creating a cat with more than one kitten with the same id. I've tried creating an index as follows:

db.Cats.ensureIndex({'id': 1, 'kittens.id': 1}, {unique:true})

But when I attempt to insert a badly-formatted cat, Mongo accepts it.

Am I missing something? Can this even be done?

Solution 1

As far as I know, unique indexes only enforce uniqueness across different documents, so this would throw a duplicate key error:

db.cats.insert( { id: 123, kittens: [ { id: 456 } ] } )
db.cats.insert( { id: 123, kittens: [ { id: 456 } ] } )

But this is allowed:

db.cats.insert( { id: 123, kittens: [ { id: 456 }, { id: 456 } ] } )

I'm not sure if there's any way to enforce this constraint at the MongoDB level; it may be something you have to check in your application logic when you insert or update.
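For example, a minimal application-level check (plain JavaScript; the helper name is hypothetical) could reject a badly-formatted cat before it ever reaches the insert:

```javascript
// Hypothetical helper: returns true if any two kittens share an id.
function hasDuplicateKittenIds(cat) {
  const seen = new Set();
  for (const kitten of cat.kittens) {
    if (seen.has(kitten.id)) return true;
    seen.add(kitten.id);
  }
  return false;
}

// Check before calling db.cats.insert(...):
hasDuplicateKittenIds({ id: 123, kittens: [{ id: 456 }, { id: 456 }] }); // true
hasDuplicateKittenIds({ id: 123, kittens: [{ id: 456 }, { id: 789 }] }); // false
```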

Solution 2

Ensuring uniqueness of the individual values in an array field

In addition to the example above, MongoDB provides an update operator that adds a new value/object to an array field only if it doesn't already exist in that array.

So if you have a document that looks like this:

{ _id: 123, kittens: [456] }

This would be allowed:

db.cats.update({_id:123}, {$push: {kittens:456}})

resulting in

{ _id: 123, kittens: [456, 456] }

however using the $addToSet operator (as opposed to $push) would check whether the value already exists before adding it. So, starting with:

{ _id: 123, kittens: [456] }

then executing:

db.cats.update({_id:123}, {$addToSet: {kittens:456}})

Would not have any effect.

So, long story short: unique indexes don't validate uniqueness among the items of an array field within a single document; they only prevent two documents from having identical values in the indexed fields.
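As a rough illustration (plain JavaScript, not mongo shell), $addToSet on a scalar value behaves like this sketch:

```javascript
// Rough JS sketch of $addToSet semantics for scalar values:
// push the value only if it is not already present.
function addToSet(array, value) {
  if (!array.includes(value)) array.push(value);
  return array;
}

const kittens = [456];
addToSet(kittens, 456); // no change: [456]
addToSet(kittens, 789); // appended:  [456, 789]
```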

Solution 3

There is an equivalent of insert that ensures uniqueness within an array attribute. The following command performs an insert while keeping kittens unique (upsert creates the document for you if one with id 123 doesn't already exist).

db.cats.update(
  { id: 123 },
  {
    $addToSet: { kittens: { $each: [456, 456] } },
    $set: { otherfields: 'extraval', field2: 'value2' }
  },
  { upsert: true }
)

The resulting document will be:

{
    "id": 123,
    "kittens": [456],
    "otherfields": "extraval",
    "field2": "value2"
}

Solution 4

What matters here is ensuring that no more than one item with the same id (or whichever other field must be treated as unique) exists in a MongoDB document's array. A simple update query using $addToSet, with an extra match condition, will suffice.

Forgive me, I am not a mongo-shell expert; this uses the Java MongoDB Driver, version 4.0.3:

MongoCollection<Cat> collection = database.getCollection("cat", Cat.class);

UpdateResult result = collection.updateOne(
    and(eq("Id", 1), nin("kittens.id", newKittenId)),
    addToSet("kittens", new Kitten(newKittenId)));

The query used here adds an extra condition to the match: cat.id is 1, and newKittenId is not yet owned by any of the kittens that had previously been added. So if the cat is found and no kitten has taken the new kitten id, the query goes ahead and updates the cat's kittens by adding the new one. But if newKittenId had already been taken by one of the kittens, it simply returns an UpdateResult with no matched count and no modified count (nothing happens).

Note: this does not create a unique constraint on kitten.id; MongoDB does not support uniqueness on object arrays within a document, and $addToSet does not really handle duplicate items in an object array unless the object is an exact replica of what is in the database. See the MongoDB documentation on $addToSet for more explanation.
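A plain-JavaScript sketch (simulating $addToSet's whole-document comparison with a crude deep-equality check) shows why this matters: two kittens with the same id but different other fields are treated as distinct set members.

```javascript
// Crude deep equality via JSON serialization (order-sensitive,
// good enough for this illustration).
function deepEqual(a, b) {
  return JSON.stringify(a) === JSON.stringify(b);
}

// Sketch of $addToSet for subdocuments: compares the whole object.
function addToSet(array, value) {
  if (!array.some(existing => deepEqual(existing, value))) array.push(value);
  return array;
}

const kittens = [{ id: 456, name: 'Tom' }];
addToSet(kittens, { id: 456, name: 'Tom' });   // exact replica: not added
addToSet(kittens, { id: 456, name: 'Jerry' }); // same id, different name: added
// kittens now holds two entries with id 456
```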

Solution 5

There is a workaround using a document validator.

Here is an example validator where "a" is an array of subdocuments and the value of field "b" must be unique within it. This assumes the collection is either empty or already complies with the rule:

> db.runCommand({collMod:"coll", validator: {$expr:{$eq:[{$size:"$a.b"},{$size:{$setUnion:"$a.b"}}]}}})
/* test it */
> db.coll.insert({a:[{b:1}]}) /* success */
> db.coll.update({},{ '$push' : { 'a':{b:1}}})
WriteResult({
	"nMatched" : 0,
	"nUpserted" : 0,
	"nModified" : 0,
	"writeError" : {
		"code" : 121,
		"errmsg" : "Document failed validation"
	}
})

See the original post for more information about this solution.
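The validator's rule — the array of b values must be the same size as its own set union, i.e. contain no duplicates — can be sketched in plain JavaScript:

```javascript
// JS sketch of the validator's logic: the document passes only when
// $size of a.b equals $size of $setUnion of a.b (no duplicate b values).
function passesValidator(doc) {
  const bValues = doc.a.map(sub => sub.b);
  return bValues.length === new Set(bValues).size;
}

passesValidator({ a: [{ b: 1 }] });           // true
passesValidator({ a: [{ b: 1 }, { b: 1 }] }); // false
```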

Solution 6

You can write a custom Mongoose validation method in this case. Mongoose lets you hook in before (pre) or after (post) validation; here, a post-validation hook can check that the kittens array contains no duplicate ids. There may be efficiency improvements you can make depending on your details; if the kittens only have an '_id', for example, you could just use the JS includes function on an array of ids.

catSchema.post('validate', function() {
    return new Promise((resolve, reject) => {
        for (let i = 0; i < this.kittens.length; i++) {
            const kitten = this.kittens[i];
            // Start at i + 1 so each pair is compared only once.
            for (let p = i + 1; p < this.kittens.length; p++) {
                // ObjectIds must be compared with .equals(), not ==.
                if (kitten._id.equals(this.kittens[p]._id)) {
                    return reject(new Error('Duplicate Kitten Ids not allowed'));
                }
            }
        }
        return resolve();
    });
});

I like to use promises in validation because it's easier to specify errors.
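The nested loop above is O(n²); a Set-based variant of the same check (a plain JavaScript sketch, assuming the _id values stringify consistently) runs in linear time:

```javascript
// O(n) duplicate check: collect stringified ids in a Set and
// return the first repeated id, or null if all ids are distinct.
function findDuplicateId(kittens) {
  const seen = new Set();
  for (const kitten of kittens) {
    const key = String(kitten._id);
    if (seen.has(key)) return key;
    seen.add(key);
  }
  return null;
}

findDuplicateId([{ _id: 'a' }, { _id: 'b' }]); // null
findDuplicateId([{ _id: 'a' }, { _id: 'a' }]); // 'a'
```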