Preventing duplicate records in Mongoose

I'm fairly new to MongoDB / Mongoose, more used to SQL Server or Oracle.

I have a fairly simple Schema for an event.

EventSchema.add({
  pkey: { type: String, unique: true },
  device: { type: String, required: true },
  name: { type: String, required: true },
  owner: { type: String, required: true },
  description: { type: String, required: true },
});

I was looking at Mongoose Indexes which shows two ways of doing it, I used the field definition.

I also have a very simple API that accepts a POST and calls create on this collection to insert the record.

I wrote a test to check that inserting a record with the same pkey is rejected, i.e. that unique: true is functioning. I already have a set of events read into an array, so I just POST the first of those events again and see what happens. I expected MongoDB to throw the E11000 duplicate key error, but this did not happen.

var url = 'api/events';
var evt = JSON.parse(JSON.stringify(events[0]));

// POST'ed new record won't have an _id yet 
delete evt._id;

api.post(url)
   .send(evt)
   .end(function (err, res) {
     err.should.exist;
     err.code.should.equal(11000);
   });

The test fails, there is no error and a duplicate record is inserted.

When I take a look at the collection I can see two records, both with the same pkey (the original record and the copy that I posted for the test). I do notice that the second record has the same creation date as the first but a later modified date.

(Does Mongo expect me to use the latest modified record? The URL is different and so is the _id.)

[ { _id: 2,
    pkey: '6fea271282eb01467020ce70b5775319',
    name: 'Event name 01',
    owner: 'Test Owner',
    device: 'Device X',
    description: 'I have no idea what\'s happening',
    __v: 0,
    url: '/api/events/2',
    modified: '2016-03-23T07:31:18.529Z',
    created: '2016-03-23T07:31:18.470Z' },
  { _id: 1,
    pkey: '6fea271282eb01467020ce70b5775319',
    name: 'Event name 01',
    owner: 'Test Owner',
    device: 'Device X',
    description: 'I have no idea what\'s happening',
    __v: 0,
    url: '/api/events/1',
    modified: '2016-03-23T07:31:18.470Z',
    created: '2016-03-23T07:31:18.470Z' }
]

I had assumed that unique: true on the field definition told MongoDB what I wanted and that Mongo would enforce it at save time, but maybe I just misunderstood something...

In SQL terms you create a key that can be used for URL lookup, but you can also build a unique compound index to prevent duplicate inserts. I need to be able to define which fields in an event make the record unique, because the submitter of a form POST does not know the next available _id value. I use the _id (generated by "mongoose-auto-increment") so that the URLs used by other parts of the app stay clean, like

/events/1

and not a complete mess of compound values, like

/events/Event%20name%2001%5fDevice%20X%5fTest%20Owner

I'm just about to start coding this up, so for now I wrote a simple test against this single string. The real schema has a few more fields and will use a combination of them for uniqueness; I really want to get this initial test working before I start adding more tests, more fields, and more code.

Is there something that I should be doing to ensure that the second record does not actually get inserted?


Comment from the asker (Mar 23, 2016): I included "mongoose-auto-increment", which apparently creates the _id and increments it by 1 on each save. The _id may be unique and useful for /events/:id, but it is not what makes the record unique; that is determined by a compound combination of fields from the event (still to be added). I wanted to see if it would work with a simpler test on one field of the schema before adding multiple values to the index and coding the entire thing. A compound key value gives event/{:horrible_key} in the URL.

5 Answers

It seems that you added the unique index (at the schema level) after inserting some records into the db.

Please follow the steps below to avoid duplicates:

1) drop your db:

$ mongo

> use <db-name>;

> db.dropDatabase();

2) Now define the index at the schema level or db level

 var EventSchema = new mongoose.Schema({
      pkey: { type: String, unique: true },
      device: { type: String, required: true },
      name: { type: String, required: true },
      owner: { type: String, required: true },
      description: { type: String, required: true },
    });

This will prevent insertion of duplicate records with the same pkey value.

To verify the index, use the command db.<collection-name>.getIndexes().

I hope it helps. Thank you.

OK, it looks like it has something to do with the index not having had time to build before the second insert was posted (there were only 9 ms between the two inserts in my test suite).

  • need to make inserts wait until the index has been built
  • needs to be on the API side, as not all users of the API are web applications
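The points above can be sketched as a small wrapper that waits for index builds before inserting. In Mongoose, Model.init() returns a promise that resolves once the index builds triggered at startup have finished; the stub model below stands in for a real model so the call order can be shown without a database. The helper names are mine, not from the question's code.

```javascript
// Sketch: make inserts wait until the model's indexes have been built.
// Assumes any object exposing init() and create(), like a Mongoose model.
async function safeCreate(Model, doc) {
  await Model.init();        // wait for the unique index to exist
  return Model.create(doc);  // duplicates now fail with E11000
}

// Stub model recording the call order, no real database needed:
function makeFakeModel(calls) {
  return {
    init: async () => { calls.push('init'); },
    create: async (doc) => { calls.push('create'); return doc; },
  };
}

async function demo() {
  const calls = [];
  await safeCreate(makeFakeModel(calls), { pkey: 'abc' });
  return calls.join(',');
}

demo().then((order) => console.log(order)); // "init,create"
```

With a real model, calling safeCreate(Event, evt) in the POST handler ensures the unique index exists before the duplicate insert is attempted, so the test's expected E11000 can actually occur.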

I also found some other SO articles about constraints:

mongoose unique: true not work

Unique index not working with Mongoose / MongoDB

MongoDB/Mongoose unique constraint on Date field

On mongoose.connect, add { useCreateIndex: true }.

It should look like this

mongoose.connect(uri, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
  useCreateIndex: true
})

add:

EventSchema.index({ pkey: 1 }, { unique: true });
// Rebuild all indexes on the model
await Event.syncIndexes();

worked for me (the original snippet called User.syncIndexes(); here the model built from EventSchema is assumed to be Event). From https://masteringjs.io/tutorials/mongoose/unique
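Once the index exists, the API should surface the duplicate-key failure to the caller rather than crash. A minimal sketch of an error translator; the function name and the choice of 409 Conflict are my conventions, not part of the question's API:

```javascript
// Sketch: map a MongoDB duplicate-key error (E11000 / E11001) to an
// HTTP status code. 409 Conflict is a common convention for duplicates.
function statusForMongoError(err) {
  if (err && (err.code === 11000 || err.code === 11001)) {
    return 409; // duplicate key -> Conflict
  }
  return 500; // anything else -> generic server error
}

console.log(statusForMongoError({ code: 11000 })); // 409
console.log(statusForMongoError(new Error('boom'))); // 500
```

In an Express-style handler this would be used as res.status(statusForMongoError(err)).send(...) in the create route's catch block, so the test asserting err.code 11000 sees a clean failure instead of a second inserted record.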
