
Add findByFields method #50

Merged · 10 commits · May 5, 2021
39 changes: 38 additions & 1 deletion README.md
@@ -6,10 +6,11 @@ Apollo [data source](https://www.apollographql.com/docs/apollo-server/features/d
npm i apollo-datasource-mongodb
```

This package uses [DataLoader](https://github.com/graphql/dataloader) for batching and per-request memoization caching. It also optionally (if you provide a `ttl`) does shared application-level caching (using either the default Apollo `InMemoryLRUCache` or the [cache you provide to ApolloServer()](https://www.apollographql.com/docs/apollo-server/features/data-sources#using-memcachedredis-as-a-cache-storage-backend)). It does this only for these two methods:
This package uses [DataLoader](https://github.com/graphql/dataloader) for batching and per-request memoization caching. It also optionally (if you provide a `ttl`) does shared application-level caching (using either the default Apollo `InMemoryLRUCache` or the [cache you provide to ApolloServer()](https://www.apollographql.com/docs/apollo-server/features/data-sources#using-memcachedredis-as-a-cache-storage-backend)). It does this for the following methods:

- [`findOneById(id, options)`](#findonebyid)
- [`findManyByIds(ids, options)`](#findmanybyids)
- [`findByFields(fields, options)`](#findbyfields)

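For orientation, here is a minimal sketch of a data source using these cached methods. The `Users` class, the resolver-style method names, and the 60-second `ttl` are illustrative assumptions, not something prescribed by this package:

```js
import { MongoDataSource } from 'apollo-datasource-mongodb'

class Users extends MongoDataSource {
  getUser(userId) {
    // batched and memoized per request; also stored in the shared cache for 60s because a ttl is passed
    return this.findOneById(userId, { ttl: 60 })
  }

  getUsersByInterest(interest) {
    return this.findByFields({ interests: interest }, { ttl: 60 })
  }
}
```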

<!-- START doctoc generated TOC please keep comment here to allow auto update -->
@@ -24,6 +25,8 @@ This package uses [DataLoader](https://github.com/graphql/dataloader) for batchi
- [API](#api)
- [findOneById](#findonebyid)
- [findManyByIds](#findmanybyids)
- [findByFields](#findbyfields)
- [Examples:](#examples)
- [deleteFromCacheById](#deletefromcachebyid)

<!-- END doctoc generated TOC please keep comment here to allow auto update -->
@@ -183,6 +186,7 @@ interface UserDocument {
username: string
password: string
email: string
interests: [string]
}

// This is optional
@@ -236,6 +240,39 @@ Resolves to the found document. Uses DataLoader to load `id`. DataLoader uses `c

Calls [`findOneById()`](#findonebyid) for each id. Resolves to an array of documents.

### findByFields

`this.findByFields(fields, { ttl })`

Resolves to an array of documents matching the passed fields.

`fields` has type `{ [fieldName: string]: string | number | boolean | [string | number | boolean] }`.

#### Examples:

```js

// get user by username
// `collection.find({ username: { $in: ['testUser'] } })`
this.findByFields({
  username: 'testUser'
})

// get all users with either the "gaming" OR "games" interest
// `collection.find({ interests: { $in: ['gaming', 'games'] } })`
this.findByFields({
  interests: ['gaming', 'games']
})

// get user by username AND with either the "gaming" OR "games" interest
// `collection.find({ username: { $in: ['testUser'] }, interests: { $in: ['gaming', 'games'] } })`
this.findByFields({
  username: 'testUser',
  interests: ['gaming', 'games']
})

```
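As with the other cached methods, a `ttl` can be passed in the options argument to also store the result in the shared application-level cache. A small hedged example (the 60-second value is arbitrary):

```js
// also cache the result in the shared application-level cache for 60 seconds
this.findByFields({ interests: ['gaming', 'games'] }, { ttl: 60 })
```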

### deleteFromCacheById

`this.deleteFromCacheById(id)`
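For instance, a sketch of clearing a stale cache entry after a write; the `updateUserName` method, its arguments, and the use of `this.collection` here are illustrative assumptions rather than part of this diff:

```js
class Users extends MongoDataSource {
  async updateUserName(userId, newName) {
    // update the underlying document first
    await this.collection.updateOne(
      { _id: userId },
      { $set: { username: newName } }
    )
    // drop the cached document so the next findOneById refetches fresh data
    await this.deleteFromCacheById(userId)
  }
}
```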
11 changes: 10 additions & 1 deletion index.d.ts
@@ -16,6 +16,10 @@ declare module 'apollo-datasource-mongodb' {
export type ModelOrCollection<T> = T extends Document
? Model<T>
: Collection<T>

export interface Fields {
[fieldName: string]: string | number | boolean | [string | number | boolean]
}

export interface Options {
ttl: number
@@ -40,6 +44,11 @@ declare module 'apollo-datasource-mongodb' {
options?: Options
): Promise<(TData | null | undefined)[]>

findByFields(
fields: Fields,
options?: Options
): Promise<(TData | null | undefined)[]>

deleteFromCacheById(id: ObjectId | string): Promise<void>
}
}
}
189 changes: 149 additions & 40 deletions src/__tests__/cache.test.js
@@ -9,19 +9,36 @@ const hexId = 'aaaa0000bbbb0000cccc0000'

const docs = {
one: {
_id: ObjectId(hexId)
_id: ObjectId(hexId),
foo: 'bar',
tags: ['foo', 'bar']
},
two: {
_id: ObjectId()
_id: ObjectId(),
foo: 'bar'
}
}

const stringDoc = {
_id: 's2QBCnv6fXv5YbjAP'
_id: 's2QBCnv6fXv5YbjAP',
tags: ['bar', 'baz']
}

const collectionName = 'test'
const cacheKey = id => `mongo-${collectionName}-${idToString(id)}`
const cacheKeyById = id => `mongo-${collectionName}-${idToString(id)}`
const cacheKeyByFields = fields => {
const cleanedFields = {}

Object.keys(fields).forEach(key => {
if (typeof key !== 'undefined') {
cleanedFields[key] = Array.isArray(fields[key])
? fields[key]
: [fields[key]]
}
})

return `mongo-${collectionName}-${JSON.stringify(cleanedFields)}`
}

describe('createCachingMethods', () => {
let collection
@@ -31,30 +48,55 @@ describe('createCachingMethods', () => {
beforeEach(() => {
collection = {
collectionName,
find: jest.fn(({ _id: { $in: ids } }) => ({
toArray: () =>
new Promise(resolve => {
setTimeout(
() =>
resolve(
ids.map(id => {
if (id === stringDoc._id) {
return stringDoc
}

if (id.equals(docs.one._id)) {
return docs.one
}

if (id.equals(docs.two._id)) {
return docs.two
}
})
),
0
)
})
}))
find: jest.fn(filter => {
return {
toArray: () =>
new Promise(resolve => {
setTimeout(
() =>
resolve(
[docs.one, docs.two, stringDoc].filter(doc => {
for (const orFilter of filter.$or || [filter]) {
for (const field in orFilter) {
if (field === '_id') {
for (const id of orFilter._id.$in) {
if (id.equals && !id.equals(doc._id)) {
break
} else if (
doc._id.equals &&
!doc._id.equals(id)
) {
break
} else if (
!id.equals &&
!doc._id.equals &&
id !== doc._id
) {
break
}
}
} else if (Array.isArray(doc[field])) {
for (const value of orFilter[field].$in) {
if (!doc[field].includes(value)) {
break
}
}
} else if (
!orFilter[field].$in.includes(doc[field])
) {
break
}
}
return true
}
return false
})
),
0
)
})
}
})
}

cache = new InMemoryLRUCache()
@@ -65,10 +107,11 @@ describe('createCachingMethods', () => {
it('adds the right methods', () => {
expect(api.findOneById).toBeDefined()
expect(api.findManyByIds).toBeDefined()
expect(api.findByFields).toBeDefined()
expect(api.deleteFromCacheById).toBeDefined()
})

it('finds one', async () => {
it('finds one with ObjectId', async () => {
const doc = await api.findOneById(docs.one._id)
expect(doc).toBe(docs.one)
expect(collection.find.mock.calls.length).toBe(1)
@@ -89,48 +132,114 @@ expect(collection.find.mock.calls.length).toBe(1)
expect(collection.find.mock.calls.length).toBe(1)
})

// TODO why doesn't this pass?
Review comment (Member): I'd like this to remain in the file until solved

Review comment (@ryangoree, Contributor — Author, Mar 12, 2021):
.find() is only called once because the dataloader saves the result from the first call. The second call doesn't find anything in the InMemoryLRUCache since the ttl option wasn't passed in the first call, but it ends up calling the dataloader's .load() again with the same key as the first call which returns from memoization cache rather than calling .find() again.
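To make the memoization point concrete, here is a small illustrative sketch that is not part of this diff; the key value and the standalone loader are assumptions for demonstration only:

```js
import DataLoader from 'dataloader'

// Batch function: receives all keys requested in one tick and must return one result per key.
const batchLoad = async ids => ids.map(id => ({ _id: id }))
const loader = new DataLoader(batchLoad)

async function demo() {
  await loader.load('aaaa0000bbbb0000cccc0000')
  // Same key again: served from DataLoader's per-request memoization cache,
  // so batchLoad (and therefore collection.find) is not called a second time,
  // even though no ttl was passed and nothing was written to the shared cache.
  await loader.load('aaaa0000bbbb0000cccc0000')
}

demo()
```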

// it.only(`doesn't cache without ttl`, async () => {
// await api.findOneById(docs.id1._id)
// await api.findOneById(docs.id1._id)
it('finds by field', async () => {
const foundDocs = await api.findByFields({ foo: 'bar' })

expect(foundDocs[0]).toBe(docs.one)
expect(foundDocs[1]).toBe(docs.two)
expect(foundDocs.length).toBe(2)

expect(collection.find.mock.calls.length).toBe(1)
})

it('finds by array field', async () => {
const foundDocs = await api.findByFields({ tags: 'bar' })

expect(foundDocs[0]).toBe(docs.one)
expect(foundDocs[1]).toBe(stringDoc)
expect(foundDocs.length).toBe(2)

expect(collection.find.mock.calls.length).toBe(1)
})

it('finds by multiple fields', async () => {
const foundDocs = await api.findByFields({
tags: ['foo', 'bar'],
foo: 'bar'
})

expect(foundDocs[0]).toBe(docs.one)
expect(foundDocs.length).toBe(1)

expect(collection.find.mock.calls.length).toBe(1)
})

it(`doesn't mix filters of pending calls for different fields`, async () => {
const pendingDocs1 = api.findByFields({ foo: 'bar' })
const pendingDocs2 = api.findByFields({ tags: 'baz' })
const [foundDocs1, foundDocs2] = await Promise.all([
pendingDocs1,
pendingDocs2
])

expect(foundDocs1[0]).toBe(docs.one)
expect(foundDocs1[1]).toBe(docs.two)
expect(foundDocs1.length).toBe(2)
expect(foundDocs2[0]).toBe(stringDoc)
expect(foundDocs2.length).toBe(1)

expect(collection.find.mock.calls.length).toBe(1)
})

it(`dataloader caches each value individually when finding by a single field`, async () => {
await api.findByFields({ tags: ['foo', 'baz'] }, { ttl: 1 })

Review comment (Member): cache is separate from dataloader. I suggest removing "dataloader" from the test title and creating other tests for dataloader (which should do memoization caching: https://github.com/graphql/dataloader#caching)

Review comment (@ryangoree, Contributor — Author): This is testing the memoization caching. The desired behavior is that if findByFields is used with a single field and multiple values, each individual value should get cached by the dataloader's memoization cache (tested by calling with 2 values, 'foo' and 'baz', on line 185, then calling with a single value, 'foo', on line 195, then expecting find to have been called only once on line 196). However, the InMemoryLRUCache cache should not be affected by this (tested by ensuring the cache has a result for both values on line 192, but not for a single value, on line 193).

const fooBazCacheValue = await cache.get(
  cacheKeyByFields({ tags: ['foo', 'baz'] })
)
const fooCacheValue = await cache.get(cacheKeyByFields({ tags: 'foo' }))

expect(fooBazCacheValue).toBeDefined()
expect(fooCacheValue).toBeUndefined()

// expect(collection.find.mock.calls.length).toBe(2)
// })
await api.findByFields({ tags: 'foo' })
expect(collection.find.mock.calls.length).toBe(1)
})

it(`doesn't cache without ttl`, async () => {
await api.findOneById(docs.one._id)

const value = await cache.get(cacheKey(docs.one._id))
const value = await cache.get(cacheKeyById(docs.one._id))
expect(value).toBeUndefined()
})

it(`caches`, async () => {
await api.findOneById(docs.one._id, { ttl: 1 })
const value = await cache.get(cacheKey(docs.one._id))
const value = await cache.get(cacheKeyById(docs.one._id))
expect(value).toEqual(EJSON.stringify(docs.one))

await api.findOneById(docs.one._id)
expect(collection.find.mock.calls.length).toBe(1)
})

it(`caches non-array field values as arrays`, async () => {
const fields = { tags: 'foo' }
await api.findByFields(fields, { ttl: 1 })
const value = await cache.get(cacheKeyByFields(fields))
expect(value).toEqual(EJSON.stringify([docs.one]))

await api.findByFields({ tags: ['foo'] })
expect(collection.find.mock.calls.length).toBe(1)
})

it(`caches with ttl`, async () => {
await api.findOneById(docs.one._id, { ttl: 1 })
await wait(1001)

const value = await cache.get(cacheKey(docs.one._id))
const value = await cache.get(cacheKeyById(docs.one._id))
expect(value).toBeUndefined()
})

it(`deletes from cache`, async () => {
for (const doc of [docs.one, docs.two, stringDoc]) {
await api.findOneById(doc._id, { ttl: 1 })

const valueBefore = await cache.get(cacheKey(doc._id))
const valueBefore = await cache.get(cacheKeyById(doc._id))
expect(valueBefore).toEqual(EJSON.stringify(doc))

await api.deleteFromCacheById(doc._id)

const valueAfter = await cache.get(cacheKey(doc._id))
const valueAfter = await cache.get(cacheKeyById(doc._id))
expect(valueAfter).toBeUndefined()
}
})